Major Creative Organizations Take Firm Stand Against Generative AI in Art and Writing
By admin | Jan 25, 2026 | 3 min read
In recent months, prominent figures and organizations within science fiction and popular culture have adopted stronger positions opposing generative AI. Separate actions by San Diego Comic-Con and the Science Fiction and Fantasy Writers Association (SFWA) highlight the significant resistance to AI within certain creative circles—though they are far from alone, as platforms like the music distributor Bandcamp have also recently prohibited generative AI content.
Last December, SFWA announced updates to the rules for its prestigious Nebula Awards. Initially, works composed entirely by large language models were declared ineligible, while authors who used LLMs "at any point during the writing process" were required to disclose that use. This allowed award voters to decide individually whether such usage influenced their support. As reported by Jason Sanford in his Genre Grapevine newsletter, this adjustment sparked immediate criticism for potentially legitimizing partially AI-generated work. Within days, SFWA’s Board of Directors issued an apology, stating, “Our approach and wording was wrong and we apologize for the distress and distrust we caused.”
The guidelines were soon revised again. The current policy explicitly states that works “written, either wholly or partially, by generative large language model (LLM) tools are not eligible” for the Nebula Awards, and any submission found to have used LLMs during its creation will be disqualified. In a follow-up, Sanford expressed satisfaction that SFWA had listened to its members, and he reaffirmed his personal refusal to use generative AI in his fiction writing, “not only because of this theft but also because the tools are not actually creative and defeat the entire point of storytelling.” However, he also emphasized that important questions remain about how broadly LLM usage will be interpreted, particularly as “these generative AI products are being forced down everyone’s throats by major corporations.”
“If you use any online search engines or computer products these days, it’s likely you’re using something powered by or connected with an LLM,” Sanford noted. “Because of that, we must be careful that writers who use word processing and research tools with LLM components aren’t unfairly disqualified from awards like the Nebulas or attacked by readers and other writers.”
Similarly, San Diego Comic-Con encountered controversy this month when artists discovered that the event’s art show initially permitted AI-generated art to be displayed—though not sold. Following complaints from artists, the rules were quietly updated to state, “Material created by artificial intelligence (AI) either partially or wholly, is not allowed in the art show.”
While Comic-Con’s response was less public than SFWA’s, some artists shared email replies from art show director Glen Wooten. He explained that the earlier rules had been in place for “a few years” and had served as an effective deterrent, as no AI-generated art had ever been submitted. “But the issue is becoming more of a problem, so more strident language is necessary: NO. Plain and simple,” Wooten was quoted as saying. It is likely that other organizations will announce similarly strict policies in the coming year, as these communities continue to grapple with the broader implications of generative AI.