Like many of my colleagues, I’ve added a statement to my syllabi classifying the use of AI tools to write papers as a form of plagiarism. I’m hoping that the detection tools available will help me figure out whether a student has used AI to write their paper. But even more than that, I’m hoping that my students won’t go there in the first place, because I’ve already given them an impassioned plea about how important writing is to the learning process.
Over in the indie author community, we’re having very different conversations, because we’ve already seen AI tools being used in a bad-faith way that is—and I am not exaggerating here—threatening our livelihood.
Some writers already use AI tools in specific, limited ways. Sudowrite, a popular tool among some of my peers, is basically a “help you brainstorm/word vomit” tool. It is not, in my understanding, capable of spitting out a whole story or novel; it’s something writers use to supplement their existing process. They still curate and edit the result, keeping the writing itself a mostly human activity.
Plenty of writers (and students!) use Grammarly, too, which checks grammar and suggests alternative phrasings. However, the company is apparently starting to develop its own take on AI tools, which is something to keep an eye on.
What’s happening now with ChatGPT is on a completely different level, as demonstrated in this tweet from the well-known speculative fiction magazine Clarkesworld:

Clarkesworld isn’t the only magazine being overwhelmed with ChatGPT submissions. I’ve seen reports on Twitter of others shutting down their submissions until they can figure out how not to get spammed by people (whom I shall not deign to call writers) trying to game the system by mass-producing AI texts.
Some magazines are choosing to run with it, I guess, like Metastellar, which is accepting AI submissions. I’m curious to see whether they’ll be overrun with more dreck than they can reasonably read.
But wait, there’s more.
As reported by Reuters, Amazon is now allowing ChatGPT-authored books to be sold in its Kindle store. There is no consensus yet on whether or how to label AI-authored books as such, how readers will respond, and so on.
It may not be the absolute end of the world for indie authors, since we can pivot quickly in our strategies, but it will make it harder for us to get our books in front of readers if there is SO much more quickly produced material flooding the market. I’m not sure how trad authors and publishers will deal with this. As with academia, trad publishing tends to move slowly, even glacially, so I don’t know whether they’re already having these conversations.
More than anything, this is disheartening because of what it says about humanity and creativity. We have tools to automate menial tasks, and we’re using them to rob creatives of their careers? Massive bummer, my friends. I’ve heard of real estate agents using AI to quickly generate the kinds of property descriptions that are easy but boring to write. Is that fine? Probably. But deciding to use AI to churn out massive amounts of text in a deliberate attempt to make a quick buck, or to overwhelm a magazine (whether out of spite or in an attempt to win the prestige of being published by breaking the whole system), or… it just makes me sad.
I don’t plan to use ChatGPT or related forms of AI. I know that AI is already in our daily lives in mild forms such as predictive text, but it’s a false equivalence to treat those small programs as the same thing as ChatGPT and the other AI programs attempting to replicate the human creative process through large amounts of text output. I also have privacy concerns: I don’t want to feed it my words and my thoughts for free when OpenAI may well just turn around and charge even more for the product I would be helping to train. No thanks. For all I know it’s already been trained on my writing without my consent, which is also an unhappy thought, but not an actionable one.
At the end of the day, ChatGPT is a tool, and tools are ethically neutral: I use my knives for cooking, not for stabbing. But when the tool has been developed in a Silicon Valley context and released into a society where both plagiarism (in my teaching setting) and flooding the market (in my writing setting) are seen as quick ways to get ahead? I’m going to be leery of these tools.
(Shout-out: my New Mythos writers community has given me a lot of insight and ideas on how to think about these issues, and some of the links above came from discussions they’re hosting.)