Quills vs. Code: Writing in the Age of AI
- Holly Rhiannon

- Sep 7
- 5 min read

AI is reshaping the writing landscape in ways few could have imagined a decade ago. Tools that once seemed experimental are now common, and publishing houses are licensing entire backlists to train machine learning systems. For writers, the moment feels urgent and unsettling. Questions of process, ownership, authenticity, and the future of creative labour are no longer theoretical. They press at the centre of the work.
As we reflect on this moment, one question lingers: what is it about human creativity that machines cannot replicate?
Where We Are Now
Between 2018 and 2022, tools like GPT-2 and GPT-3 began to surface in public and professional spaces. At first, they seemed like novelties. By 2022, however, GPT-3 was producing fiction samples convincing enough to be published. Quietly, many writers began using these systems for query letters, résumés, and even sections of novels.
By 2023, the conversation could no longer be ignored. That same year, I co-founded The Stygian Society in response to a publishing industry increasingly shaped by algorithms rather than literary merit. In 2024, NaNoWriMo announced that AI would be permitted in its annual challenge. For many of us, this decision clashed with the spirit of what the event had once represented: the struggle and triumph of human creativity. All the Municipal Liaisons for the Montreal area resigned, including myself. That moment led directly to the founding of The Order of the Written Word, a challenge committed to intentional, author-driven storytelling.
The larger industry was changing too. HarperCollins and other publishers began licensing titles for AI training. Copyright battles erupted. In 2025, the debate reached a new height as NaNoWriMo itself shut down, citing financial struggles, public backlash, and community loss of trust. At the same time, courts began ruling in favour of tech companies using copyrighted works without permission for training datasets, though with warnings that the legal framework remains unsettled.
All of this is unfolding against a backdrop where writers’ incomes have dropped drastically over the past two decades, while demand for fast, cheap content has surged. In this climate, it’s hard to see AI as simply a tool. It is, instead, a competitive force.
Drawing the Line
I am not ethically opposed to AI as a whole (though I am concerned about its environmental impact). I use it in my marketing work, where speed and volume matter, and where clients increasingly expect these tools to be part of the process. I sometimes ask it for a list of subject lines, not to use directly, but to react against. Sometimes, reading AI’s flawed drafts frustrates me in a competitive way, pushing me to write stronger work.
But I do not use AI for fiction.
Writing story scenes, building characters, crafting dialogue… these are acts of authorship, not text created solely to serve a functional purpose. They require lived experience, risk, and emotional truth. To outsource them would not just cross a personal boundary, but would also raise larger questions about consent. After all, many of these systems are trained on the work of authors who never agreed to participate.
For me, the difference comes down to this: AI can be a tool when used to organize thoughts, overcome inertia, or provide accessibility. It becomes a problem when it replaces the author’s role in creation, or when it relies on the unconsented labour of others.
A Philosophical Risk
The risks here are not just practical or legal. They are philosophical, too. Martin Heidegger wrote about technology as a way of “challenging-forth”: forcing the world into production, treating everything as a resource to be extracted. In this mindset, even human beings become “standing-reserve,” stripped of depth and meaning.
AI writing risks doing the same to storytelling. Instead of acts of revelation and dialogue, stories risk being reduced to content pipelines: outputs shaped by algorithms, optimized for consumption rather than meaning. This is where the danger lies: not in the existence of machines, but in losing our own ability to let stories emerge in their full, human weight.
Yet Heidegger, quoting the poet Hölderlin, also reminds us:
“But where danger is, grows the saving power also.”
By resisting the reduction of art to mere content, by holding space for what is slow, intentional, and deeply human, we reclaim storytelling as a form of presence rather than production.
The Lazy Producer
To put this in practical terms, I often think about what I call the “Lazy Producer.” This is not meant as an insult, but rather to describe a way of creating that meets surface-level demand without risk, texture, or investment. The Lazy Producer makes things that look like writing, but lack its essential gravity. And algorithms reward this kind of work, feeding it back to us in endless cycles.
This is where gatekeeping, often treated as a dirty word, can play a role. When practised as care rather than exclusion, gatekeeping can protect spaces where thoughtful, challenging, and human-centred storytelling continues to thrive. It can help preserve the integrity of literature against the pull of convenience and replication.
What We Can Do
So, what can writers and readers do now?
Write with your own words. Resist the pressure to hand over your first draft to a machine.
Read widely, both recent and older works. Support authors directly when you can, or use libraries when you cannot.
Label your work if it is AI-free. Readers are starting to care.
Support other writers making the same choice. Share, review, and recommend their work.
Ask questions in your writing communities about how AI is being handled.
Build or join circles where human authorship is valued.
Keep a record of your own process. Drafts, notes, and screenshots are proof of your authorship.
These are small steps, but ones that are concrete and important. Each act of showing up with your full human presence counts as an answer.
A Clearing for Storytelling
We began with a question: When machines can write, what is it about the human hand that code cannot replicate?
The age of AI will continue to test us, but the pendulum does not swing in only one direction. When the noise of replication grows too loud, it is authenticity that will cut through. The voice that was not flattened. The meaning that was not manufactured.
This is not a battle between humans and machines. It is a question of meaning versus mechanism. And meaning, the kind born of human struggle and creativity, will always matter more.