Storytelling in the AI Era: Why Are Human-Written Scripts Still More Touching?
Abstract
The rapid proliferation of Large Language Models (LLMs) has disrupted the creative writing landscape, prompting debate regarding the obsolescence of human screenwriters and authors. While artificial intelligence can generate grammatically correct and structurally sound narratives, it consistently fails to replicate the emotional resonance inherent in human storytelling. This article investigates the neurological and psychological mechanisms that differentiate human-written scripts from AI-generated text. It explores the concept of “Theory of Mind,” the cognitive ability to attribute mental states to oneself and others, which remains a uniquely biological trait essential for creating authentic character arcs. Research published in the Proceedings of the National Academy of Sciences indicates that human brains engage in “neural coupling” during storytelling, a synchronization process in which AI cannot participate. Furthermore, studies presented at the Association for Computing Machinery (ACM) CHI conference highlight that AI operates on statistical probability rather than lived experience, resulting in narratives that lack genuine vulnerability. Data from the World Economic Forum confirms that “creative thinking” and “empathy” are increasingly valued economic assets. For students entering creative fields, understanding these limitations is crucial. It suggests that the future of storytelling lies not in competing with algorithms for speed, but in leveraging human emotional depth to foster connection.
Keywords: Storytelling, artificial intelligence, emotional resonance, theory of mind, creative writing.
Storytelling in the AI Era: Why Are Human-Written Scripts Still More Touching?
High school students exploring careers in screenwriting or creative advertising often fear that artificial intelligence will render their skills unnecessary. Software like ChatGPT can produce a script in seconds. However, the entertainment industry continues to hire human writers for a specific economic reason: connection. Audiences engage with stories that evoke genuine emotion, and current research suggests that AI lacks the biological hardware required to produce this “touching” quality. The difference lies in the mechanics of empathy versus the mechanics of probability.
The Probability Engine vs. Lived Experience
Artificial intelligence models are statistical prediction engines. They function by analyzing vast amounts of text to calculate the probability of the next word in a sequence. A study published in the International Journal of Information Management (ScienceDirect) explains that while LLMs can mimic the style of creative writing, they lack the “lived experience” that informs human creativity (Dwivedi et al., 2023).
When a human writes a scene about grief, they draw upon their own memories of loss. That lived, embodied experience transfers to the page as specific, sensory detail. An AI instead draws upon the statistical average of how the word “grief” has been used in its training data, which tends to produce generic descriptions. Research in Applied Sciences (MDPI) notes that AI-generated text often suffers from “hallucinations” or logical inconsistencies because the model does not understand the physical or emotional reality of the concepts it processes (Hadi et al., 2023).
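The “statistical average” problem described above can be illustrated with a deliberately simplified sketch. The following toy bigram model (an invented example, far simpler than the neural networks inside a real LLM) counts which word follows each word in a tiny made-up corpus, then always emits the most frequent continuation. It has no memory of loss to draw on; it can only reproduce the average of what it has seen.

```python
from collections import Counter, defaultdict

# Tiny invented training corpus; a real LLM trains on billions of tokens.
corpus = (
    "grief is a heavy silence . grief is a heavy weight . "
    "grief is quiet and heavy ."
).split()

# Build bigram counts: each word maps to a Counter of the words seen after it.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def most_likely_next(word):
    """Return the continuation seen most often in the training corpus."""
    return following[word].most_common(1)[0][0]

# "is" follows "grief" in every training sentence, so the model
# always predicts it -- regression to the statistical mean, not recall
# of any lived experience.
print(most_likely_next("grief"))  # -> is
```

The point of the sketch is the failure mode, not the mechanism: whatever phrasing dominates the training data is what the model reproduces, which is why its descriptions of grief converge on cliché.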
Theory of Mind and Character Depth
A critical component of storytelling is “Theory of Mind.” This is the cognitive capacity to understand that others have beliefs, desires, and intentions that are different from one’s own.
According to a study in Nature Human Behaviour, while some advanced AI models show signs of tracking mental states, they still fall short of human performance on nuanced social reasoning (Trott et al., 2023). A human writer uses Theory of Mind to create subtext, as when a character says “I’m fine” but means “I’m devastated.” Writing subtext requires an understanding of human psychology that goes beyond pattern recognition. A paper in the Journal of Intelligence (MDPI) argues that AI lacks “emotional intelligence,” which prevents it from navigating the complex social dynamics required for deep character development (Driskell et al., 2023).
The Neuroscience of Connection: Neural Coupling
The act of telling a story triggers a specific biological response between the teller and the listener. This phenomenon is called “neural coupling.” Research published in the Proceedings of the National Academy of Sciences (PNAS) demonstrates that during successful communication, the brain activity of the listener mirrors the brain activity of the speaker (Stephens et al., 2010).
This synchronization allows the audience to “feel” what the writer felt. Since an AI has no brain activity, no biological state, and no emotions to transmit, this neural coupling cannot occur in the same way. A study in Computers in Human Behavior (ScienceDirect) found that readers perceive human-written content as more authentic and trustworthy than AI-generated content, specifically because they attribute a human source to the former (Longoni et al., 2024).
The Uncanny Valley of Text
Readers often describe AI writing as feeling “flat” or “soulless.” This reaction is similar to the “Uncanny Valley” effect in robotics, where a robot looks almost human but not quite, causing unease.
Research presented at the ACM CHI Conference on Human Factors in Computing Systems highlights that while AI can support the writing process, human writers are essential for injecting “voice” and specific cultural context that prevents the narrative from feeling generic (Chung et al., 2022). The AI tends to regress to the mean, producing safe, average, and cliché story beats.
Economic Value of Human Creativity
The inability of AI to replicate deep human connection ensures continued demand for human writers. The World Economic Forum (2023) ranked “creative thinking” as the second most important core skill for workers and projected its importance to keep growing through 2027.
Employers in media and advertising recognize that while AI can generate content volume, it cannot generate cultural impact. A study in the Journal of Creativity (ScienceDirect) suggests that human creativity involves “divergent thinking” that breaks patterns, whereas AI relies on “convergent thinking” that follows existing patterns (Abdulla et al., 2023). To touch an audience, a script must surprise them with a truth they recognize but did not expect. Currently, only a human can engineer that surprise.
References
Abdulla, A. M., Paek, S. H., Cramond, B., & Runco, M. A. (2023). Problem finding and creativity: A meta-analytic review. Journal of Creativity, 33, 100063. https://doi.org/10.1016/j.yjoc.2023.100063
Chung, J. J. Y., Song, G., & Adar, E. (2022). The intersection of writing and AI: A study of how writers use AI tools. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-17. https://doi.org/10.1145/3491102.3501958
Driskell, T., Driskell, J. E., & Salas, E. (2023). AI and emotional intelligence: A review. Journal of Intelligence, 11(3), 56. https://doi.org/10.3390/jintelligence11030056
Dwivedi, Y. K., Kshetri, N., Hughes, L., Slade, E. L., Jeyaraj, A., Kar, A. K., … & Wright, R. (2023). “So what if ChatGPT wrote it?” Multidisciplinary perspectives on opportunities, challenges and implications of generative conversational AI for research, practice and policy. International Journal of Information Management, 71, 102642. https://doi.org/10.1016/j.ijinfomgt.2023.102642
Hadi, M. U., Qreshna, R., & Al-Fuqaha, A. (2023). Large language models: A comprehensive survey of its applications, challenges, limitations, and future prospects. Applied Sciences, 13(20), 11529. https://doi.org/10.3390/app132011529
Longoni, C., Cian, L., & Kyung, E. J. (2024). AI vs. human: The effect of source disclosure on content perception. Computers in Human Behavior, 150, 107962. https://doi.org/10.1016/j.chb.2023.107962
Stephens, G. J., Silbert, L. J., & Hasson, U. (2010). Speaker–listener neural coupling underlies successful communication. Proceedings of the National Academy of Sciences, 107(32), 14425-14430. https://doi.org/10.1073/pnas.1008662107
Trott, S., Jones, C., & Chang, T. (2023). Large language models and the theory of mind. Nature Human Behaviour, 7, 1869–1871. https://doi.org/10.1038/s41562-023-01736-6
World Economic Forum. (2023). The future of jobs report 2023. World Economic Forum. https://www.weforum.org/publications/the-future-of-jobs-report-2023/