Is AI About To Change How We Write…Again?

[Image: A row of Lego heads, each with a different facial expression, a nod to AI developing its emotional intelligence.]

Not too long ago, in a post about using AI to generate a case study, we wrote the following statement: “Generative AI is not quite there yet. In short, AI tools fail to write like a human. Which is exactly what we’d expect, because…well, AI isn’t human.”

To put it another way, we argued that AI’s lack of emotional intelligence limits its ability to effectively influence the behavior of readers. And that’s a vital ingredient in case studies, copywriting, and other pieces of content designed to motivate people to take action.

Not long after we put that post live, we came across a piece about how the rise of Emotion AI could potentially invalidate (at least some of) the conclusions we drew in it…which is sort of apropos for writing about AI right now, because things can change literally overnight!

Let’s dive into some of the implications of relevant research and look at other recent developments that might influence how writers can work with AI in 2024.

Can AI understand emotions now…?

The idea that the output from GenAI tools like ChatGPT and Bloom can be influenced by emotions isn’t wholly new. In July 2023, researchers affiliated with the Institute of Software (Chinese Academy of Sciences), Microsoft, William & Mary, and other institutions published findings on how LLMs respond to emotional stimuli.

That piece of research, which has since been revised several times (it’s now on its seventh version), found a 10.9-percent average improvement across performance, truthfulness, and responsibility metrics when prompts were imbued with emotional stimuli. The researchers found, for example, that appending “this is very important to my career” to a prompt led to “better” output.
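To make that concrete, here’s a minimal sketch of what appending an emotional stimulus to a prompt might look like in practice, using the OpenAI Python SDK. The model name, the task, and the surrounding wording are our own illustrative choices, not the researchers’:

# Minimal sketch: appending an "emotional stimulus" to a prompt,
# in the spirit of the research described above.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

base_prompt = "Write a two-sentence product description for a reusable water bottle."
emotional_stimulus = "This is very important to my career."  # the stimulus quoted in the study

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative choice; any chat model would do here
    messages=[{"role": "user", "content": f"{base_prompt} {emotional_stimulus}"}],
)

print(response.choices[0].message.content)

In the study, one-line additions like this were enough to move the metrics; for working writers, it’s a cheap experiment to run on your own prompts.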

Which raises the question: Is GenAI becoming capable of exhibiting empathy? Not exactly. Although LLMs appear to “understand” emotional stimuli, it’s still early days for all that.

In fact, the researchers involved close out their study with a non-committal conclusion:

“We are positive that more analysis and understanding can help to better understand the ‘magic’ behind the emotional intelligence of LLMs…However, the mystery behind such divergence is still unclear, and we leave it for future work to figure out the actual difference between human and LLMs’ emotional intelligence.”

…and can it influence emotions?

Whether we’re talking about laughably bad GenAI footage of Will Smith eating spaghetti or unsettling liminal horror scenes on TikTok, content generated by artificial intelligence is already capable of influencing our emotions. But it’s worth pointing out that written content generated by AI still seems to lag behind the visual on that front.

If we ask an AI tool to write a story that will make us cry, it could take several paths: it might write something sad, or something designed to make us laugh until the tears come, or it might even insult us until we break down…although hopefully that’s a worst-case scenario.

Ask a qualified human writer to do the same and they could probably deduce what you’re looking for from the context of the conversation. Without extensive prompt engineering, AI doesn’t have that context. In order to get the output you actually want, you need to put a lot in. 

As we’ve written before, the output from AI is only as good as its ingredients.
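As a rough illustration of what “putting a lot in” looks like, compare a bare request with one that spells out the context a human writer would normally infer. The details below are invented purely for the example:

# A rough sketch: the same request, bare vs. with explicit context.
# Everything after "Context:" is the kind of detail a human writer would
# pick up from conversation, but an LLM only gets it if you spell it out.
bare_prompt = "Write a story that will make me cry."

contextual_prompt = """Write a short story that will make me cry.
Context: it's for a family newsletter marking my grandmother's passing.
Audience: adults who knew her well.
Tone: warm and bittersweet, not bleak; end on a hopeful note.
Length: about 400 words."""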

Although potential applications of Emotion AI are beginning to emerge, the need for ethical development, along with regulatory challenges, means that they’re progressing relatively slowly.

How can writers keep up with AI developments?

In recent months, we’ve seen tools like ChatGPT used for everything from creating articles and scientific papers to college admissions essays. The last of those pieces notes that AI-generated content can be “vague and trite” and struggles to be “specific and reflective.”

In other words, as we’ve stated before, the robots are here and I still have a job. The overarching sentiment continues to be that when it comes to writing like a human, AI is “not there yet.” We can continue to think of it as a useful assistant, rather than our replacement.

But if it feels like the GenAI space is developing rapidly, that’s because it is. LLMs are designed with rapid growth and “learning” in mind. Even though the examples linked above demonstrate mixed levels of success, it could be a very different story in just a few months.

We’ll continue to cover developments in this space on our blog, but our newsletter is a great way to get more up-to-date takes on the intersection of GenAI and writing, along with tons of other useful content relating to all things word-y!

Art Anthony

Art is a freelance writer and journalist based in the UK who gave up the big city grind to live the country life. His current and past work includes Inverse, Costco, Fitocracy, Recess, and more. His areas of expertise include software/tech, popular culture, travel, and health/fitness. When he’s not writing, you’ll probably find him playing video games, watching American sports, or on a hike.
