(This essay originally appeared on AdatoSystems.com)
Over the last few weeks, there's been a lot of chatter about ChatGPT, a writing tool built by OpenAI. So much noise, in fact, that everyone from CNN, the NYT, Forbes, the Atlantic, the Washington Post, the Guardian, the BBC, TechCrunch, and CNet to approximately a half billion techbruhs on YouTube has had to sound off on it. All in the last two weeks.
The opinions range from incredulous to breathless to skeptical - albeit carefully so. Nobody really knows what the next few weeks will bring, and therefore nobody is willing to declare ChatGPT entirely one thing or another.
For my part, I think it's important to remember that computer-based writing existed long before this moment. According to one of my friends who was there at the time, "Microsoft fully replaced all of its journalists for MSN with AI in 2020." Furthermore, automated journalism has been in play since at least 2017.
A difference (perhaps THE difference) with what's happening now is that - while earlier iterations have been... limited (and that's being both kind and generous) - ChatGPT is decidedly not-awful. But that's not the same as "good". It's not even clear if ChatGPT is "good enough" because nobody can quite agree on WHAT the output would be good enough FOR.
In the last few days alone, I've seen folks claim ChatGPT means the end of student essays up to and including at the college level. I've seen an analysis of the biases accidentally introduced into ChatGPT due to the algorithmic training model used to set it up. And most memorably (for me at least) I've seen people finding out the limits of what ChatGPT can know, including what is (and isn't) kosher.
With all of that said, I work in a sector of I.T. broadly considered technical content marketing - which I define like this:
Creating content that speaks with a personal, human, and specific voice to the audience in a way that creates a connection to a problem, journey, or experience, and thereby earns permission to talk about a feature, product, or capability.
Therefore I'm choosing to stay in my lane and reflect on what ChatGPT means to me and to folks who do work similar to mine.
In short, I think (again, in the context of my work) that ChatGPT is another form of a content farm and not much more. Or less.
Content farms are nothing new. They've been around since at least 2010. Whether they are effective or not depends on what one is trying to achieve. In some respects, ChatGPT (and all the systems like it) share the same benefits and drawbacks as content farms do.
ChatGPT opens up a possibility which was imagined but not truly viable until now. I'd like you to imagine an automated pipeline:
- First, query for high-ranking SEO phrases in a given domain, industry, or broad subject area.
- Next, feed the results into an AI system to generate content based on those SEO phrases.
- Finally, use the AI-generated content to build new webpages.
The result would be the ability to instantly and inexpensively identify the questions people are searching for on the internet, and to drive traffic to your site by creating tailored webpages in near-realtime.
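To make that concrete, here is a minimal sketch of what such a pipeline could look like. Everything in it is an assumption rather than a description of any real system: fetch_top_seo_phrases is a hypothetical stand-in for whatever keyword-research service you'd use, the generation step uses the (pre-1.0 style) OpenAI Python client with a chat model - and note that as I write this there is no public ChatGPT API - and the "publish" step just writes bare-bones HTML files instead of talking to a real CMS.

```python
# Sketch of the SEO-to-webpage pipeline described above. All names here are
# illustrative placeholders, not a real product or service.

import openai  # pip install openai (pre-1.0 interface assumed)

openai.api_key = "YOUR_API_KEY"  # placeholder


def fetch_top_seo_phrases(subject: str, limit: int = 10) -> list[str]:
    """Hypothetical: return high-ranking search phrases for a subject area.
    In practice this would call whatever keyword-research service you pay for."""
    raise NotImplementedError("Plug in your keyword-research source here.")


def generate_article(phrase: str) -> str:
    """Ask the model for a short article answering the search phrase."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You write short, SEO-friendly articles."},
            {"role": "user", "content": f"Write a 400-word article answering: {phrase}"},
        ],
    )
    return response["choices"][0]["message"]["content"]


def build_webpage(phrase: str, body: str) -> str:
    """Wrap the generated text in a bare-bones HTML page."""
    return f"<html><head><title>{phrase}</title></head><body>{body}</body></html>"


def run_pipeline(subject: str) -> None:
    """Turn each high-ranking phrase into a published-ish page."""
    for phrase in fetch_top_seo_phrases(subject):
        html = build_webpage(phrase, generate_article(phrase))
        with open(f"{phrase.replace(' ', '-')}.html", "w") as f:
            f.write(html)  # in reality you'd push this into your CMS
```

Nothing in that sketch is technically difficult - which is rather the point.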
The problem with this is that the pages will be mostly crap. The content you get will be at the level of "robotic college students". Maybe that's all you want it to be, but you have to accept that's the (current) limit.
But the hard truth is that some folks will accept that limit, and build the SEO-to-webpage machine. The fact that this is almost surely going to happen (and soon) raises a few important questions. The first, recently addressed by Josh Bernoff on his "Without Bullshit" blog, is what this means for writers of technical content:
What this means is that ordinary, workaday writing is no longer a writer’s work. What makes writing worth doing?
Original insights. AI doesn’t have those.
Engaging prose. AI is weak on that.
Wit. AI still lacks that.

For writers, the job is now to do a better job than a machine. You can also succeed by using the machine to generate prose and improving it. But you can’t just write ordinary text anymore.
But the second question is more essential for content creators within companies. You see, if we technical content writers push back hard by pointing out the poor quality of this new content machine, companies will ask us to bridge the quality gap by inserting humans into the pipeline. Most likely that will take the form of having people "fix" the ChatGPT output (which will often mean significantly rewriting it); or simply having humans take the place of ChatGPT entirely and attempt to churn out as much SEO-driven, SEO-selected content as fast as (humanly) possible.
Either of those outcomes raises an essential question: What do companies want staff to spend time doing?
In the not-so-distant past (and still today), time was spent generating content that satisfied relevant SEO terms, albeit with a slower pipeline than the one ChatGPT will drive. This content didn't have to be amazing; it just had to answer the question. Some companies go so far as to pay SEO firms to generate lists of relevant search terms and use those as the source list for upcoming articles.
The main issue I and others in my line of work have with any of these SEO-to-webpage strategies (whether driven by artificial or good old human intelligence) is that they prioritize speed and quantity over quality, leading readers to associate the brand with fast, cheap, crappy writing - which never leads to engagement, credibility, or trust.
ChatGPT is simply shining a brighter light on the core failing of this mindset: it devalues and dehumanizes the writers themselves. Before, we could complain that "you could hire a monkey to write this" but it wasn't literally true. Now it is.
I'd suggest that the lessons we learned from the first forays into content farms back in 2010 still hold true:
- Most content farms are not useful or beneficial to search engine users.
- The content is shallow.
- The content is largely listed as "anonymous", reducing credit and credibility.
If "Without Bullshit's"' insight is true - that the advent of AI-driven content creation means human writers need to strive harder to create with original insights, engaging prose, and wit - then it's equally true that businesses need to both challenge and enable their content creators to produce work that is insightful, engaging, and witty.
That means being thoughtful about the topic, how that topic is structured for the reader, and the format used to present it. It means providing both time and resources to create this type of content. It means trusting the creators themselves to know what will engage audiences.
It means - as it always has - that content doesn't exist for its own sake, or simply to satisfy a search query. It exists to build a human connection between the person asking the question and the person taking time out of their day to answer it.