Human This Christmas

Everyone in my professional life — fellow faculty members, other writers — is up in arms about ChatGPT, the new artificial intelligence tool that can write like a human being.

Tech is not supposed to be human. It is only ever supposed to be humanoid. But this chatbot can take multiple ideas and whip up a cogent paragraph. The professional classes are aghast.

Some of us professors are primarily obsessed with assessment and guarding the integrity of, well, everything. We scan essays into proprietary cheating detectors and tut-tut when a program finds a suspiciously high proportion of copied text. For at least 10 years, academics have fought over the right way to root out computer-assisted cheating. Should we build better tests or scare students straight like a 1980s after-school special? We are split.

ChatGPT is so good that we aren’t sure if using it even constitutes cheating. The paragraphs it offers are original in that they aren’t copied from another text. It can even insert citations, protecting our academic culture of credit. Whether or not those references are accurate, inserting them conforms to the style of academic writing. The journal Nature has asked whether the technology should worry professors.

I would be worried, except my profession has been declared dead so many times that I’ve bought it a funeral dress. The humanities are not dead. Writing isn’t dead. And higher education will hobble along. You know why? For one, because this technology produces really creepy stuff.

A.I. writes prose the way horror movies play with dolls. Chucky, M3GAN, the original Frankenstein’s monster. The monster dolls appear human and can even tell stories. But they cannot make stories. Isn’t that why they are monsters? They can only reflect humanity’s vanities back at humans. They don’t make new people or chart new horizons or map new experiences. They are carbon copies of an echo of the human experience.

I read some of the impressive essays written with ChatGPT. They don’t make much of an argument. But neither do all writers, especially students. That’s not a tell. A ChatGPT essay is grammatically correct. Writers and students often aren’t. That’s the tell.

But even when the essays are a good synthesis of other essays, written by humans, they are not human. Frankly, they creep me out precisely because they are so competent and yet so very empty. ChatGPT impersonates sentiment with sophisticated word choice, but still there’s no élan. The essay does not evoke curiosity or any other emotion. There is a voice, but it is mechanical. It does not incite, offend or seduce. That’s because real voice is more than grammatical patternmaking.

Voice, that elusive fingerprint of all textual communication, is a relationship between the reader, the world and the writer. ChatGPT can program a reader but only mimic a writer. And it certainly cannot channel the world between them.

I was in the grocery store this week. Everything is holiday music. I love the different genres of Christmas music. In my life, it isn’t the holiday season until the Temptations’ “Silent Night” spills from a public speaker. It isn’t good enough for me to cue up my own selection; I want other people playing it. I want to hear it in a store or spilling from a Christmas tree park or a car. That’s how I know the season still has meaning as a tradition that calls strangers into communion, if only for the few moments when we hum a few bars of “Silent Night” together in a grocery store aisle.

This store was playing a song by a group called Pentatonix. I looked it up to be sure. The song was musically sound, as far as I could tell. The notes were all in the right places. But it had been filtered in the way that mechanical Muzak covers transform actual songs into mere sounds: technical holiday music. And it didn’t call anyone into the season, I can tell you that.

That’s the promise of ChatGPT and other artificial approximations of human expression. The history of technology says that these things have a hype cycle: They promise; we fear; they catch hold; they under-deliver. We right-size them. We get back to the business of being human, which is machine-proof.

This is a great time to think about the line between human and machine, lived experience and simulation. There are 1,000 holiday traditions. All of them call us back into the space of being more human than machine. Less scheduled, more present. Less technical, and messier.

Humanities, arts and higher education could use a little reminder that we do human. That’s our business, when we do it well. We are as safe from ChatGPT as the Temptations are from Pentatonix.

What I Am Up To

I talked with Trevor Noah for his final week hosting “The Daily Show.” You can watch our conversation here. Trevor ended his seven-year tenure with an impassioned plea to broaden and deepen our culture’s pool of experts. I am smarter because I look for organic genius. Trevor and I share that value.

I recently talked with NPR’s “Pop Culture Happy Hour” about the modern western “Yellowstone.” The show is in its fifth season, and you may be bingeing it over the holidays. I don’t recommend doing it all in one sitting. The host Linda Holmes and I talked about watching “Yellowstone” like your parents once watched soap operas: in doses, and with a healthy sense of perspective on its latent politics.

What’s on My Mind

The Biden administration brought Brittney Griner home and signed the Respect for Marriage Act into law. There is always something to fight about, but these are indisputably good things. Thanks, President Biden.

If we are going to fight, let’s let it mean something. The spectacular explosion of FTX and Elon Musk’s heel turn at Twitter say it is high time we debate what I have called “scam culture.”

Tressie McMillan Cottom (@tressiemcphd) is an associate professor at the University of North Carolina at Chapel Hill School of Information and Library Science, the author of “Thick: And Other Essays” and a 2020 MacArthur fellow.