Last week, we published the story of “Victoria Goldiee,” a freelancer with a promising pitch and bylines at a series of impressive outlets who wasn’t what she seemed. After we dug into her published work—finding fabricated quotations from real people, possibly created using AI—four separate publications, including The Guardian and Dwell, took down her stories.
In the days since, I’ve heard from editors around the world who’ve had their own run-ins with “Goldiee.” The editor of a small Indigenous-led newsroom on the West Coast emailed, explaining she had just assigned the freelancer a story earlier that week. An editor in the States had been in the middle of editing a draft Goldiee had submitted about people squatting on public land, apparently drawing on the author’s own experience living in a van for years, when she “decided to Google Goldiee’s name one more time” and found my article. (You can read that account here.) Countless more reached out to say they’d been pitched by her—for stories in The Globe and Mail and a local paper in Tulsa, niche publications on everything from military service to Grand Prix racing, a brand-new books publication in Oakland, and the 150-year-old scientific journal Nature. Truly, the range was breathtaking.
But the story here is not just of a single prolific pitcher who appears to be using AI to try to dupe the editors of the world. The bigger question in my inbox, and in a conversation that has grown across the wider journalism world, is: How widespread is this kind of thing? And, more pressingly, in a universe where AI text can be produced with the click of a button and a pitch is no longer intrinsically connected to the human who sends it, what do publications need to do to protect themselves and their readers?
We’ve been having these discussions at The Local as well, and you can see the results in our AI policy, which we are announcing today. The core of the policy is pretty simple. The Local’s stories are researched, written and edited by human beings; our photographs are taken by human beings; our illustrations are drawn by human beings.
After we published our piece last week, The Walrus’s editor-in-chief Carmine Starnino got me on Zoom to talk. The conversation, first published at The Walrus, touches on modern fabulists, the role of AI in the newsroom, and what all this means for journalism.
Carmine Starnino: Two years ago, Sports Illustrated published a batch of fictitious stories by non-existent writers. All of it was AI-generated—even the author portraits. They swiftly took the content down. It isn’t hard to find similar examples of generative fakery. So, in one sense, there’s nothing fundamentally new in what you reported. The increasingly synthetic nature of journalism is something we have to deal with and worry about all the time. But your story feels different. Why do you think the reaction to it has been so dramatic?
Nicholas Hune-Brown: That’s a good question. The example of the Sports Illustrated stories is, in my mind, pretty reprehensible. But in that case, the shell of a once great magazine decided to use AI to create stories. This is more a situation where you see publications being duped by bad actors who, aided by technology, are sneaking in stuff that appears totally made up. And if you’re an editor, that’s a scarier prospect. Because you’re not choosing to do that. You’re getting tricked. And I think that element of deception makes it different.
CS: And why? To be clear, what makes it scarier?
NHB: Whenever I’ve gone into journalism classes over the past 15 years, I’ve always said that an amazing pitch can be this key that opens doors you never thought could be opened. That’s how I got my career, just writing good pitches. Someone doesn’t have to know you, but they can see your ability on the page. Now, as an editor, I open my inbox and I understand there doesn’t need to be any link between the pitch and the person. They can be completely separate things. And I don’t know how you work in that kind of world.
CS: That brings me to the next part: why do it? Sure, journalism has always had its fabricators. Stephen Glass, who you cite, being an obvious example. But in those cases, the deceptions were the act of an otherwise gifted writer working in deranged fealty to a very traditional idea of reporting. They lied, yes, but they came up with the lies themselves! It was still work! But to hand over the entire commissioning cycle to AI—from pitch to draft—feels like a win with no victory. Why go through the effort of making no effort?
NHB: This is what I was obsessing about for months. I don’t have any clear answers. At certain points, I thought Goldiee was one name out of many bylines that a content farm was using, the same way they might run other email scams. I don’t think that’s the case anymore. I think it’s a real individual. I can speculate about psychology or whatever, but one thing to remember is what this technology enables in a global world of wild inequalities. AI can make gaps in geography or cultural understanding disappear. Based on what this individual has written, and some elements in their social media before it disappeared, I believe they’re either from or still live in Nigeria. And if that’s the case, a terrible word rate at a publication in the United States could actually be a pretty decent payday—especially if all you have to do is enter a prompt into ChatGPT.
CS: Fascinating. That almost makes it less a story about deception and more one about economic conditions.
NHB: I think this is something we’re going to see across different industries. After I published the story, I heard from a social scientist who said they encountered the same thing when trying to do qualitative studies. This is where you put out a call and offer a small honorarium, right? And in the last year, they’ve looked at the IP addresses from participants when they come online, and it seems many are in Nigeria. I don’t know why Nigeria exactly. But these people were using AI to give generic responses, and then, in the Zoom call, turning off the camera and, between long pauses, presumably ChatGPTing answers to the researcher’s questions. So, they were kind of at a loss about how to keep doing the qualitative studies they usually do.
CS: Some of the places that published her were not fly-by-night places. As you hint, maybe they’re under pressure to feed an endless demand for content, maybe the editorial teams are leaner and just don’t have the resources to vet stuff. Or the staff is being crowded into revenue-generating activities that make it harder to do the one thing that might have helped: edit properly. What jolted me about your piece, to be honest, is that it’s more than a judgment on the Victoria Goldiees of our time. It’s also a judgment on an industry that found her writing credible enough to let through.
NHB: Like I wrote, this is only possible in an incredibly degraded media environment. Some of the pieces that went up obviously weren’t fact-checked, but they can’t even have been read with a critical eye.
CS: Let’s flip this around. Does it really matter if there’s no human behind the text? Does it fundamentally change the meaning of what’s there? Me, I like to think that there’s a type of irreducible value created by the knowledge that someone wrote something—that it wasn’t a machine. But is that just sentimental? Do you feel that it’s vital the texts we publish are written by actual people?
NHB: I see no value in a piece of writing that, structurally, looks like it’s done by a human but does not actually convey any human experience. It has no value to me, even if I might not be able to spot it at this point, unfortunately. Take that piece in The Guardian that I mentioned in my story, about experiencing underground music and moving through all these spaces in England. It was very vividly written and moving in a way that stirred some commenters. But it didn’t express anything real. I mean, I’m not sure Goldiee ever lived in England.
CS: But you were impressed by it.
NHB: It was a good ChatGPT piece. It was impressive, and I could see why anyone would be fooled by it. I could see why they would enjoy it. But it has no value to me if it’s not created by a person. When I think about most of the work we do at The Local, I can’t see any way a computer can do it—phone someone up, talk to them, discover new things. That has value. You’re asking an interesting philosophical question. I’m just not into any of it right now.
CS: My editor-in-chief role at The Walrus has overlapped almost perfectly with the ChatGPT era. And so, I’ve been obsessing about it a fair bit. I find it’s shifting how I think about pitches, moving away from questions like, “Is this interesting enough to report out?” and towards, “Is there a mind at work here that readers want to spend time with?” We’re inching toward more seasoned writers—writers who can turn a phrase and have a vocal print. And we know that because they’ve published other things. They have a voice and have built a career around that voice.
Chatbots can replicate the skin of their sentences, but can’t incarnate the substance. That is probably too weird for what we’re hoping for in this conversation!
NHB: No, that’s interesting. I probably thought that way a year ago. I thought there was something innately human that a chatbot can never synthetically recreate. And I don’t think so anymore. I don’t know if there’s a way anymore to see a piece of writing and recognize the human soul behind its structure. Maybe there are still some stilted ChatGPTisms that give it away—people always flag em dashes, which I take such offence to as someone who uses a lot of them! But in a few months even those tells might be gone. So, my instinct is almost the opposite. I think the thing that cannot be replaced at all is the reporting. I think I value the person who has sat through events, who has picked up the phone and spoken to real people, sometimes people no one else has spoken to. I’m thinking that is the differentiating quality.
CS: Maybe we need a marriage of both? Writers who write in a way that’s inherent to them and who can couch that writing in reporting that’s also original.
NHB: The other thing you said that struck me, and is kind of tragic, is that the easiest way to tell that someone’s legit and that the pitch is good is their body of work. If they’ve written for you, and you know them, that is a safe way to do things. Part of what we want to do at The Local, part of what I’m always trying to do, is develop new writers. I know that’s what you folks try to do as well. That’s literally in our mandate. If I’m trying to assign seven stories, I want two of them to go to new and emerging writers who maybe haven’t done this before, but whom you trust you can work with and train up. I don’t know how that trust can happen now. I don’t know how young freelancers get their foot in the door in this new world. We’re not going to give up on it and just work with people we know, but it’s definitely something we have to figure out.
CS: Is this our equivalent of doping in sports—something we have to stay vigilant against?
NHB: Honestly, I think you guys are okay. I don’t think the Victoria stories that got published on other websites would get published in The Walrus or The Local. I think you have robust editing and a great fact-checking process. I’m not worried about that. I’m more worried about the front end being inundated with garbage. How do you find what’s good, right? I doubt those publications that got burned are going to hire back their copy desks and fact checkers, but that is the most obvious move. That is, if you actually care about this stuff. If you don’t care—which I think is the case with some of these publications who are increasingly in the content business and don’t mind where it comes from—then that’s a different story.
CS: But that’s exactly it. I do wonder if more of the industry is becoming tolerant of this kind of synthetic content, especially if they’re already doing away with the quality-control measures. My worry isn’t just the front end. Look at what AI has done to search. Readers are already being trained to see summaries as being perfectly adequate proxies for journalism itself.
NHB: And you can see the numbers already, right? Places now see, say, 70 percent less traffic.
CS: Yes. My worry is that it could have an effect on newsrooms—encourage them to empty their room of news. I worry that operations whose sense of journalism is maybe not as acute as ours, not as ethically rooted as ours, are starting to feel that if, well, readers don’t care, maybe it becomes an opportunity to make things a lot easier on themselves. What if The Walrus and The Local end up among pockets of rearguard thinking in an industry that will hand over a lot of its content to large language models? I mean, if you’re running a media org, couldn’t you just fire everybody and chatbot your way into clicks? Ten articles a day, 20 an hour, 30 every 15 minutes—in different voices, on every subject under the sun, conjured instantaneously and all for the price of a subscription to Claude or ChatGPT. Isn’t that a reality we might be heading towards? What if Sports Illustrated is our future and not a blunder? If there isn’t a desk somewhere deep in Condé Nast mulling this, I’d be surprised.
NHB: That would be a way to kill what remains of the journalism industry. You can’t give an inch to it. You can’t have a mix of things that are true and not true in your news publication or the whole thing is destroyed. There’s no space for that at all. That’s where I’m at right now. And I think, economically, it doesn’t make any sense. It might in the short term, but as search gets degraded, as this fake stuff proliferates across the internet, there is going to be real value in being able to go to a place and know the stuff there is true and human. I hope, and trust, it will be something readers find valuable.
Victoria’s online writer’s portfolio came down following her conversation with The Local.