
2. Spotting Fake Reporters with Futurism’s Maggie Harrison

May 14, 2024

One journalist, who grew up loving Sports Illustrated, makes a crushing discovery online. (23 min)


Transcript

Maggie: So at that point we're like, okay, this is not a real person.

 

Jay: Right. And that- finding that website kind of opened a can of worms, right?

 

Maggie: Absolutely. Yeah. So then we started kind of piecing through the rest of that website. We figured, you know, if there's one, there might be more.

 

[Bot Beat music swells, then settles]

 

Jay (Narration): This is Bot Beat, a new podcast that tracks how journalists are harnessing AI in their reporting. I’ll do quick chats with reporters and key you in on how exactly these emerging tools can affect your work. I’m your host, Jay Kemp. I’m an independent journalist reporting on AI within my field. In this second episode, I’ll be chatting with Maggie Harrison. She’s a journalist from Futurism, which is a tech-focused magazine that reports on fascinating innovations and how they’re going to change our lives. Obviously, AI qualifies. And this episode’s a little different from the last. Maggie’s reporting made waves, in the news and online, for uncovering a great example of what exactly can go wrong when you misuse AI in journalism – specifically, where AI-generated content has caused quite the scandal at Sports Illustrated.

 

Maggie: So we at Futurism- we're a very small team, but we've been following AI and media pretty intensely since early January 2023. We were the ones- it was my colleague Frank- who broke the CNET story, where CNET was using AI without, you know, explicitly denoting its use. There was a disclaimer, but it was hard to find. And so we've been pretty intent and dedicated in following this beat as, you know, the first wave of AI in media has come to light and people have tried different things- sometimes telling people, sometimes not so much telling people. And I happened upon the Sports Illustrated stories because it was a beat we were covering. And then in late October, there was something very striking. A lot of publications have these kinds of reviews- you know, buying guides, breakdowns, best air fryers of 2024, that kind of content. And there are writers at Reviewed, which is USA Today's affiliate site, who came out with a very striking claim against their owning body, Gannett. The union protecting the writers there took to Twitter and said: we are publicly accusing our owner of publishing AI-generated content, bylined by fake writers, on our websites. Which is obviously- it's very striking, very wild.

 

Jay: It's honestly a shocking claim, that something like that made it to publication, right?

 

Maggie: Yeah. Well, exactly, exactly. For something like that to be normalized within a, you know, respected media body- Gannett is one of the newspaper overlords of America. It's basically between Gannett and McClatchy; they own a ton of local newspapers right across the country. And so I was pretty fascinated-slash-horrified, you know, as somebody who's a reporter, as that news was breaking. And then pretty quickly, following digital breadcrumbs, as I do all day, I tracked that over to Sports Illustrated, where I found the fake writers there.

 

Jay: Totally. And what exactly did that look like? What did you find when you were going into these websites?

 

Maggie: Yeah, so there were all of these posts. It was a separate part of the Sports Illustrated website. And I want to be very clear that it wasn't Sports Illustrated's own staff- it was a third-party contractor who had provided Sports Illustrated with the content, and there was a disclaimer saying the editorial team was not part of this. So essentially, I found a separate part of the Sports Illustrated website that was entirely dedicated to content like buying guides. And because it's Sports Illustrated, it was very sports-centered: workout supplements, different baseball bats, running shoes, yoga pants- consumer goods like that, that would sort of make sense on a Sports Illustrated site. And then at the bottom there were these faces with these bios. The one that really caught wind was Drew Ortiz. You know, he's an outdoorsman who grew up on his parents' farm, surrounded by nature, and loves the outdoors. It was this whole backstory about Drew.

 

Jay: He sounds like a great guy.

 

Maggie: Yeah. You know, a really nice guy. He loves to fish. He loves to be outside. He just loves the farm.

 

Jay: Absolutely.

 

Maggie: We have strong reason to believe that said farm is fake. And, you know, again, if I wasn't looking for it, I wouldn't have noticed. But once you zoom in a bit on the faces, there is that very uncanny valley feeling to them: realistic from far away, but you zoom in and it doesn't quite look human. And we were able to find his face for sale on a website called Generated.Photos. And I have yet to meet somebody who has purchased their face in an online marketplace. So at that point we're like, okay, this is not a real person.

 

Jay: Right. And that- finding that website kind of opened a can of worms, right?

 

Maggie: Absolutely. Yeah. So then we started kind of piecing through the rest of that website. We figured, you know, if there's one, there might be more. And from there we found not just these so-called writers- fake writers with names and these very made-up but hyper-detailed bios, each with their different expertise. You know, one person is good at nutrition stuff, another person is good at batting gear, another person is the yoga and Pilates person. They all had these very specifically crafted, very niche "interests," quote unquote, complete with these AI-generated faces.

 

Jay: Absolutely. And what was your feeling when you came across these things, like your first reaction? Was it kind of like a doom scroll? Or how did you feel when you were looking through this material?

 

Maggie: Immediately it felt very unreal, because- and I think that's really the chord that it hit- my first thought was, no, this is Sports Illustrated.

 

Jay: This can't be.

 

Maggie: This can't be- this is Sports Illustrated. And I'm not saying that I would expect this from any, you know, lesser publisher, because I would not like to see this from any publisher. I don't want to see this online anywhere. But the fact that it was this respected American institution, where past presidents have penned op-eds for this publication, and to see these fake faces hawking these varieties of Amazon goods... And it was also very poorly written. There's no world where I would call this quality content that would actually be legitimately helpful for a consumer, let alone the fact that having a fake person with a made-up bio is inherently misleading to a consumer... I mean, I was an athlete. When I was in middle school, I thought I was going to be on the cover of Sports Illustrated. That's what it means in American society, to a degree: it's a publication that means something to people, that you're excited about. And they have done incredible journalism for decades. So that was really the first feeling. And then it was- anger isn't the word. It was just: oh, that's not good. That's really not good.

 

Jay: Absolutely. And so following that thought along- I mean, I would freak out if I came across this stuff, because, again, this is the kind of publication that we all, you know, revere and value within our American media ecosystem. What was the thread from there? You approached your team, you were going to write about this, you were going to break this story. What were kind of the internal deliberations that went on about whether or not to break the story, what that would look like, et cetera?

 

Maggie: Yeah, that's a good question. I took it immediately to my editor, Jon Christian- he's an incredible editor- and I was going back and forth with him a lot. At first we were like, okay, let's just see how extensive this is. Like, how widespread is it in this magazine? Is it just in this one section, or is it elsewhere? And I can't stress this enough: this was not the work of the staffers, the hard-working journalists, the human writers at Sports Illustrated. This was provided by a third-party company... We then just needed to know more. We didn't want to publish something that was just, oh, look what we found. We wanted to really explore the company itself. So we did talk to some inside sources at that third-party contractor- it's called AdVon Commerce. I talked to some insiders there. They alleged that AdVon does also use AI to write the content. AdVon has denied this, but this source says that at least some content provided to Sports Illustrated by this third-party contractor was crafted using AI generators as well. So that was how we dug a bit deeper into the content itself: where did it come from? What does this whole process look like? And then, once we felt like we'd done our due diligence, we decided to publish.

 

Jay: Absolutely. And, you know, I'm going to kind of poke and prod for a second here. I think you've done a really good job of laying out all of these, you know, specifics: this was not Sports Illustrated, these were not Sports Illustrated writers, these were buying guides- maybe not necessarily articles in the traditional sense. So if all of that is true, why did it have such a huge reaction and backlash within the journalism community, and among staffers at Sports Illustrated, like we saw?

 

Maggie: Yeah. And, you know, one of the biggest threads I saw a lot was, "but it's not news content," which is true, and which is good. I mean, I can't stress enough that, in the realm of what a publication could have been publishing, it is very good that it wasn't news... But "it's not like we need our buying guides to be written by people" wasn't really, you know, the through line. For one, there are writers who could write this and do it better. This just isn't good content in general. Like, this is bad, and it's not up to the standards of a body like Sports Illustrated. But it's the idea that something like this felt subversive. Because there are no disclaimers- there's no disclosure to say, we made up this fake person, no disclosure to say that there was AI involved in any capacity. There is something so inherently misleading about not just using a fake writer, but also leaving out disclaimers and disclosures. It just felt like there was that really deep through line of very misguided, if not backwards, media ethics. And I think that's what really struck a chord in people.

 

Jay: I think I completely agree. I remember you saying something to me the first time we spoke about sort of a consumer-rights lens that you look at this through. Would you mind expanding a little bit on that- what you meant by it and why it matters here?

 

Maggie: Yeah, absolutely. To me, I've found that that's the most useful lens to look at it through, because, again, people have very different thoughts and feelings, in a big way, about AI and media and what AI means as we're grappling with it as an industry. I'm a person who's very motivated by media literacy, especially in this very fractured media and information ecosystem that we live in right now, where truth and reality look very different to very different people. Consumer rights, in the media-and-AI sense, is: what do we as a media community owe to readers and to consumers of news? How do we give the consumer every possible tool to make sense of a piece of information- how they want to engage with it, how they want to interpret it, how they want to metabolize it, how they best understand it? Bylines are a part of it. My name is on every piece of news, every blog, every report that I write. One, that's an accountability piece on my part, but two, it's so that there's an understanding of where something is coming from. When we think about AI in media, it can very easily go into this ideological space of: should news and media ever be written by machines? Should AI have a role? What should that role look like? Those are different questions. But if we look at it through the consumer-rights lens, we can ask: what do we owe readers to best make sense of a piece of information? So to me, including prominent AI disclaimers is a very important part of that right now. I think if anybody's experimenting with AI, they should absolutely tell readers that they're doing so- and if they lose readers as a result, well, that's the consumer-rights piece. It's kind of like having a nutrition label on a piece of food at the grocery store. Somebody can look at the ingredients and say, I don't really want to engage with this, or I'm going to engage with this in a different way. So to me, it's just a due-diligence piece.

 

Jay: And so much of journalism is built on trust. We have all of these ethics for a reason, so that we can create good journalism. Something like this really brings up the question: why? Like, why, why, why?... It's clear that the motivation here is economic. Journalism is a struggling industry, and obviously different publications are trying to supplement their work to get more consumers, more clicks, more views. How do you think the whole SEO-optimization thing fits in here? What was the goal, do you think, of Sports Illustrated in using this content, even if it came from an outside contractor?

 

Maggie: That's a very good question, and I think it's something that everybody who's been engaged in this story at all has really asked themselves. I do think you've hit the nail on the head: the media industry is struggling. And a lot of people have a lot of strong feelings about the SEO industry and what the hunt for SEO has done to the media- and again, to the media and information economy.

 

Jay: And just to be clear, for everyone listening, we mean SEO as in search engine optimization. I probably should have said that.

 

Maggie: Yeah. So, you know, people are using keywords because, like I say, it's important: if you don't have good SEO, your news will never get read- everything else you're running on your platform will never get read. In the wake of the Sports Illustrated AI scandal, a lot of people have written some very fascinating and informative pieces about the changes that Sports Illustrated has gone through over the past, I would say, five-ish years. These were articles penned for SEO. And in large part, the AI efforts that I've seen have seemed like Band-Aids for bullet wounds. I don't think anybody in the media industry is turning to AI and saying, this is going to save the industry. I think people are turning to it, or incorporating it, as a means to bail out some of the water, essentially. And yeah, it's a very dark portrait of a media industry that's incentivized a lot of bad content- or content that's not up to the rest of a site's editorial standards, I guess is a better way to put it.

 

Jay: Absolutely. I think that's kind of the big concern right now. And a lot of people are turning to AI and seeing only horror stories like this one. I think these are important to talk about. We need these kind of red flags for what we shouldn't and can't do with these new tools. But ultimately, these tools are going to make their way into our ecosystem, and they're going to continue to do so. So where do you see some element of promise for AI and journalism? I've been talking to a couple of different innovators and such, and I think there's a lot out there, but I want to hear your honest take. Pro AI, Anti AI... Where do you see the promise here?

 

Maggie: So if there is one place that I use AI regularly in my workflow, it's transcription. It's incredible. I love using AI to transcribe interviews. It makes my life a lot easier. If I look at why: not only does it cut down so many hours of my time- and everything I transcribe, I go back and make sure it's accurate, but that's much easier than listening, writing three words, then listening and writing three more words- it's also not replacing a role. I would not have had somebody following me around being my typist as I conducted interviews either way. So it doesn't replace a job around me; it genuinely just makes my life a lot easier and cuts down a ton of hours. But right now, for me personally, that's the only place it has in my workflow. Using AI as a search engine, for example- I generally advise against that. I haven't found that it cuts down on my hours. I personally find it more worth my time to just do my own research the way I would do it otherwise, as any good reporter does, with Google- doing research in a more traditional sense. But speaking from my own experience, I just haven't found very much use for it personally, versus, again, transcription. It's incredible.

 

Jay: Absolutely. And as we've seen when it comes to text generation, it's kind of the weakest link, which is how we even got here in the first place with this story.

 

Maggie: Absolutely. And there are still so many things that it can't do from a reporter's angle. Like, ChatGPT could maybe write, but, one, that's the part of my job that I love, so I wouldn't want to give that up to a machine anyway. I love to write. That's a big reason why I do what I do, a central part of why I do what I do- I want to actively write, and I'll do that for as long as the machines let me. But then there's also the fact that a machine can't go and talk to a source for you. There's so much that a robot cannot do right now... But yeah, as it stands, with the actual news of it all- if we're going to go to the ideological fault lines- I personally still like to write the news, and I like my news to be written by humans right now.

 

Jay: And I completely understand that. And, you know, just to kind of put a pin in it, I always like to end on a question that's more prefigurative in nature- that's kind of ideating the world we want to see, right? So earlier you mentioned media literacy and the importance of it, and the ways that a lot of people at this moment don't have the tools to distinguish AI-generated from non-AI-generated content. So my question to you is: what responsibility do journalists have when it comes to creating or promoting media literacy? And what do you hope to see moving forward as AI is further incorporated into journalism?


Maggie: Yeah. Again, the advice of just do a good job and do your job- it sounds like super annoying advice, but I think that really is the heart of it on the reporter's side. And as a public in general, I think we've been slow to adapt to this fragmented media environment. It's not helping that platforms like Facebook- Meta- have turned away from news and don't want to prioritize news anymore. You see now that Facebook is flooded with AI-generated misinformation and bizarre AI images, like your random animal shares, and you just look at it and wonder: how could you possibly think that's real?... So I think that, as a public, media literacy moving forward- for democracy, not to grandstand- is the most fundamental, foundational thing we can do to prepare us for the future of news and information, which is one of the bedrocks of our society. If I had it my way, we would teach media literacy in school and make it a central part of curriculums. I think we have always done that to a degree, but I would love to see it adapted more for the modern day. And yeah, AI obviously is only getting better. Is there a world where it stalls out? Maybe. Probably not. Is there a world where the AI models all just start eating synthetic content, and then they all stop working because they're so-

 

Jay: Caught in their own feedback loops.

 

Maggie: Yeah, they're caught in their own feedback loops and everything just turns to dust, essentially. I don't know, maybe. It probably won't, though. So as it stands- I hate to make predictions, but we can only expect it to get better. And it's our job as a public to continue to educate. And then, again, there are always going to be bad actors- or not even bad actors, just people who make something they think is cool and then share it to social media. In some cases, again, it's a dog with a baby's face. But in some cases- the BBC just did a very fascinating, and horrifying, report where MAGA talking heads are generating photos of Donald Trump standing with groups of Black women that aren't real. And they're used to promote- I think that one was used to promote a Bible. But people are commenting on it and saying, God bless, and they have no idea that it's not real. So I do think, in the media, we need to continue to do the best we can with what we have. If you're using AI, disclose that. In general, if something is not real, we need to say that it's not real. And the broader collective of society- we all need to try to adapt as well.

 

Jay: But yeah, and for your average journalist, there's nothing wrong with experimentation. But experimentation needs to be grounded in science; it needs to be grounded in ethics- journalistic ethics, transparency. All of these principles that we've built the field of journalism around need to still be there, even as people are figuring out how to incorporate these new tools.

 

Maggie: Yeah, absolutely. And again, it's just our job. So we just need to do our jobs...

 

Jay: Newsflash to all the journalists out there: we have to keep doing our jobs.

 

Maggie: Yeah. I know. Sometimes I wake up and I don't want to, but... And I always want to hold onto- especially continuing to report on AI- solidarity with writers. Especially when these writers came out: the second our report came out, the same day, the union came out with a very strong statement that completely decried the content. They said it was not something they wanted to be associated with. And so I think that, as we report on AI and its use in media and what that looks like, especially in a situation like Sports Illustrated or CNET, really standing in firm, strong solidarity with writers continues to be very essential, very important for people in the media industry.

 

Jay: Absolutely. I really appreciate that last note at the end. I think that's something we need to keep in mind: the very human reporters who are also being impacted by these decisions from higher up. It's a whole web of connections.

 

Maggie: Exactly.

 

Jay: Yeah. Well, I just wanted to say thank you so much for your time today- I really appreciate it- and for hearing you talk about this story. I'm sure we're going to be following along as Futurism features and reports on this stuff more.

 

Maggie: Yeah. Thank you for having me. This is super fun.

 

[Bot Beat music swells, then settles]

 

Jay (Narration): It’s so easy for cases like this to tap into the ongoing fear in journalism about AI, and how it’s going to impact our work. But I think Maggie spoke so eloquently about our duty as reporters to use AI in an ethical way – to lean into its strengths while calling out misguided or unethical usage when we see it. For our next episode, we’ll head to the Philippines, where a tense political and news environment has created a clear need for one new AI tool.

 

Jaemark Tordecilla: And so it really has a real impact in terms of not just journalism, but also civil society and how things are run in government. And so I knew that…

 

Jay (Narration): I’m your host, Jay Kemp, and this is Bot Beat. Stay tuned and stay with me as we keep learning together.

 

[Bot Beat music swells, then fades]
