The rapid integration of artificial intelligence into every sector of society has created a surreal state of journalism in 2023. In these early days, everyone is fumbling in the dark when it comes to AI-generated content. A multitude of outlets, this site included, have experimented with AI-generated content. Conversely, major sites have injected code that blocks OpenAI's GPTBot crawler from scraping their content. Simply put, the debate over AI-generated content is only just beginning.
However, Sports Illustrated, which spent decades building its reputation on reporting, long-form journalism and 70 years as an industry leader, took liberties with artificial intelligence that went well beyond current media standards. By trying to avoid the aforementioned debate, they burned their own reputation.
Forty years ago, venerable journalist George Plimpton wrote the infamous Sports Illustrated April Fool's Day article telling the story of fictional Mets prospect Sidd Finch and his legendary pitching feats. Now imagine if SI's current management, The Arena Group, had gone to great lengths to hide that Plimpton was not a living, breathing human being and that the story they published was written by an artificial intelligence constantly learning and training on intellectual property produced by organic beings.
Well, that’s an approximation of what The Arena Group did in conjuring the bylines of fugazi writers such as “Drew Ortiz” and “Sora Tanaka”. For months, they passed off AI stories as content written by authors with made-up bios, and swapped them out for other fictitious authors with made-up bios to avoid detection. Their biographies, according to Futurism, read like the kind of generic, carefree idiots the AI probably imagines humans to be. Ortiz’s bio describes him as an outdoorsman, “excited to guide you through his endless list of the best products to keep you from falling into the perils of nature.”
Meanwhile, Tanaka “has always been a fitness guru and loves trying different foods and drinks.” The Arena Group did the same with TheStreet bylines, posting expert writers who were not only fictional but also dispensing bad personal finance advice. I’m surprised they didn’t find the time to dig up screenshots of Mina Kimes discussing rail monopolies on C-SPAN to gain their readers’ trust. This whole operation was the AI content-generation analogue of the Steve Buscemi undercover-cop-infiltrating-a-high-school meme. “How do you do, fellow humans?”
Much like Sidd Finch, it turns out that Ortiz and Tanaka are fictional identities fabricated by The Arena Group to create the illusion of a flesh-and-blood writing team. As part of its efforts, The Arena Group purchased images for its fictional writers from an AI portrait marketplace, which is worrying in itself. I don’t know what the legal precedent is for AI portraits that closely resemble public figures, but Luka Doncic should definitely call his lawyers, because prominent botwriter Drew Ortiz looks a lot like the Mavs star.
AI-generated content is already pretty unpopular, but it isn’t inherently unethical. However, it certainly shouldn’t be done behind a veil of secrecy or a second-rate Luka. If driverless vehicle technology advanced to the point where companies began competing with human taxi or Uber drivers, passengers would want the choice of who they ride with and whom they support. AI-generated content is the untested media driverless car cruising these Google-run streets. The Arena Group is like a reckless transportation company trying to fool its riders into thinking their driver is human. It sounds stranger than fiction, but those are the times we live in.
But it went beyond zany, unprofessional execution. Once the scheme was exposed and Futurism requested comment, The Arena Group launched a cartoonish, misleading cover-up, attempting to remove most of the content generated by its fictional writers.
The entire industry is still trying to navigate this cutting-edge terrain, but bylines still denote credibility, or a lack thereof. How are readers supposed to discern what is what and trust the Fourth Estate if the media embraces the idea of misleading its readers about the provenance of its content? People want to know whether they’re reading Albert Breer or an amalgam of Internet voices designed to sound like him. All The Arena Group has done is sow distrust among its readers by engaging in dishonest practices. No good can come of that, especially at a time when the industry is facing uncertainty and attacks from outside influences.
Monday evening, Variety reported that The Arena Group had ended its partnership with AdVon Commerce, the third-party vendor that provided the branded content. But who knows how far it would have gone without human reporting? SI Swimsuit Issue cover models generated by AI? On second thought, maybe I shouldn’t give them any ideas, since future AI-generated writers might be mining this for material.
Follow DJ Dunson on X: @cerebralsportex