Automation seems poised to take over a lot of industries. With the advent of artificial intelligence — more commonly referred to as “A.I.” — we’re already seeing monumental shifts in the world of media.
Ten years ago, “A.I. art” was pure science fiction. But today, you can go to any number of websites, type in an intricate, detailed request, and in a matter of seconds a photorealistic, virtually created image pops up. The technology is getting more advanced every day. You can find scores of A.I.-generated songs on YouTube that replicate the voices of iconic singers to an eerie degree. And it’s only a matter of time until full-length feature films are produced through nothing but A.I. prompts.
Inevitably, legislators will have to step in at some point. As A.I. advances, serious questions about privacy rights are bound to arise. What’s stopping someone from using A.I. to create a video of you doing something illegal or obscene, or using A.I. to mimic your voice to say all sorts of horrible and offensive things?
Those capabilities aren’t far away. They’re already here. Indeed, anyone with an internet connection — no matter how young — can easily access software that digitally removes clothing from people.
Rest assured, a day will come when “that wasn’t me in the video, it was artificial intelligence” becomes an actual legal defense used in a real-world court of law. It’s no longer a question of if, but how soon.
On a broader social level, the question posed by A.I. is a grave one. Are we destined to live in a culture where we cannot tell if any piece of media — any photo or video or audio or text — is authentic or simulated?
Recently, Sports Illustrated came under scrutiny for publishing several articles generated by A.I. But the practice is already common. In fact, some big-name newspapers have been using “bots” to produce high school football stories for years now.
A lot of people in the world of journalism are terrified by that. And understandably so. But are we truly on track to see living, breathing reporters replaced by robotic algorithms?
A.I. journalism does exist, but if you’ve ever read an article produced that way, you probably walked away with the same opinion I have.
Wow, is this stuff garbage.
Let’s go back to those artificially written high school football stories. The articles themselves are barely a few paragraphs long. And not only are they littered with grammatical errors and factual inaccuracies (erroneous final scores, for example), but seemingly half the time they don’t even get the names of the schools right.
Those articles don’t include the names of any players, comments from the coaches, summaries of scoring drives or even what day the game was played. Basically, all the program does is take numbers from a scoreboard, wrap a bunch of clichéd platitudes around them and call it good.
Now imagine that same program “writing” about murder trials. Or mass shootings. Or tax exemption hearings. Talk about a recipe for disaster — if not a surefire libel suit or two.
A.I. programs are all engineered differently, but as the tech stands today, the bots simply trawl the internet for preexisting content on a specific topic. Then they cobble together bits and pieces of that data and try to pass the result off as original reporting.
Never mind that the programs lack the cognitive ability to decipher whether something on the internet is true or false. Or fact or opinion. Or that they don’t know how to properly cite sources. The fundamental flaw in the oxymoron that is “A.I. journalism” is apparent: unless the “news” has already been written by somebody else and posted on the web, the programs don’t know about it.
And last I checked, there aren’t a whole lot of internet stories out there about local government meetings, local school functions, local court cases, local business openings and local sports written by anybody other than local newspapers.
Of course, technology makes incredible leaps. The run-of-the-mill household goods we don’t even give a second thought to are the kinds of things people 20 years ago literally could not fathom. Perhaps A.I. will get advanced enough to cover number-heavy worldwide events, like election outcomes or NFL games, in the not-too-distant future.
But there’s one thing A.I., as “intelligent” as it may be, isn’t: inquisitive.
Let me know when A.I. is able to ask hard-hitting questions about municipal waste contracts and file open-records requests pertaining to PFAS contamination. Until then, I don’t think we flesh-and-blood journalists have much to worry about.
James Swift is the managing editor of the Dalton Daily Citizen.