Can AI Get Us All Talking Again?
The Books3 debacle, the WGA draft agreement, and an open letter to the algorithm
Yes, I know. We’re so tired of hearing about more technocrat shenanigans to add to the artificial intelligence (AI) dumpster fire. The fatigue is real.
It’s what Meta, EleutherAI, OpenAI, Google, and Bloomberg (yep, they’re into AI now) are counting on each time they leak and mishandle private data, break security for users, and violate individual freedoms.
It’s exhausting, but it’s also . . . galvanizing. You can’t scroll through your phone without coming across comments, reflections, and helpful public service announcements about AI. It’s brought together writers on social media, whether in forums, threads, spaces, or chats. For the Talking Writing team, at least, it fired us up enough to send out this newsletter today.
Could AI bring us together like the aliens did in 1996’s Independence Day?
Authors have definitely lost any shiny optimism that AI could be anything but a power grab for technocrats. After all, controlling and manufacturing information at light speed and scale means taking control of the world.
This summer, The Atlantic broke the story on the latest AI crime spree to hit authors of all stripes and genres: Big Tech’s systematic piracy of books used to train and power their large language models. It was called the Books3 dataset (subtle).
As of this writing, the database has been taken down, but not before authors searched it and took screenshots of their names alongside the lists of their books scraped without permission. Reader, this is just one of the countless datasets that AI corporations have amassed by now.
For the sake of helping others get up to speed, the Talking Writing team put together a few helpful reading resources:
There are so many layers to this story that The Atlantic had to devote a series to show just how rampant it is.
Five days ago, the Authors Guild published an article summarizing what authors can do, however basic, to fight back and protect themselves from further data scraping and piracy.
On the same day, it was reported that the tentative deal that ended the Writers Guild of America (WGA) strike included clear language on the limits of AI. Studios can’t use it to write or rewrite literary material. However, the deal doesn’t protect writers from studios training AI on their work.
We wanted to balance this list with a response or statement from the AI corporate bosses. There aren’t any.
We can’t decide if this new level of blatantness is a good thing. It may make it easier for us to take action against their next crimes (there will be a next time). But does their silence mean arrogant stupidity? Or are they complacent, knowing that no one can (really) go after them?
So . . . What Can We Do?
As always, the essential work happens behind the scenes, largely unseen. And there’s something we can do about it.
Support the Authors Guild’s class-action lawsuit against OpenAI, Meta, and Google. They’re pursuing protections for all writers, so even if we aren’t named individually as plaintiffs, a ruling in their favor will be a win for all writers. Donate here or join as a member.
Sign the open letter to CEOs of AI companies.
Subscribe to our Substack (button below) or donate so that we can keep talking and writing (and not just about AI, we promise).
Palate Cleanser
In the spirit of these if-you-don’t-laugh-you-cry moments, here are some recommended stacks to cap things off nicely:
John Vogel goes straight to the one who calls the shots in “Open Letter to the Algorithms.”
I’m writing to you because as I send out emails to writers, podcasters, editors, magazines, playlist creators, etc., I realize that you’re more influential than the human influencers anyway, so I should just talk to you. You decide what to play next after the last song has ended. You make the recommendations based on purchases and searches.
Why not appeal to you directly?
Where do we sign, John?
Martha Nichols throws a righteous tantrum in “When AIs Talk Like Humans.”
But right now, I need to rave on my own in written words—chaotically, emotionally, like a female human who resents my need to communicate getting turned into “help.” I’m feeling bullied, forced to accept something I don’t want, that I keep saying no to. I don’t want to carry on these simple tests, especially not with the thing—THING—speaking back to me.