Corporations are excited to stop paying writers and designers and artists and actors and models and musicians and videographers — even developers. They can’t wait to make movies and games and TV shows with as few employees as possible. They are salivating over their profit margins when they can eliminate their “overhead” of employees.
Individuals are excited to create ‘free’ ‘art’ without investing time or effort into developing a skill or style. Their ideas deserve to exist, and they’ll use whatever tools allow that.
Both corporations and generative AI enthusiasts feel entitled to use others’ work without permission or pay, for their own profit. They can’t afford or don’t want to pay for art or professional writing, but they’ve found a technical way to take it anyway.
This is rooted in devaluing creative labor and wanting to mechanize production: corporations perceive creativity as a quantifiable output that they can reproduce on demand with these new tools. They cannot fathom there’s something humans contribute that they can’t reproduce through technology. To them, creativity can be distilled to data. Hard, clear, ownable.
Creative endeavors are less formulaic than many other types of products — there’s no recipe guaranteed to make a blockbuster game or movie — so using AI makes it feel like corporate is in control of the process. It feels lower risk to lean on average outcomes from AI than hope for greatness from your creative team. Relying on AI cuts out human personality and opinions and relationships, which can slow down the process of production, never mind humans’ physical needs and limitations. With AI, there is no creative disagreement, just manufacturing the product. It does what you tell it to, nothing more or less.
Even without AI, that profit-optimized, risk-averse perspective on creative work has turned culture boring and flat. It turns out that you still need taste to decide what’s worth producing and marketing — a perspective? talent? skill? that will be in even higher demand when execs are wallowing in a quagmire of material and need to decide which ideas to invest their money in actually making. Creators know that ideas are the easy part: the execution is what matters. Can AI pull off that execution consistently and emotively to create cultural works that resonate and sell?
The dream is that they can use others’ stolen words and paintings and illustrations to create intellectual property they can make money off of without having to pay anyone else for it.
The dream is that corporations will take full control of cultural capital without cultural creations stagnating in the absence of future training data, or that they’ll be able to keep stealing others’ intellectual property as training data forever…and that the creative industry won’t collapse without clients and commissions so there will be future work available to steal.
The dream is that *they* can stop paying anyone to work for *them*, as will every other company that can get away with it, but that enough people will still have enough money to spend on their creations — even as companies shut down entire industries or deskill labor so workers can be paid much less and replaced easily.
The dream is that they can flood the market with endless generated works and people won’t get fed up with drowning in oceans of mediocre, inaccurate content and switch to smaller, human-centered networks where they can get trusted information from other people.
The dream is that in the end, quantity matters more than quality.
The dream is that shortcuts work.
Liked WGA strike 2023: Hollywood’s writers walked off the job. What happens now? by Alissa Wilkinson (Vox)
Teamsters don’t fuck around, no way they’re crossing a picket line.
WGA On Strike – union website
The proposed terms and counters (via)
Per that doc, here’s what writers are asking for on AI:
Preventing their work from becoming training data is key — which is exactly why the studios counter-offered “annual meetings to discuss advancements in technology”: they absolutely do not want to give up the ability to train the writers’ AI replacements.
But the writers can’t give up, because at stake is the entire profession of screenwriting as a viable career instead of gig work revising AI output:
The writers got screwed on the switch to streaming. They are not going to make the mistake of underestimating the risk of technology again.
See also:
The dream of AI is the dream of free labor
Mining intellectual value
How can our economy shift to better support people and the planet?
Hollywood executives have detached cultural works from cultural meaning, losing sight of the anchor of their business. They’re currently chasing the enshittification cycle down, down, down, dreaming that AI will allow them to cut all their costs (people) while pocketing even more profit because they’ll be able to produce endless “good enough” content.
Ed Zitron writes:
As Tim Carmody highlights, studios are barely entertainment companies anymore as they move into streaming, with the entertainment they make merely the hook for their real profit-centers. They make culture, but they value culture only insofar as it makes them money. The end game they envision is generating content for next to nothing; with an endless supply of content, everyone will find something good enough to watch, letting them maintain a vast customer base.
Towards that future, studios are self-cannibalizing their own industry by destroying career development for writers. They don’t value storytelling or recognize script-writing as a craft needing industry knowledge. As Dave Karpf writes, studios will satisfice their processes and products using AI if they can get away with it, accepting mediocre scripts as the price of profitability.
But.
Generative AI entertainment is a bet on algorithm-driven culture over people-driven culture
A sea of content that expansive prevents people from finding content for themselves, so it’s reliant on serving algorithmic recommendations tailored to each viewer. So far that’s worked well enough for TikTok that every other company in the entertainment / ad space has been falling over itself trying to copy them.
Except people currently decide what to read or watch in large part through word of mouth — referrals from other people they trust. People want other people, not an algorithm, to suggest what music they listen to. Even recs directly from strangers carry weight because we have shared context through whatever place we’re interacting — whether a Discord channel, a subreddit, or on social media — and humans’ impressive abilities to draw conclusions about what someone may or may not like from their twenty-word bio.
Gambling on good enough is a risky game for the studios, because people don’t recommend things they think are crap. And, speaking broadly, people won’t bother to watch things no one else watches. Social bonding and signaling that we are part of a group are big influences on what we choose to consume.
Most people on social platforms chiefly follow celebrities, which I didn’t understand until someone pointed out that interest in celebrities is a facet of our species’ social habit of watching other humans and mirroring how they act. I often choose what to read based on Goodreads reviews, and as soon as I’m done reading, I’ll head to Goodreads to check out the spoiler posts and see if others shared my feelings about certain parts. Interacting with other people, and learning how others interacted with a piece of culture, is an important part of the cultural experience.
Culture is person-to-person, not personalized
Right now, zeitgeist shows and movies become cultural events. After watching a(n early season) Game of Thrones episode, we would pull up Reddit and see what everyone else thought about it; I hear the it show is Succession these days.
Word of mouth requires a minimum audience to take off — one person is not enough for buzz. If viewership is distributed between a broader and broader pool of content, there’s less chance enough people will watch something. TikTok employees juice views of certain content, influencing some works to go viral — a human hand behind the algorithm.
Personalized content can’t become culture when identity is derived from shared culture
One future people keep envisioning is tailor-made entertainment. But personalized content, custom-created for you, is not a market for culture. (Porn, maybe.) Culture extends beyond the media itself to the way society interacts with it. Franchises have gotten people to buy into expressing their identity through the culture they consume; books too have become a signal of identity, part of a personal style. People can only do that if others understand what they’re communicating through their selection of cultural references.
As Claire Dederer, quoted on tor.com, points out:
It’s not enough for people simply to consume; humans apply meaning, and act on it. Consumption has become part of culture because it’s visible and gains shared meaning.
The studios are gambling that an algorithmic approach of recommending personalized dreck will beat out the way humans have chosen what to engage with culturally for as long as it’s been possible, because they don’t care about the culture they are producing. They think that personalized recommendations will beat out tastemakers and human relationships. But that’s been the downfall of social media companies lately: ignoring relationships (Facebook and Instagram burying our friends’ posts) and cultural trendsetters (driving away the clever, funny, interesting people on Twitter while boosting posts by boring, unfunny Musk fanboys).
Mashup culture relies on shared meaning
We live in a time where culture converges, where everything is mashup, where shared sources and meanings are at the core of culture. Everything must appeal to everyone when success requires eternal audience growth.
Zitron highlights this gap in executive thinking:
Sure, people will sit around and watch TikTok for hours, but they’re sending their favs to their friends. They’re making reaction videos to other people’s videos, they’re participating in goofy challenges, they’re making Duets with strangers, they’re leaving comments. This is an interactive platform, where TV is not. What makes TV and movies social is watching them and talking about them with others — and there needs to be something there to talk about.
Book publishers are trying to ride BookTok and Bookstagram’s human recommendations to sales, but this uses the algorithm to amplify mass media — it won’t work for unique content. It only works for media that lots of people can watch.
YouTube, too, is different, because the format fosters parasocial relationships between viewers and the creators they follow. Without independent creators — flooded instead with generic auto-generated material — YouTube would get pretty boring pretty quickly. The humans behind a video are essential to the engagement.
Cliche and franchise could probably carry AI-generated works for a while, but in the long term, mashups are only engaging when they add something new, interesting, or clever — and that takes a human hand. AI mashups are entertaining people at the moment, but they were prompted by people. We’re going to quickly use up lowest-common-denominator mashups; copycats of “Balenciaga x Harry Potter” are already boring. Mashup culture moves fast, and if you want to compete in that space you need up-to-date training data for your AI.
Generative AI is cool and fun because everyone is talking about it
Right now, generative AI is fun and exciting, so the bar is low for appreciation. Soon, that will change. Tastemakers will quickly move on from generated content without personality and heart. People will stop sharing their generated content on social media if no one engages with it; already, the screencaps on social media jump from product to product without loyalty, following the hip thing everyone is talking about. DALL-E graphics gave way to Stable Diffusion graphics (iirc), which gave way to ChatGPT screenshots. A big part of the fun of generative AI is seeing how other people use it; the output is less interesting on its own, without the prompt attached.
Likewise, personalized content will be fun (if we actually get there) — but its longevity will depend on the extent that people can share it with others and use it to signal identity. If studios follow the path towards personalized generated content, their works will become the Buzzfeed Quiz version of entertainment.
More doesn’t always mean more
A pivot to AI relies on the foundational assumption that more content will let TV and movies win the attention battle with other forms of entertainment. Instead, flooding the marketplace with content will drive people towards word-of-mouth recommendations faster. Studios have burned a lot of trust with audiences already by cancelling shows and movies for tax write-offs or to avoid paying residuals; it’s a lot to then ask viewers to trust that shows written by generative AI will be worth their time. Right now viewers can’t even trust a show they liked will get a second season.
Studios will presumably still buy visibility for selected works, increasing the odds enough people will give it a try for it to take off, but that too is dependent on human selection: which shows or movies are worth granting an ad budget.
In the end, because studios still need audiences to “buy” their cultural products by watching them or subscribing to their platform, there is a limit to how bad those cultural products can be. Technology loves to “disrupt” things, but the oral, shared, trust-based foundations of culture are a tough sell. Only time will tell if generative AI really is good enough.
Liked AI machines aren’t ‘hallucinating’. But their makers are by Naomi Klein (The Guardian)
“This is effectively the greatest art heist in history.” — open letter co-authored by Molly Crabapple
“This whole “this is how humans learn so whats the difference” thing while stealing so much data to make billions for a few dudes is so insidious.” — Timnit Gebru @timnitGebru@dair-community.social
See also: Link pairing: AI trained on stolen art
See also: On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜
On the “hallucination” that AI will solve climate change better than humans can:
We know what we need to do about climate change — we lack the will to do it.
It’s not an information deficit problem. We don’t need more ideas, we need to implement the things we know will work. Corporations just don’t like that answer; Don’t Look Up was painfully on the nose. But pretending that we’ll magic our way out of more emissions with technology that hasn’t been invented yet is an excuse not to change now (*whisper* plus AI needs a lot of resources too).
Funny how “disruption” is often code for “provide the same service under market value, exploiting uncompetitive business practices with the goal of creating a monopoly” 🤔 Netflix and the streaming industry are pulling the same stunt.
See also: Who does AI work for?
The dream of AI is the dream of free labor
UBI is a society-level failsafe for its people
(I realized recently that I don’t talk about Universal Basic Income (UBI) enough — I mentioned it to my mom the other day and she’d never heard of it 🥺 So, if you haven’t encountered the idea of UBI before, I encourage you to read a bit about it!
It is American society’s choice to allow children to go hungry to punish their parents, and to drive children into labor as soon as possible, but we could change our minds. Personally, I’m less worried about freeloaders than the kids who currently don’t have food and people who’ve lost a place to live because we make it so hard to qualify for assistance and offer all-or-nothing help that keeps people in poverty.)
Replied to AI statement by Neil Clarke (neil-clarke.com)
👏
Another thing I’m scared of happening is that EULAs will essentially require you to consent to the material you create with a product being used as training data. I’m particularly concerned about Microsoft, as creator of the largest office software suite and a heavy investor in very-expensive-to-operate generative AI — to justify a ten-billion-dollar spend, they’re going to want everyone using it all the time. I’m scared that future personal versions of Word (probably not enterprise, to protect corporations) will require agreement that anything you write in it will be used to train its AI tools — that they’ll bill it as data necessary to provide users with ‘quality generative tools.’
Microsoft has lost a vast amount of respect from me lately, and not just over my husband getting laid off — the way they approached the layoffs was poor, and the way they’re treating the remaining staff is disrespectful. They’re throwing fortunes into generative AI and investing in fucking fusion while telling staff they won’t get raises and bonus budgets are down, and breaking their promises by continuing layoffs past the date they’d given staff. Not enough people left the company on their own, so they’re trying to drive away more: no severance to pay that way 🤷‍♀️ And if employees stay because there’s a glut of tech talent right now thanks to the widespread industry layoffs, well, that’s just market conditions — how could poor, poor Microsoft possibly afford to offer their employees raises with inflation like this? 🥾🤑
By going all-in on generative AI, Microsoft is basically telling me:
They want to fire as many of their own staff as possible, as soon as possible
They want to make it easier for other companies using their software to fire as many staff as possible
They dgaf about diversity, inclusion, or anything of that nature, given the bias baked into generative AI through the current approach of vacuuming up the maximum amount of data, then filtering out only the most blatant racism and bias
They also dgaf about the environment or climate change because AI is a massive water hog and energy suck (hence the fusion hail mary) — and the bottom is falling out of the shitty carbon offsets industry
See also:
Wage stagnation vs corporate profit
Mining intellectual value
I just realized that generative AI pushes the same buttons for me as Roy Lichtenstein (fuck that guy): an elite using the work of the plebs to enrich himself.
Lichtenstein claimed his works, which reproduced panels from comic books at bigger scale with minor changes, were fair use. “Lowbrow” works created by or for the working class exist to serve the elite’s needs; elites, whether in tech or art, feel entitled to the works of those “beneath” them because they believe what they create using it is more valuable than the original works. Comic art is not respected in the fine art world / by the elites; his works were “fine art” while the reference material was commercial pulp.
Likewise, corporations (and society in general) don’t value writing or art or craftsmanship, so they’re 100% on board with stealing the intellectual property of millions to make a product designed to put those same people out of a job. Generative AI models could not exist without training data; the “feedstock” of other people’s creations is integral to the production of generative AI software. Every new version of ChatGPT is better because it’s been trained on more unlicensed, unauthorized training data, used without permission or compensation.
Replied to The Curse of Recursion: Training on Generated Data Makes Models Forget (arXiv.org)
Replied to Climate change is death by a thousand cuts by Andrew Dessler (The Climate Brink)
Replied to Digital sharecropping by Nicholas Carr (roughtype.com)
Liked Everything Easy is Hard Again (frankchimero.com)
Liked Stop Using AI-Generated Images by Michelle Barker (CSS { In Real Life })