
Culture companies have forgotten how culture works

Hollywood executives have detached cultural works from cultural meaning, losing sight of the anchor of their business. They’re currently chasing the enshittification cycle down, down, down, dreaming that AI will allow them to cut all their costs (people) while pocketing even more profit because they’ll be able to produce endless “good enough” content.

Ed Zitron writes:

It’s somewhat cliché, but Hollywood is not concerned about creating interesting, or good, or unique content, but more content that can be used to make more things that can be used to make more profit to increase the stock price. It’s not about whether something’s good, or new, but whether or not it is marketable and “good enough” for consumers…

As Tim Carmody highlights, studios are barely entertainment companies anymore as they move into streaming, with the entertainment they make merely the hook for their real profit-centers. They make culture, but they value culture only insofar as it makes them money. The end game they envision is generating content for next to nothing; with an endless supply of content, everyone will find something good enough to watch, letting them maintain a vast customer base.

Towards that future, studios are self-cannibalizing their own industry by destroying career development for writers. They don’t value storytelling or recognize script-writing as a craft needing industry knowledge. As Dave Karpf writes, studios will satisfice their processes and products using AI if they can get away with it, accepting mediocre scripts as the price of profitability.

But.

Generative AI entertainment is a bet on algorithm-driven culture over people-driven

A sea of content that expansive prevents people from finding content for themselves, so it relies on serving algorithmic recommendations tailored to each viewer. So far that's worked well enough for TikTok that every other company in the entertainment / ad space has been falling over itself trying to copy it.

Except people currently decide what to read or watch in large part through word of mouth — referrals from other people they trust. People want other people, not an algorithm, to suggest what music they listen to. Even recs directly from strangers carry weight because we have shared context through whatever place we’re interacting — whether a Discord channel, a subreddit, or on social media — and humans’ impressive abilities to draw conclusions about what someone may or may not like from their twenty-word bio.

Gambling on good enough is a risky game for the studios, because people don’t recommend things they think are crap. And, speaking broadly, people won’t bother to watch things no one else watches. Social bonding and signaling that we are part of a group are big influences on what we choose to consume.

Most people on social platforms chiefly follow celebrities, which I didn't understand until someone pointed out that interest in celebrities is a facet of our species' social patterning: we orient ourselves by watching other humans and reflecting how they act. I often choose what to read based on Goodreads reviews, and as soon as I'm done reading, I'll head to Goodreads to check out the spoiler posts and see if others shared my feelings about certain parts. Interacting with other people, and learning how others interacted with a piece of culture, is an important part of the cultural experience.

Culture is person-to-person, not personalized

Right now, zeitgeist shows and movies become cultural events. After watching a(n early season) Game of Thrones episode, we would pull up Reddit and see what everyone else thought about it; I hear the "it" show is Succession these days.

Word of mouth requires a minimum audience to take off — one person is not enough for buzz. If viewership is spread across a broader and broader pool of content, there's less chance that enough people will watch any one thing. TikTok employees juice views of certain content, nudging some works to go viral — a human hand behind the algorithm.

Personalized content can’t become culture when identity is derived from shared culture

One future people keep envisioning is tailor-made entertainment. But personalized content, custom-created for each viewer, is not a market for culture. (Porn, maybe.) Culture extends beyond the media itself, to the way society interacts with it. Franchises have gotten people to buy into expressing their identity through the culture they consume; books too have become a signal of identity, part of a personal style. People can only do that if others understand what they're communicating through their selection of cultural references.

As Claire Dederer, quoted on tor.com, points out:

We now exist in a structure where we are defined, in the context of capital, by our status as consumers. This is the power that is afforded us. We respond—giddily—by making decisions about taste and asserting them. We become obsessed with this thing, mega-fans of that. We act like our preferences matter, because that is the job late capitalism has given us.

It's not enough for people merely to consume; humans apply meaning to what they consume, and act on it. Consumption has become part of culture because it's visible and gains shared meaning.

The studios are gambling that an algorithmic approach of recommending personalized dreck will beat out the way humans have chosen what to engage with culturally for as long as we've had the choice, because they don't care about the culture they are producing. They think that personalized recommendations will beat out tastemakers and human relationships. But that's been the downfall of social media companies lately: ignoring relationships (Facebook and Instagram burying our friends' posts) and cultural trendsetters (driving away the clever, funny, interesting people on Twitter and instead boosting posts by boring, unfunny Musk fanboys).

Mashup culture relies on shared meaning

We live in a time where culture converges, where everything is mashup, where shared sources and meanings are at the core of culture. Everything must appeal to everyone when success requires eternal audience growth.

Zitron highlights this gap in executive thinking:

The business minds behind these organizations treat content as a commodity that one can simply create more of and sell advertising next to, rather than something that people actually like to consume and share.

Sure, people will sit around and watch TikTok for hours, but they're sending their favs to their friends. They're making reaction videos to other people's videos, they're participating in goofy challenges, they're making Duets with strangers, they're leaving comments. TikTok is an interactive platform, whereas TV is not. What makes TV and movies social is watching them and talking about them with others — and there needs to be something there to talk about.

Book publishers are trying to ride BookTok and Bookstagram's human recommendations to sales, but that approach uses the algorithm to amplify mass media — it won't work for one-off personalized content. It only works for media that lots of people can watch.

YouTube, too, is different, because the format fosters parasocial relationships between viewers and the creators they follow. Without independent creators, a YouTube flooded with generic auto-generated material would get pretty boring pretty quick. The humans behind a video are essential to the engagement.

Cliché and franchise could probably carry AI-generated works for a while, but in the long term, mashups are only engaging when they add something new, interesting, or clever — and that takes a human hand. AI mashups are entertaining people at the moment, but they were prompted by people. We're going to quickly use up lowest-common-denominator mashups; copycats of "Balenciaga x Harry Potter" are already boring. Mashup culture moves fast, and if you want to compete in that space you need up-to-date training data for your AI.

Generative AI is cool and fun because everyone is talking about it

Right now, generative AI is fun and exciting, so the bar is low for appreciation. Soon, that will change. Tastemakers will quickly move on from generated content without personality and heart. People will stop sharing their generated content on social media if no one engages with it; already, the screencaps on social media jump from product to product without loyalty, following the hip thing everyone is talking about. DALL-E graphics gave way to Stable Diffusion graphics (iirc), which gave way to ChatGPT screenshots. A big part of the fun of generative AI is seeing how other people use it; the output is less interesting on its own, without the prompt attached.

Likewise, personalized content will be fun (if we actually get there) — but its longevity will depend on the extent to which people can share it with others and use it to signal identity. If studios follow the path towards personalized generated content, their works will become the Buzzfeed Quiz version of entertainment.

More doesn’t always mean more

A pivot to AI relies on the foundational assumption that more content will let TV and movies win the attention battle with other forms of entertainment. Instead, flooding the marketplace with content will drive people towards word-of-mouth recommendations faster. Studios have burned a lot of trust with audiences already by cancelling shows and movies for tax write-offs or to avoid paying residuals; it’s a lot to then ask viewers to trust that shows written by generative AI will be worth their time. Right now viewers can’t even trust a show they liked will get a second season.

Studios will presumably still buy visibility for selected works, increasing the odds that enough people will give a show a try for it to take off, but that too depends on human selection: deciding which shows or movies are worth granting an ad budget.

In the end, because studios still need audiences to “buy” their cultural products by watching them or subscribing to their platform, there is a limit to how bad those cultural products can be. Technology loves to “disrupt” things, but the oral, shared, trust-based foundations of culture are a tough sell. Only time will tell if generative AI really is good enough.

By Tracy Durnell

Writer and designer in the Seattle area. Freelance sustainability consultant. Reach me at tracy.durnell@gmail.com. She/her.

One reply on “Culture companies have forgotten how culture works”

The studios are betting wrong here. Invention of the gas engine (which moves faster than any human does) didn’t make marathons any less interesting or sport any less of a performance. Deeper Blue beat Kasparov decades ago and computers are better at chess than any human can be, but we still watch humans playing chess.
