…what we are witnessing is the wealthiest companies in history (Microsoft, Apple, Google, Meta, Amazon …) unilaterally seizing the sum total of human knowledge that exists in digital, scrapable form and walling it off inside proprietary products, many of which will take direct aim at the humans whose lifetime of labor trained the machines without giving permission or consent.
“This is effectively the greatest art heist in history.” — open letter co-authored by Molly Crabapple
“This whole ‘this is how humans learn so what’s the difference’ thing while stealing so much data to make billions for a few dudes is so insidious.” — Timnit Gebru @timnitGebru@dair-community.social
If we cannot come up with ways for A.I. to reduce the concentration of wealth, then I’d say it’s hard to argue that A.I. is a neutral technology, let alone a beneficial one.
Today, we find ourselves in a situation in which technology has become conflated with capitalism, which has in turn become conflated with the very notion of progress. If you try to criticize capitalism, you are accused of opposing both technology and progress. But what does progress even mean, if it doesn’t include better lives for people who work?
Talking about AI is talking about the future of work is talking about the future of society.
Meta CEO Mark Zuckerberg declared this the Year of Efficiency for the company.
It was time for them to buckle down and get leaner, get flatter, and get more optimized… Efficiency initiatives are all about doing the same (or more) with less.
And while sometimes that can be done purely through technology, *humans* often bear the brunt of efficiency initiatives.
Work intensification happens on two levels. First, there’s the amount and pace of work. In the case of layoffs and the euphemistic “restructuring,” that’s literally making up for the work that used to be done by one’s former colleagues by adding it to the remaining employees’ workloads. Second, there’s the type of work being done and its emotional or cognitive load.
Hard work, long hours, real commitment—that’s the recipe for moving forward. But it’s not as though it’s a temporary sacrifice for those who remain.
Waiting for the remaining workers at Meta and Amazon to unionize 😎 Not that my union was much help to me, but at least I had someone on my side.
The home is no longer seen as a space of personal expression or comfort, or as the backdrop of everyday life, but primarily as an investment and as an asset—meaning that enforcing one’s aesthetics is a financially detrimental decision.
If you see yourself in your space, it reinforces your identity, your sense of self. It helps you go out into the world feeling grounded and confident. When you don’t see yourself reflected in your space, a disconnect can happen — you feel like you’re living in someone else’s house.
There is a version of the world taking shape in a few very strange and rarified minds, minds so coddled by wealth and an almost Galtian removal from the travails of the masses that their existence borders on fairy magic … and that vision involves precisely none of us. […] It is a world in which we are simply not needed; content is created by AI, animated and voiced by AI, promoted and distributed by algorithms, consumed by automated subscriptions and mandatory pay-to-play purchases, pinned and pushed to the top of feeds, shunted into media ecosystems where a computer-generated, computer-voiced, computer-written Ellen exclaims with delight over an animated child-script programmed to perfectly perform a piano sonata, tracked and fed back into the algorithm in an infinite loop, bugs patched and code updated by AI, and, very possibly, actual organic human creations shoved to the bottom of the digital heap as inefficient, sloppy, and insufficiently vertically integrated.
I find it pretty interesting that when most other advancements in automation have arrived, the sales pitch has usually involved describing ways in which it will improve the lives of everyday people as a kind of sugary treat to drown out the taste of a dystopian future. […] But with ChatGPT, literally the first thing I heard about it was a Reddit donkey-chorus of HA HA WHITE COLLARS ARE ALL REPLACED GET FUCKED. […] Which tells me, however fun a toy people are finding it to be, or however much no one likes writing their own cover letters or school essays, ChatGPT isn’t being sold to us directly at all, but to our potential employers in lieu of us.
Merely training on autocomplete has led to beta-AGIs that can outperform humans across a huge host of tasks, at least in terms of speed and breadth. This means the space of general intelligences is likely really large, since it was so easy to access. Basically some of the first things we tried led to it. That’s bad! That’s really bad!
This indicates that there may be creatable minds located far away from all the little eight billion points stacked on top of each other, things much more intelligent than us and impossible to predict.
And what is more dangerous? The atom bomb, or a single entity significantly more intelligent than any human?
*What Works: A Comprehensive Framework to Change the Way We Approach Goal-Setting* is not really a book about goal-setting. It’s not a book about achieving anything. It’s a systematic deconstruction of the stories that keep us hustling, striving, and always looking for more. It’s also a guide for reconstructing an approach to personal growth, planning, and productivity once we’ve shed those stories.
Loved this! So much writing about work doesn’t acknowledge the pressures of the system we are in, and how those can influence our priorities and practices in ways that are unhealthy and unfulfilling. This was a full excoriation of the effects of toxic individualism, capitalism, and the Puritan work ethic on our approach to productivity and goal-setting. It offers a framework for digging into the psychological barriers to making progress on what really matters to us, and both recognizing and resisting the draw of conformity to these systems.
“I want to help give structure and meaning to growth based on curiosity instead of achievement.”
“Every day is an opportunity to practice satisfaction rather than striving.”
And, before we dive deeper, a reminder that the very nature of the market is this: the exploitation and dehumanization of all of us for as much profit as can possibly be extracted from us. Preferably it could be done painlessly and with a smile, but the inherent philosophy at the heart of the process harbors deep, dark authoritarian energies that will come into full focus as soon as situations demand it.
There is an abhorrent logic to it. If adults aren’t going to accept these low-paying, backbreaking, soul-crushing jobs, and if they’re going to continue agitating for labor unions and better treatment, then somebody’s going to have to show up.
The GOP’s continued assault on teachers as “groomers” and “indoctrinators” is about destroying public education, but in due time that will switch to also rationalizing why children would be “better off” laboring rather than being “subjected to wokeness.”
👀
I don’t think it’s solely about destroying education, though that is one aspect. Part is about demonizing “the other” and creating in/out groups to turn against, especially conflating liberalism with queerness, which they also hate and fear. Part is about vilifying intellectual pursuits and devaluing critical thinking. And part is preventing kids from learning information that conflicts with their controlling doctrines.
They will wage not only culture war but also generational war, claiming degeneracy and decay demand a return to “traditional values,” including the reappearance of young people in the workplace, where they might learn the value of a dollar and the need for hard work.
I’m wary of reading too much stuff like this in case it’s alarmist reverse fearmongering, but I kinda don’t think it is — so learning to recognize and anticipate the behavior patterns of authoritarians is important 🫤
I started working at 14 and wish I hadn’t. Wish I’d given myself a few more years before I started squeezing myself into the mold of ideal worker. Our school system does enough of this already: teaching to the test, quashing curiosity, forcing kids to follow a schedule that doesn’t suit their bodies.
I recall a day I got in trouble for not coming to work on the school paper after track. I was seventeen. I’d been at school from 8 to 3:30 then practice till 6 or 6:30. I was exhausted, physically and mentally, and had homework to do plus saxophone practice. By the time I finished dinner I figured they’d be winding down and there was no need for me to go back, but apparently they worked till midnight. I “should have” gone back and worked another five or six hours.
Except in retrospect, maybe we shouldn’t ask kids to put in 15-16 hour days — for extracurricular activities or for paid work. That means recognizing that children’s work is learning, both the skills and curriculum of their classes, as well as how to be people. Supplanting kids’ free time with labor prioritizes their value as workers over their wellbeing as people.
It’s all part of a hustle for success mindset that, at least for me, started with high school, when I was 13. My parents didn’t push me, but society was a strong influence. You won’t get into a good college if you don’t do extracurriculars or score well on standardized tests. If you make mistakes, if you’re anything less than perfect, you’ll be a failure. I’m still working on purging toxic perfectionism from my system in my late thirties. And I wish I could have let myself enjoy being a kid a little longer.
In most communities, we have a box that we sleep in, a box we drive to the office or school in, and then, once we’re there, a box to work or study in… These places are often devoid of any ornamentation, idiosyncratic details, or contextual elements that would ground them in a specific community.
Our buildings and places symbolize what we value. They tell the story of who we are.
But what about when we don’t know who we are?
I suspect there’s a connection between the loss of Place-making and the dissolution of community ties.
Suddenly, it was “computer says no” everywhere you turned, unless everything matched perfectly. There was a global rush for legal name-changes after 9/11 – not because people changed their names, but because people needed to perform the bureaucratic ritual necessary to have the name they’d used all along be recognized in these new, brittle, ambiguity-incinerating machines.
Digital precision
We encounter this problem often in the digital world, in things like content-limited text fields and binary choices on forms (or limited options that always drive us to “other”).
The digital world demands exactitude in a way the analog world doesn’t. I recall my dad, a TV station electrician, explaining the difference between analog and digital signals to me as a kid; I couldn’t understand why the squared shape of a digital signal — either you get it or you don’t — would win out over the more flexible analog signal, which degrades gracefully, giving you a lower-quality signal rather than none at all.
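To make that “either you get it or you don’t” cliff concrete, here’s a minimal sketch of a noisy channel (my own toy illustration, not anything from my dad’s explanation): the thresholded digital decode is flawless up to a point, then collapses all at once, whereas an analog receiver would show the same noise as gradually worsening static.

```python
import random

# A toy "channel": send bits as voltages 0.0 / 1.0 plus Gaussian noise.
def transmit(bits, noise_level):
    return [b + random.gauss(0, noise_level) for b in bits]

# Digital decoding: threshold at 0.5, so every sample snaps back to a clean 0 or 1.
def digital_decode(samples):
    return [1 if s >= 0.5 else 0 for s in samples]

random.seed(42)
message = [random.randint(0, 1) for _ in range(1000)]

for noise in (0.1, 0.3, 0.6):
    decoded = digital_decode(transmit(message, noise))
    errors = sum(a != b for a, b in zip(message, decoded))
    print(f"noise={noise}: {errors} bit errors out of 1000")

# At low noise the decode is perfect; past a certain noise level, errors
# pile up fast and the signal is unusable: you get it, or you don't.
# An analog receiver sees the same noise as gradually increasing static,
# a degraded picture instead of no picture.
```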
That inherent precision of digital information also influences the way we think about data. We read numbers as more meaningful than they are:
Excel results calculated out to four decimal places falsely imply a confidence the input data can’t support (a quick sketch of this follows below).
Recipes call for a specific baking time, even though everyone’s oven is a little different and environmental conditions affect baking time by changing the moisture content of the ingredients.
Ad metrics, pageview data, and likes that don’t truly translate to reach, brand recognition, or conversions. (Like Internet celebs with millions of followers getting book deals that don’t translate to sales.)
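As a toy illustration of that false precision (my own example, with made-up numbers), dividing two rough measurements produces a figure that looks far more exact than its inputs justify:

```python
# Two rough measurements, each easily off by a few percent
# (hypothetical numbers, purely for illustration).
clicks = 1234
visitors = 5678

rate = clicks / visitors
print(f"Click-through rate: {rate:.4%}")  # prints 21.7330%, which looks authoritative

# Propagating a modest 3% uncertainty in each input shows how little
# those trailing digits actually mean.
low = (clicks * 0.97) / (visitors * 1.03)
high = (clicks * 1.03) / (visitors * 0.97)
print(f"Plausible range: {low:.1%} to {high:.1%}")  # roughly 20.5% to 23.1%
```

The four decimal places in the first result are an artifact of the formatting, not of the measurement; the honest answer is the range.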
I was one of about sixty operators. Most of us were poets and writers with MFAs, but there were also PhDs in performance studies and comparative literature, as well as a number of opera singers, another demographic evidently well suited for chatbot impersonation—or, I suppose, for impersonating a chatbot that’s impersonating a person.
Let alone the present.
Each day when we reported for work one of them would hail us with a camp counselor’s greeting. “Top of the morning, my lovely Brendas!” they would say. Below their message, a garden of reaction emojis would bloom.
I am tired of the exploitation and undervaluation of emotional labor.
In the same way that algorithms tell us what they think we want, and do so with such tenacity that the imagined wants become actual, these buildings seemed intent on shaping a tenant’s aspirations. They seemed to tell the tenant they should not care about regional particularities or the idea of a neighborhood. The tenant should not even desire a home in the traditional sense, with hand-me-down furniture, hand-built improvements, and layers of multigenerational memory. This tenant was a renter for life, whose workplace was their primary address, and who would nevertheless be unable to afford property for as long as they lived.
Brenda, they claimed, said the same thing to everyone, which meant that she was incapable of bias. And yet she was awfully good at repelling certain people: people without smartphones or reliable internet, people unaccustomed to texting, people who couldn’t read or write in English, and people who needed to figure out if they could access a property before showing up for a tour. Brenda deflected them all with polite violence. She was not a concierge but a bouncer, one made all the more sinister for her congeniality and sparkle.
But the working conditions of data labelers reveal a darker part of that picture: that for all its glamor, AI often relies on hidden human labor in the Global South that can often be damaging and exploitative.
The work’s traumatic nature eventually led Sama to cancel all its work for OpenAI in February 2022, eight months earlier than planned.
An OpenAI spokesperson said in a statement that the company did not issue any productivity targets, and that Sama was responsible for managing the payment and mental health provisions for employees.
🙄 Of course they’re not responsible for the work they hired out.
Conditions for vendors are so much worse than for employees, so of course that’s the direction companies want to move: cheaper labor that they aren’t liable for. Ethics has no part in corporatism.
“They’re impressive, but ChatGPT and other generative models are not magic – they rely on massive supply chains of human labor and scraped data, much of which is unattributed and used without consent,” Andrew Strait, an AI ethicist, recently wrote on Twitter. “These are serious, foundational problems that I do not see OpenAI addressing.”