Enshittification, p.10


  The most common tactic used to flout regulation is to break the law with an app and then insist that the law hasn’t been broken at all, because the crime was committed with an app.

  Sometimes literally (as Uber does when it argues that it’s not an employer because it directs its workers with an app) and sometimes figuratively: apps can obfuscate what’s really going on, sloshing a coat of complexity over a business that allows its owners to claim that they’re not breaking the law. (“It’s not an illegal unregulated hotel, it’s an Airbnb!”)

  Riley Quinn, showrunner for the excellent Trashfuture podcast, says that whenever you hear the word fintech (financial technology), you should mentally substitute unregulated bank.

  App-based lending platforms ignore usury law and say it doesn’t count because they do it with an app. Cryptocurrency hustlers illegally trade in unregistered securities and say it doesn’t count because they do it with an app.

  When Uber entered the taxi market without securing taxi licenses or extending the workforce protections required under law, it said the move didn’t count because it did it with an app.

  The McDonald’s-backed company Plexure sells surveillance data on you to vendors, who use it to raise the price of items when they think you’ll pay more. In its promotional materials, Plexure uses the example of charging extra for your breakfast sandwich on payday. It says that such practices are not a rip-off because they’re done with an app.

  RealPage gives “recommendations” to landlords about the minimum rents they should charge for all the apartments in your neighborhood, raising rents and worsening the housing crisis. The company says it’s not price-fixing because it’s done with an app.

  On the subject of the housing crisis, Airbnb is racing to convert all the rental stock in your city into an unlicensed hotel room, but it says the conversion doesn’t count because it’s done with an app.

  As you’ll see later in this book (page 147), the legal regime for apps really is different from the rules governing web pages. Thanks to intellectual property laws that ban “circumvention,” companies that embed undesirable anti-features in apps can use the law to destroy rivals that disenshittify their offerings.

  In other words, tech companies don’t stop with “It’s not a crime if we do it with an app.” They also say, “It’s a crime if you fix our app to defend yourself from our crimes.”

  It’s Not Wage Theft If We Do It with an App: Uber’s Algorithmic Wage Discrimination

  Platforms mediate between business customers and end users. For Amazon, that’s sellers and buyers. For Uber, the business customers are drivers and the end users are riders. Like any other company that relies on a digital system, Uber can twiddle the knobs in its servers to make things worse for customers (like charging you more during busy times). But of course, the platform can also twiddle the knobs for its business customers: the drivers.

  Every time an Uber driver is offered a job, the wage for that job—dollars per mile and minute—is recalculated by an algorithm at Uber HQ. The goal of this algorithm is to lower the wage of Uber drivers. It works spookily well.

  The legal scholar Veena Dubal coined the term algorithmic wage discrimination to describe Uber’s labor-pricing tactic. Dubal came to understand algorithmic wage discrimination through her ethnographic work with rideshare drivers. She reports that drivers divide themselves into two groups: ants (who take every job the app offers) and pickers (who are selective, cherry-picking the jobs with the biggest upside). When Uber’s algorithm offers a job to all the drivers in your neighborhood, it calculates a different wage for each driver based on that driver’s recent behavior. If the driver has recently been picky, the system will offer a higher wage than it will to a driver who has been more ant-like. Pickers who take the bait will then be offered slightly lower wages in the future. The increments and frequency of these pay cuts are randomized so that it’s hard for drivers to recognize that they’re being squeezed, and of course, if a driver balks at a job and gets a little picky, then the wage starts to titrate up again.
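  The dynamic Dubal describes can be captured in a toy model. This is purely an illustrative sketch of the mechanism as described above, not Uber’s actual system; every name and number in it is hypothetical.

```python
import random

class WageTitrator:
    """Toy model of algorithmic wage discrimination: each driver gets a
    personalized per-mile offer that drifts down when they accept
    everything (ant-like behavior) and back up when they balk (picker
    behavior). Rates and increments are invented for illustration."""

    def __init__(self, base_rate=1.00, floor=0.60):
        self.base_rate = base_rate   # hypothetical starting dollars/mile
        self.floor = floor           # lowest offer the model will make
        self.offers = {}             # driver_id -> current offered rate

    def offer(self, driver_id):
        # New drivers start at the base rate.
        return self.offers.setdefault(driver_id, self.base_rate)

    def record_response(self, driver_id, accepted):
        rate = self.offer(driver_id)
        if accepted:
            # Ant-like: shave the next offer by a small, randomized
            # increment, so the squeeze is hard to notice.
            rate -= random.uniform(0.01, 0.05)
        else:
            # Picky: titrate the offer back up to win the driver back.
            rate += random.uniform(0.03, 0.08)
        # Clamp between the floor and the base rate.
        self.offers[driver_id] = min(max(rate, self.floor), self.base_rate)

t = WageTitrator()
for _ in range(20):
    t.record_response("ant", accepted=True)      # takes every job
    t.record_response("picker", accepted=False)  # balks at every job

# The indiscriminate driver ends up offered less than the picky one.
assert t.offer("ant") < t.offer("picker")
```

  The point of the sketch is how little machinery the trick requires: a per-driver rate, a randomized nudge, and a clamp. As the text notes, the trick is simple; it is the speed and tirelessness of its application that makes it effective.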

  The Uber algorithm isn’t doing anything particularly clever here: this isn’t a mind-control ray that’s bypassing the drivers’ critical faculties and tricking them into giving up their other part-time jobs in favor of a lower-waged grind for Uber. The trick is simple, but it’s performed quickly and tirelessly.

  If algorithmic wage discrimination is ringing a bell for you, perhaps you’re thinking of when Facebook tricked publishers into posting longer and longer excerpts from their websites to Facebook, culminating in the full substitution of Facebook for their own independent, stand-alone web presences.

  Facebook did the same thing to publishers that Uber does to drivers. Most publishers who observed their posts garnering fewer reads on Facebook would poke around and find publishers that had solved this problem by being more generous with the length of the excerpts they posted to Facebook. But some publishers balked, worried that they’d be giving away the store if they posted longer teasers to Facebook. When that happened, Facebook’s algorithm—which determined whether a publisher’s subscribers would see the post, as well as whether it would be “recommended” in the feeds of users who hadn’t subscribed to the publisher’s account—would sometimes goose the publisher’s numbers, showing an old post or two to lots of readers, some of whom would click through to the publisher’s site. This was quite a convincer. Publishers that worried about Facebook replacing their website now saw that Facebook was actually sending tons of traffic to the site, and they posted longer, more frequent excerpts to Facebook.

  This is the same game that Uber plays with drivers, dribbling a few crumbs when a business customer—a driver this time, rather than a publisher—decides the juice isn’t worth the squeeze.

  Lots of platforms do this. In January 2023, Forbes reporter Emily Baker-White revealed the existence of TikTok’s “heating tool,” a secret back-end feature that TikTok strategists used to lure different kinds of creators onto the platform.

  Facebook started off with a default feed composed of accounts that you followed and then added in the odd recommendation. By contrast, TikTok went out of the gate with an algorithmic feed, called the For You feed. By accessing commercial surveillance dossiers on new users and then surveilling them further as they used the app, TikTok’s content recommendation algorithm was able to make really good guesses about which videos a user was likely to enjoy. TikTok does offer users the ability to follow other users on the platform, but the algorithmic feed is so central to how TikTok works that most users treat subscribing to an account as a way of hinting to the algorithm that they want more videos like the ones that account posts, not as a way of saying “Just show me what the people I follow are posting.”

  The social contract with TikTok, then, is that it will spy on you, but it will use that surveillance data to fill your feed with things whose existence you hadn’t suspected but that you find endlessly fascinating.

  The heating tool violates this contract. Sometimes a TikTok strategist will decide to woo a specific performer or kind of performer in a bid to get them to retool for TikTok. (TikTok’s idiosyncratic format and conventions make it hard for creators to make videos that work well on TikTok and on rival platforms, which means that TikTok has a mass of TikTok-first/TikTok-optimized performers.)

  The strategist identifies an account they wish to entice, and then applies the heating tool to that account. The heating tool pushes that performer’s content into millions of users’ feeds, irrespective of whether the content recommendation system would have “organically” recommended it.

  So, if TikTok decides there aren’t enough sports bros making content for the platform, a strategist can pick a random bro and make him king for the day, shoving his latest video into tens of millions of users’ feeds. The sports bro doesn’t know this—he just knows that he’s gone TikTok-viral, and whatever system he has for converting attention to money (supplements, sponsorship, etc.) is now ringing up gigantic profits. That sports bro declares himself to be the Louis Pasteur of TikTok. He trumpets his victory to other TikTok-curious sports bros and boasts of the unlimited riches waiting to be claimed by the bold sports bro who optimizes his video format for the platform.

  I call this the “giant teddy bear gambit.” When I was a kid, my family used to go to the Canadian National Exhibition, an annual fair with a traveling midway. The CNE opens in Toronto every summer around August 15 (my mother’s birthday) and closes on Labour Day (when we would all march with my parents’ unions in the Labour Day Parade, which ended inside the CNE, with free admission for all the marchers).

  The midway is lined with carnies roping for various games of skill that seem like they should be easy but turn out to be nigh impossible to win, like tossing five balls into a peach basket. But if you go down to the CNE, on any given day you’ll see some poor guy lugging around the kind of gigantic teddy bear you win only if you get all five balls in the basket.

  Now, that guy didn’t actually get five balls in the basket. To a first approximation, no one has ever gotten five balls in the basket. What’s happened instead is that a carny has flagged down a likely looking mark early in the day and said, “Tell you what, sir, since I like your face, I’m gonna make you a deal. You get just one ball in the basket, and I’ll give you one of these key chains. You get two key chains and I’ll let you trade ’em in for this giant teddy bear.” Of course, the carny’s not in the business of giving away giant teddy bears, but he understands that by dooming that poor sap to lugging around a teddy bear as big as he is all day long, through the muggy heat of Toronto in August, he will create an advertisement for his unwinnable peach-basket ball game.

  TikTok’s heating tool is a way for TikTok strategists to hand out giant teddy bears—and to take them back again. After all, TikTok users will tolerate only a certain amount of artificially promoted, irrelevant nonsense in their feeds, so if a TikTok strategist is satisfied that there is a sufficiency of sports bros locked in to the platform, it can withdraw the heat from its chosen sports bro and apply it to some astrologer, making her not only a queen for a day but also a Judas goat for other astrologer-influencers.

  The giant teddy bear gambit is one of the most powerful forms of twiddling. It allows Uber to keep its algorithmic wage discrimination machine humming smoothly. As Veena Dubal has documented, the forums frequented by Uber drivers are full of posts from drivers who are certain that they are “good at Uber,” who boast of the giant salaries they bring home from driving. Dubal’s ethnographic work includes heartbreaking interviews with drivers who drive until they can’t keep their eyes open, sleep in their cars, get back on the road—and then blame themselves for the pittances they take home. They don’t understand that their indiscriminate desire to please the algorithm by taking every ride that comes their way is actually signaling that they are easy pickings and can be enticed into driving for sub-starvation wages.

  Tech leaders point to this stuff and call it innovation. It’s more accurate to call it obfuscation. Except for the level of indirection introduced by the presence of an app, the algorithmic wage discrimination gambit would trigger labor law enforcement.

  The idea of titrating wages to employees’ desperation is hardly novel. Bosses since time immemorial have exploited bargaining power over workers to depress their wages. If you’re an employer and you know that all the other employers in your town are sexist jerks who won’t hire women (or racist jerks who won’t hire Black people), why not offer that perfect job candidate half the wage of her male (or white) colleagues?

  But the opportunities for pre–digital era bosses to suppress workers’ wages were few and far between, and comparatively crude. Wages were set when workers were hired, and adjusted only once or twice per year, thanks in large part to the impracticality of doing the paperwork to track a workforce’s ever-changing salaries without computers.

  The blackhearted coal bosses in old Tennessee Ernie Ford songs might have dreamed of changing coal miners’ pay from instant to instant based on how desperate the miners were, but not even the greediest coal boss was willing to fill a warehouse with all the accountants in green eyeshades needed to make those adjustments in paper ledgers.

  The point is that while algorithmic wage discrimination isn’t an innovative new way of doing business—and like every other shell game is just a simple trick done quickly—the fact that it’s done with an app lets the modern blackhearted coal bosses claim that they’re not violating labor law.

  Reverse-Centaurs and Chickenization

  In automation theory, workers are “centaurs” if they have some kind of tool that lets them do more than they could do on their own. For example, if your boss invests in bossware like Microsoft Office 365 that counts your keystrokes and mouse movements and takes note of every link you click, then they are a centaur, accomplishing more than any human boss could, thanks to the software that automates nearly all of that work.

  Not every centaur is as sinister as a boss who spies on you with cloud software. A farmer with a tractor is a centaur, and so is a cashier whose register automatically adds up the groceries and calculates your change.

  There’s more than one way to partner a human with a machine. A reverse-centaur is a machine that uses a human to accomplish more than the machine could manage on its own.

  Take Amazon delivery drivers: Amazon outsources much of its delivery to third parties it calls Delivery Service Partners (DSPs). DSPs are typically manic entrepreneur types, bamboozled by Amazon’s promises of “being your own boss” while “partnering” with one of the most powerful, most profitable companies in world history.

  DSPs are responsible for buying a fleet of vans and kitting them out to Amazon’s exacting standards: not just an Amazon-branded paint job but also a bewildering array of sensors that track the van through time and space, noting sudden maneuvers and recording traffic around the vehicle. The sensors go inside the van, too, where AI-equipped cameras constantly monitor the drivers, down to the motions of their eyeballs and mouths.

  The drivers whose eyeballs are being so lovingly observed aren’t Amazon employees: they’re payrolled by the DSP. But in every regard save this one, Amazon is the boss. Amazon’s software tells the drivers what route to drive, sets (impossible) quotas for deliveries, and demands the dismissal of drivers who don’t live up to the standards it sets.

  This is why the roads leading to Amazon depots are littered with sealed bottles of human urine. There is no way for drivers to meet quota and keep their jobs if they’re stopping to pee, so, caught between a kidney stone and a hard place, they pee in bottles in the van, tightly screwing on the lids afterward (something you don’t forget to do twice). Amazon doesn’t like the bad press this generates, so it has ordered DSP operators to search returning vans for pee bottles, with punishments for drivers caught with evidence of having indulged the inescapable human need to eliminate their bodies’ waste, forcing the drivers to huck them out the window on the way back to the depot.

  In 2023, the UK prankster Oobah Butler harvested some of that urine, rebottled it, and offered it for sale on Amazon as Release Energy, a “bitter lemon drink.”

  Butler did the stunt for his documentary The Great Amazon Heist, which aired on the British public broadcaster Channel 4. In the special, Butler shows how his bottled piss snagged the favor of the Amazon recommendation algorithm, rising to become the top-selling “bitter lemon drink” on the platform.

  This took quite a lot of doing on Amazon’s part: Butler had listed Release Energy as a “refillable pump dispenser” (he didn’t have the proper food and drink licensing paperwork needed to list it as a beverage for human consumption), but Amazon thoughtfully shifted it into the “correct” category, and even more thoughtfully failed to ask for the paperwork showing that it was fit to drink.

  Not all of Amazon’s attention was automated. Once the bottled driver piss hit the top of the Amazon sales chart, an Amazon rep actually phoned up Butler to pitch him on using Amazon for his shipping and fulfillment.

  Butler never actually shipped a bottle of urine to a stranger. The only people who got a real bottle of Release Energy in the mail were Butler’s pals, who were in on the joke.

  Amazon never figured out the gag—at least, not until the documentary aired. Then it wheeled out a poor PR droid named James Drummond to recite boilerplate about how Amazon has “industry-leading tools to prevent genuinely unsafe products being listed.”

  When your drivers are pushed so hard by automation that they can’t even urinate, they aren’t centaurs, whose work is supercharged by high-tech tools. They are reverse-centaurs, humans who are used as inconvenient, fallible meat-puppets for a robot that demands superhuman feats of them, working them in ways the human body literally can’t withstand, until they are used up and discarded, and then replaced with other humans.

  (A leaked 2021 Amazon internal research memo warned that the company was burning out warehouse workers so quickly that it was in danger of using up every single eligible worker in the United States.)

  But Amazon isn’t merely in the business of turning its workers into reverse-centaurs. Before it does that, it chickenizes them.

  In labor economics, chickenization refers to a set of particularly ghastly labor practices originating in the poultry industry (hence the name). Poultry packing—like so many other sectors—is a cartel, with three companies dominating, each running in its own exclusive territory, meaning that farmers typically have no choice but to use the one packer available to them.

  Like Uber drivers, chicken farmers are classed as entrepreneurs who work for themselves—and like Uber drivers, chicken farmers are monitored and controlled by their true employer to a degree that exceeds all but the most abusive workplaces. And, as with Uber, these chicken platform owners claim they’re just brokers, not bosses, and have no obligations to the farmers who are their de facto employees.

 
