
Communications of the ACM

BLOG@CACM

Twiddler: Configurability for Me, but Not for Thee



Credit: NK Guy, nkguy.com

Originally published on Medium.com

Tracking Exposed is a scrappy European nonprofit that tries to understand how online recommendation algorithms work. They combine data from volunteers who install a browser plugin with data gathered through "headless browsers" to reverse-engineer the principles that determine what you see when you visit or search TikTok, Amazon, YouTube, Facebook, or Pornhub.

At first blush, that might seem like a motley collection of services, but they have one unifying principle: they are all "multi-sided" marketplaces, in which advertisers, suppliers and customers are introduced to one another by a platform operator who takes a commission for facilitating their transactions.

Amazon introduces sellers to buyers, helps the former ship to the latter, and places ads alongside search results and product pages. TikTok, YouTube, and Pornhub all do the same, but with performers and media companies who are introduced to viewers, and with advertisers whose ads are inserted at different points in the chain. Facebook brokers the display of material from a mix of professionals (artists, performers, media companies) and individuals (friends, family, members of an online community or interest group).

This kind of "platform" business isn't unusual. A big grocery chain sells its own products and products from third-party sellers, and does a brisk sideline in "co-op" — charging suppliers to place items at eye height or in end-caps at the ends of the aisles.

But online platform businesses have a distinctly more abusive and sinister character. To a one, they follow the "enshittification" pattern: "first, they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves."

Why are digital businesses more prone to this conduct than their brick-and-mortar cousins? One answer is tech exceptionalism: namely, that tech founders are evil wizards, uniquely evil and uniquely brilliant and thus able to pull off breathtakingly wicked acts of sorcery that keep us all in their thrall.

"Tech exceptionalism" is a charge that is more usually leveled at technology boosters, but it can just as easily be aimed at technology critics, who commit the sin of criti-hype by credulously repeating tech barons' claims of incredible prowess in their criticism: "Look at these evil sorcerers who have 'hacked our dopamine loops' and taken away our free will!"

There's another, simpler explanation for the enshittification of platform economics. Rather than trusting the self-serving narratives of the Prodigal Techbros who claim to have superhuman powers but promise that they have stopped using them for evil, we can adopt a more plausible worldview: that tech barons are ordinary mediocrities, no better and no worse than the monopolists that preceded them, and any differences come down to affordances in technology and regulation, not an especially wicked brilliance.

 

Tech exceptionalism is a sin, but digital is different.

The shell-games that platform owners play with surpluses, clawing them back from one group and temporarily allocating them to another, are not a unique feature of digital platforms — every business has dabbled with hiding costs from purchasers (think of "junk fees") and shafting suppliers (e.g., "reverse factoring").

The difference lies in the ease with which these tricks can be tried and discarded. The faster the shells move in the shell-game, the harder it is to track the pea.

If you're an analog grocer changing the prices of eggs, you have to send minimum-wage teenagers racing around the store with pricing guns to sticker over the old prices.

If you're Amazon Fresh, you just twiddle a dial on a digital control panel and all the prices change instantaneously.

A platform operator can effortlessly change the distribution of surpluses in an instant, while suppliers and customers have to engage in minute, time-consuming and unreliable Platform Kremlinology just to detect these changes, much less understand them.

 

There is nothing intrinsically wicked about two-sided marketplaces or other "intermediaries" who serve as brokers between consumers and suppliers.

When I was a kid in Toronto, I frequently ran into Crad Kilodney, a notorious "street author" who wrote, printed, bound, and sold his books all on his own.

He sold his books from street corners where he stood for long hours, wearing a sign that said "Very Famous Canadian Author — Buy My Books" or "Margaret Atwood" (Atwood later memorialized Kilodney by standing at one of his usual spots with a sign around her neck reading "No Name Canadian Author").

Kilodney was one-of-a-kind, and I can still quote many of his stories and poems from memory, but even he didn't think that every writer should have to follow in his footsteps.

There are plenty of writers with interesting things to say who are unwilling or unable to print, bind, and sell their words directly to readers from a frozen street-corner.

The problem isn't the existence of intermediaries — it's how much power the Internet gives to intermediaries.

 

That power starts with twiddling those sliders and knobs that change search results, pricing, recommendations, and other rules of the platform.

Online performers know this well. If you're a YouTuber or a TikToker, you invest money and time into producing material for the platform, but you can't know whether the platform will show it to anyone — even the subscribers who explicitly asked to see it! — until you hit publish.

For an online creator, the platform is a boss who docks every paycheck and tells you that you're being punished for breaking rules that your boss refuses to explain, lest you figure out how to violate them without him noticing.

Part of Tracking Exposed's remit is to unravel these secret rules so that creative workers can avoid their bosses' hidden penalties. These secret rules were behind the #audiblegate scandal, in which Amazon stole hundreds of millions of dollars from independent audiobook creators who used its Audiobook Creation Exchange (ACX) platform to post their work.

Amazon hid the fact that it was clawing back royalties, withholding payments, and flat-out lying about its royalty structure. The key to hiding these financial crimes from Amazon's victims was velocity, the ability to change accounting practices from minute to minute or even second to second, allowing Amazon to stay one step ahead of the writers it stole from.

It's not just creative workers who get ripped off by digital platforms, of course. The "gig economy" is rife with these practices. Companies like DoorDash want to criminalize tools that let drivers see how much a job will pay before they commit to it. Uber is a notorious twiddler of the driver-compensation knobs, exploiting the ease of changing pay structures to stay one step ahead of drivers. Sometimes, Uber overreaches and finds itself on the wrong end of a wage-theft investigation, but for every twiddle that draws a state Attorney General's attention, there are dozens of smaller twiddles that slide under the radar.

Twiddling allows platforms to rip off all kinds of suppliers — not just individual workers.

For independent sellers, Amazon's twiddling has piled junk fee upon junk fee, so that today, Amazon's fees account for the majority of the price of goods on Amazon Marketplace.

Advertisers and publishers are also on the wrong side of twiddling. The FTC's lawsuit against Facebook and the DoJ's antitrust case against Google are both full of eye-watering examples of high-speed shell-games where twiddling the knobs resulted in nearly undetectable frauds that ripped off both sides of the adtech market (publishers and advertisers) to the benefit of the tech companies (for an excellent breakdown of how Google twiddled the adtech market, check out Dina Srinivasan's appearance on the Capitalisn't podcast).

 

Twiddling is the means by which enshittification is accomplished. The early critique of Airbnb concerned how the company was converting every city's rental housing stock to unlicensed hotel rooms, worsening the already dire, worldwide housing crisis. Those concerns remain today, of course, but they've been joined by outrage over enshittifying twiddling, where homeowners are being hit by confusing compensation rules, and responding by imposing junk fees on renters.

Undisciplined by competition or regulation, the platforms can't keep their fingers off the knobs.

Remember when Facebook conducted its infamous voter turnout experiment? Sixty-one million Facebook users were exposed to a stimulus the company predicted would increase voter turnout.

The resulting controversy was an all-too-typical exercise in tech criticism, in which both sides completely missed the point. Facebook's defenders pointed out that this kind of experiment was a daily occurrence for Facebook's knob-twiddlers, who adjusted the platform's rules all the time.

Rather than focusing on what a fucking nightmare it is for 3,000,000,000 people to be locked into having their social lives mediated by tech bros who couldn't stop twiddling the knobs, the critics of the Facebook experiment focused on the result.

It was textbook criti-hype. The Facebook experiment increased voter turnout by 280,000, which sounds like an impressive figure. But the effect size is only 0.4% (remember, the experimental group had 61 million users!).

Rather than focusing on how badly Facebook's ads perform (and how advertisers are getting overcharged), or how the company's compulsive twiddling changes the rules constantly for tens of millions of users at a time, critics of the Facebook voter turnout experiment instead promoted Facebook's ad-tech market by repeating Facebook's hype around this unimpressive result.

 

There's a bitter irony in enshittification: the Internet's great promise was disintermediation, but the calcified, monopolized internet of "five giant websites, each filled with screenshots of the other four" is a place where intermediaries have taken over the entire supply chain.

As Douglas Rushkoff puts it, the platforms have "gone meta" — rather than providing goods or services, they have devoted themselves to sitting between people who provide goods and services and people who want to consume them. It's chokepoint capitalism, a market where the intermediaries have ceased serving as facilitators and now run the show.

 

The double irony is how the platforms seized power: by installing so many sliders and knobs in the back-end of their services that they can twiddle away any temporary advantage that business customers, advertisers, or end-users take for themselves.

The early Internet promised more than disintermediation — it also promised endless configurability, where users and technologists could install after-market code that altered the functioning of the services they relied on, seizing the means of computation to tilt the balance of power to their benefit.

Technology remains intrinsically configurable, of course. The only kind of computer we know how to build is the universal, Turing-complete von Neumann machine, which can run all the software we know how to write.

That's how we got things like ad-blockers, the largest boycott in world history. The configurability of technology is why things like free and open software are politically important: in a technologically mediated society, control over the functions of the technology you rely on is control over every part of your life — your job, your education, your love life, your political engagement.

While it remains technically possible to reconfigure the technologies that you rely on, doing so is now a legal minefield. "IP" has come to mean "any law that lets a company control the conduct of its competitors, critics, or customers," and that's why "IP" is always at the heart of maneuvers to block platform users' attempts to wrestle value away from the platforms.

When Facebook wants to stop you from reading your friends' posts without being spied on, it uses IP law. When Facebook wants to stop you from tracking paid political disinformation, it uses IP law. When Facebook wants to stop you tracking the use of Facebook in fomenting genocide, it uses IP law. When Facebook wants to stop you from re-ordering your feed to prioritize posts from your friends, it uses IP law.

The platforms don't just twiddle with every hour that God sends, they also hoard the twiddling — twiddling is for platform owners, not platform users.

 

The enshittification of the Internet has three interlocking causes:

  1. Platforms were able to create vertical monopolies by buying their competitors and suppliers, so users have nowhere to go;
  2. Platforms were able to block regulation that would give users more power, and encourage regulation that prevents new companies from entering the market and competing for users by giving them a better deal;
  3. Platforms were able to twiddle their own rules constantly, staying ahead of attempts by business customers (performers, media companies, marketplace sellers, advertisers) and end-users to claim more value for themselves.

To unwind enshittification, we need to throw all three of these mechanisms into reverse:

Digital tools could be a labor organizer's best friend. They could give users and device owners more flexibility and bargaining power than their offline predecessors.

As has been the case since the Luddite uprisings, the most important question isn't what the technology does, it's who it does it for and who it does it to.

 

The trick is to create rules that are both administratable and easy to comply with.

One challenge for regulating platforms is that they are complex and opaque. To a first approximation, everyone who understands Facebook works for Facebook (this also used to be true of Twitter, but today it's more likely that everyone who understands how Twitter works is a bitter ex-employee who is only too eager to puncture the company's bullshit, which opens up some tantalizing regulatory possibilities).

That means that when Facebook seems to be cheating, it will be hard to prove. It could take years to get to the bottom of seeming rule violations. For a rule to work effectively, it should be easy to figure out if it's being obeyed.

The other dimension to pay attention to is compliance costs. A regulation that is so expensive to comply with that it prevents small companies from entering the market does monopolists a favor by clearing the field of potential competitors before they can grow to be a threat.

That's what happened in 2019, when the EU proposed mandatory copyright filters to prevent infringement on big platforms like YouTube and Facebook, as a way of shifting power from the platform operators to the media companies that relied on them.

In the end, YouTube and Facebook supported the proposal. This may seem paradoxical, but it makes more sense once you realize that YouTube had already spent $100,000,000 on its Content ID filter system, so any new regulation that forces new companies to have enough money to build their own filters is a bargain. If the table stakes for hosting content in the EU start at $100,000,000, YouTube and Facebook can sew up the market without worrying about upstarts coming along and offering a better deal to creators.

Today, the EU's filter rules — and other intermediary rules that assume the Internet will always be dominated by a handful of giants, like rules requiring services to scan for harmful content, extremism, and hate speech — present a significant challenge to the spread of the Fediverse, which seeks to replace giant, twiddle-addled multinational corporations with human-scale services run by small businesses, co-ops, volunteers, and nonprofits.

Thankfully, operating a server is much safer in the USA, thanks in large part to Section 230 of the Communications Decency Act, which is often erroneously smeared as a gift to Big Tech, but which really protects the small fry who are often a better deal for platform users, and who are in any event unable to lock their users in when they want to offer a worse one (that's why Mark Zuckerberg wants to get rid of Section 230).

 

You may have heard that "Nathan," the volunteer operator of mastodon.lol, a server with 12,000+ users, has announced that he is shutting down his server because he doesn't want to deal with the acrimony over the new Harry Potter game.

This may seem like a serious problem with replacing Big Tech with small tech — what happens if you rely on a server whose owner turns out to have different interests from your own, leaving you stranded?

This is a question that many Big Tech users have had to grapple with, of course, thanks to Twitter's takeover by a mercurial, insecure manbaby who is bent on speedrunning the enshittification cycle.

The reality is that mastodon.lol's 12,000 users are much better situated than the 450,000,000 who were reliant on Twitter prior to the takeover. Mastodon is designed to prevent lock-in: users can easily export the list of people they follow and the list of people who follow them, and import those lists onto a new server. With just four steps, a Mastodon user — including a user of mastodon.lol — can leave a server, set up on a new one, and keep all the connections they depend on.
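To make that concrete, here is a minimal sketch of the "export your follows" step, done against Mastodon's public REST API rather than the one-click export on Mastodon's own settings page. The server address, access token, and output filename are placeholders, and a real migration would export followers, blocks, and mutes the same way:

    # Sketch only: export the accounts you follow from a Mastodon server.
    # SERVER and TOKEN are placeholders; the same export is one click in
    # Mastodon's Settings > "Import and export" page.
    import csv
    import requests

    SERVER = "https://mastodon.example"   # hypothetical instance
    TOKEN = "YOUR_ACCESS_TOKEN"           # a read-scoped token from your account settings
    HEADERS = {"Authorization": f"Bearer {TOKEN}"}

    # Look up our own account id.
    me = requests.get(f"{SERVER}/api/v1/accounts/verify_credentials",
                      headers=HEADERS, timeout=30).json()

    # Page through everyone we follow (Mastodon paginates via the Link header).
    follows = []
    url = f"{SERVER}/api/v1/accounts/{me['id']}/following"
    while url:
        resp = requests.get(url, headers=HEADERS, timeout=30)
        follows.extend(resp.json())
        url = resp.links.get("next", {}).get("url")

    # Write a CSV of full account addresses that another server can import.
    domain = SERVER.split("//", 1)[1]
    with open("following_accounts.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Account address"])
        for account in follows:
            addr = account["acct"]
            if "@" not in addr:           # accounts local to SERVER omit the domain
                addr = f"{addr}@{domain}"
            writer.writerow([addr])

The import half is the mirror image: upload that CSV on the new server, and it reaches out and re-follows everyone on the list.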

This is so straightforward, so useful, so resistant to enshittification, such a great check against excessive twiddling, that we could even make it a regulation:

If you operate a server, you have an obligation to give any user — including a user you kick off the server — their data, including the data they need to get set up on another server.

That's a rule that's both easy to administer and easy to comply with.

It's easy to tell if the rule is being followed. If one of Nathan's 12,000 mastodon.lol refugees claims that they haven't been given their data, Nathan can disprove the claim by sending them a fresh copy of that data.

That's a rule that Nathan — and every other Mastodon server operator, small or large — can comply with, without being unduly burdened. All Nathan needs to do is not switch off the export function already built into Mastodon, and save users' data for a reasonable amount of time (say, 12 months) after he winds down his service so that he can provide it to users who didn't snag their data before he pulled the plug.

This is a rule that could be imposed on big services just as readily as on small ones. If we ordered Twitter to allow users to move freely from Twitter to the Fediverse — either as part of a new regulation, or as a settlement in one of the many enforcement actions that have been triggered by Twitter's reckless, lawless actions under Musk — we could easily tell whether Twitter was abiding by the rule. What's more, adding support for an open standard — ActivityPub, which underpins Mastodon — to Twitter is a straightforward technical exercise.
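For a sense of how small the standard's core is, here is an illustrative sketch — not Twitter's or Mastodon's actual code — of the minimal ActivityPub "actor" document a server publishes for each user. The domain and username are placeholders; the spec requires little more than an id, a type, and inbox/outbox endpoints, though real-world federation with Mastodon also layers WebFinger discovery and HTTP-signature keys on top of this:

    # Illustrative only: the shape of a minimal ActivityPub actor document.
    import json

    actor = {
        "@context": "https://www.w3.org/ns/activitystreams",
        "id": "https://example.com/users/alice",
        "type": "Person",
        "preferredUsername": "alice",
        "inbox": "https://example.com/users/alice/inbox",    # where other servers POST activities
        "outbox": "https://example.com/users/alice/outbox",  # where this user's posts are listed
    }

    # A server would return this JSON (served with the
    # "application/activity+json" content type) when the actor URL is fetched.
    print(json.dumps(actor, indent=2))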

Enshrining this Freedom Of Exit into platform governance accomplishes many of the goals that our existing content regulations seek to attain. Rather than protecting users from hate speech or arbitrary disconnection by crisply defining the boundaries of both and building a corporate civil justice system to hear disputes, we could just let users leave when they disagree with the calls that companies make, and provide them with an easy way to set up somewhere else when a platform kicks them off.

That is, rather than making platform owners better, or more responsible, we could just make them less important.

The goal isn't no intermediaries, it's better ones, and easy movement from bad ones to better ones. The problem isn't that platforms do some twiddling — that's how they get better as well as how they get worse — but if platform users can't twiddle back and if they can't leave, they'll get twiddled to death.

 

Cory Doctorow (craphound.com) is a science fiction author, activist and journalist. He is the author of many books, most recently RADICALIZED and WALKAWAY, science fiction for adults; CHOKEPOINT CAPITALISM, nonfiction about monopoly and creative labor markets; IN REAL LIFE, a graphic novel; and the picture book POESY THE MONSTER SLAYER. His latest novel is ATTACK SURFACE, a standalone adult sequel to LITTLE BROTHER. In 2020, he was inducted into the Canadian Science Fiction and Fantasy Hall of Fame.


 
