We Are Lab Rats in an Experiment that Nobody Controls - Issue #6
On the Netflix documentary "The Social Dilemma" and our subservience to the technology we created.
In 1966, in the inaugural issue of the journal Computers and the Humanities, Louis T. Milic, one of the earliest practitioners of what would become known as Digital Humanities, wrote an essay called “The Next Step” in which he imagined a “new age” when humanists would collaborate with computers. The essay is remarkably prophetic, foreseeing an academic field that was still decades in the making. Milic acknowledges, “We do not yet understand the true nature of the computer,” and his main concern is the way computation might affect how scholars think. He insists that the humanities scholar must understand the inner workings of the machine in order to use it. He writes, “Control comes from understanding, from a fusion of the user and the instrument, like the arm and the saber, the rider and the mount.” The scholar must master the computer’s logic; Milic insists, “Its intelligence and ours must be made complementary not antagonist or subservient to each other.”
I thought of this essay recently as I watched the Netflix documentary “The Social Dilemma.” First, I should say that I highly recommend it. It’s a bit different from most documentaries in that it includes a fictionalized narrative element running throughout—some reviewers compare this aspect, not unfairly, to an after-school special. It is a bit hokey in that way, but also, I’ve found as I think back on the film, it is those narrative moments that really stick with me. In one such recurring sequence, Vincent Kartheiser (you know him as Pete Campbell from “Mad Men”) plays an anthropomorphized algorithm (actually there are three of him; it’s weird), and the Pete/algorithm manipulates an avatar of a teenage boy, feeding him notifications for videos, tagged photos, and advertisements throughout the day. The algorithm is jealous of the boy’s time, as when the boy tries to resist using his phone for a week. Spoiler alert: the algorithm wins.
That’s the narrative aspect of the film; the documentary aspect features former employees and executives of all the major Silicon Valley firms including Facebook, Google, Instagram, Pinterest, Twitter, Uber, and so on, engaging in a massive, dramatic mea culpa. Each talking head details the way their former companies traffic in our attention. There’s a discussion about the actual product that social media companies sell—perhaps you think it’s advertisements; it’s not, it’s us. Or, rather, it is the ability to change our behavior. Social media users are lab rats, subject to behavioral experiments by companies that make money by manipulating us and sustaining our attention. The famous quote from Edward Tufte says it best: “There are only two industries that call their customers ‘users’: illegal drugs and software.”
I know that we all know this, but we get something out of the deal too, namely, connection. Before the pandemic, I had mostly stepped away from social media. I deleted Twitter, Facebook, and Instagram from my phone, and it felt great. I turned off all news notifications except for New York Times breaking news, and in our house we subscribed to print magazines and the weekend paper edition of the Times. This worked really well for the better part of a year, until the quarantine. Slowly, I found myself itching to know what other people were up to. I fired up Facebook or Instagram in my phone’s browser from time to time. Mostly, I saw that I wasn’t missing much, in part because I had rigged my Facebook timeline to show me news before posts from friends. Twitter was a bit different, however. Signing back in there, I had serious FOMO. A lot of academic chatter (I almost called it discourse…it’s not) happens on Twitter, and I felt that I had been missing out. As the months went on, I found myself on social media more and more, and I suspect many of you have as well.
In light of this, it’s perhaps not surprising that “The Social Dilemma” has been so popular; it has consistently been in Netflix’s Top 10 since it debuted in September. Again, that social media is manipulating us, that it turns us “users” into products, is probably not surprising—though the extent to which this is done may be. But this is where I return to Louis T. Milic, who acknowledged—in 1966—that we don’t fully understand the machine, but who envisioned a future in which we would understand and thus be able to collaborate as equal partners with computers. It turns out we actually understand less now than we did then, and we are not collaborators, not equal partners, but subjects of the technology we have created.
About halfway through “The Social Dilemma,” a series of interviewees reveal that the algorithms the social media companies use have become so complicated that you could call them intelligent.
Cathy O’Neil, data scientist and author of Weapons of Math Destruction, says that algorithms are “opinions embedded in code.” She reminds us that “algorithms are not objective.”
The goal of these algorithms is to sustain our attention, and Jeff Seibert, a former Twitter executive, says, “no one really understands what [algorithms] are doing in order to achieve that goal.”
And Bailey Richardson, formerly of Instagram, says, “the algorithm has a mind of its own…the machine changes itself.”
Finally, Sandy Parakilas, former manager at Facebook and Uber, tells us, “there’s only a few people who understand how those systems work, and even they don’t fully understand what’s going to happen with a particular piece of content.” What does this mean? Parakilas makes it frighteningly clear: “as humans, we’ve almost lost control over these systems because they’re controlling…the information that we see, they’re controlling us more than we’re controlling them.”
In 1966, Milic saw the potential for good that would come if the computer’s intelligence and ours could be “made complementary not antagonist or subservient to each other.” Certainly a lot of good has come from that complementary relationship, but when it comes to social media—the way so many of us interact most with technology every day—we have ceded control. We’ve allowed artificial intelligence, programmed by people whose only real interest is to increase their own wealth, to manipulate us and to turn our attention into assets to be bought and sold.
This feels like the point in the essay where I make a grand proclamation about what I intend to do about it and what I think you should do too. But I’m not going to do that. I don’t know what to do. I do know that just this past week the heads of Facebook, Google, and Twitter were called to testify (again) in front of Congress about their companies’ role in the dissemination of information—real and fake—during this long election season. I know that the stakes go beyond our personal lives and to the very fabric of our democracy. I know that I don’t like being manipulated; I don’t want to be a lab rat. But I know that I am a lab rat; I’m a user, and I’ve been a user for nearly twenty years. I don’t know if I have the willpower it takes to get clean.
Mental Health Break
Here’s a picture taken from my office looking down at the back of the house this morning. And they said we’d have no accumulation along the coast…
What I’m Listening To:
So much writing is happening this week, which means music without words. It’s also been rainy and stormy and (today) snowy, so dramatic post-rock bands have been the soundtrack. This means my old buddies Caspian, of course, but also (and mostly) one of Caspian’s earliest influences, the Japanese band MONO. If I’m not mistaken, MONO was the first non-local band that Caspian opened for way back in, maybe, 2005? That time in my life is a bit…blurry. Anyway, here’s some MONO:
What I’m Reading:
One of the things I’m working on is an upcoming talk about the importance of narrative in these pandemic days, and to that end I’ve been reading Duke University professor Priscilla Wald’s important 2008 book Contagious: Cultures, Carriers, and the Outbreak Narrative.
Jason will be back next week…