
No, Facebook does not just “show people what they want”

And “we’re just following the data” is a bad argument, anyway
Published on Oct 28, 2017

This started as a Twitter thread in response to some arguments about Facebook’s responsibility for its effects on society. I’ve expanded the argument here.

Okay, you want proof that Facebook doesn’t just “show people what they want?” I’ll give it to you. It’s known as clickbait.

I worked at Upworthy, which invented clickbait. Guess what? Despite the narrative in media circles, most people liked it, by almost every conceivable measure. We have the receipts: people didn’t just click, they spent huge amounts of time with content, shared it, and came back for more. Most people didn’t feel duped or manipulated when they clicked on our stories. If they had, we would have seen it in the form of enormous bounce rates and tiny share rates. We saw the opposite, mostly because our editorial team was very careful about making sure the stories delivered on the promise of the headlines.

But some people, especially people in media circles, hated clickbait. I understand why. The style was and is, frankly, grating. It was also easy to copy, and lots of unscrupulous folks who had no qualms about being manipulative borrowed the tactic. So, media pundits spilled a lot of virtual ink criticizing it. They even tried to re-define words like “smarm” to describe it. In the end, the criticisms worked. Facebook decided to put a stop to clickbait.

But I’m not sure anyone ever stopped to ask why Facebook made that decision. After all, according to the data, their users loved clickbait. In fact, in those days Facebook representatives fairly regularly asked us if we would share insights, claiming we knew more about how users were responding to content than they did.

The ‘Ideal’ Feed

So what happened? Fortunately, it’s pretty easy to find out, because Mark Zuckerberg and then-VP of Product Christopher Cox told Mike Isaac and Peter Kafka. What happened is that Zuckerberg and Cox read all of that criticism and became convinced that clickbait was bad for their users, even if those users didn’t know it. They decided that what their users really needed was “high quality” news, even if they didn’t signal to Facebook that they liked those stories. So they started a project to create the “ideal” News Feed, free of the tyranny of clickbait, a project that continues to this day.

“Facebook Wants to Be a Newspaper. Facebook Users Have Their Own Ideas.” (allthingsd.com)

Their immediate News Feed changes cut Upworthy’s traffic in half overnight. Literally. We all remember the day — Thursday, December 19, 2013 — when the graphs changed. Everyone else in media was hit by the change, too.

So, yeah, you can spare me the “Facebook just does what users want” bullshit. Facebook has decided over and over again what’s good for users based on the opinions of its executives. Now, in the wake of a devastating political event, we’re asking them to think about what’s good for society and suddenly, magically, that capability seems to have vanished.

Ironically, in some ways I think that shows growth. Looking back, it is crazy that in 2012 and 2013, Christopher Cox took it upon himself to decide what quality of news people needed, and what constituted quality news in the first place.

But it seems clear now that clickbait, and the ensuing fallout, was a tipping point. The changes that began as Cox’s quest to eliminate clickbait and create the “ideal” News Feed led to today’s ever-minimizing, ultra-personalized stream of content. And Facebook’s habit of using “just following the data” as a shield to avoid responsibility for its impact on global society started as a way to justify changes to the News Feed. In fact, Facebook started its News Feed FYI blog series in August 2013 specifically to message around those first changes. In that first post, the company explains that all the News Feed does is “let users decide” and “listen to feedback.”

Just Following the Data

As the story I’m telling here shows, the problem with that excuse is that it’s really hard to “just follow the data.” I work with scientists every day at my company, Massive, and one of the things I’ve learned from them is that even the best scientists can unintentionally arrange data to fit their pre-existing beliefs. That’s why the scientific method and all of its safeguards — peer review, protocols, and so on — exist: to counteract the natural, human tendency to rearrange the world to fit your beliefs.

If you read the AllThingsD article I mentioned above (and you really should — it’s remarkable), it describes exactly that: Zuckerberg and Cox trying to get the data about what kind of news users want to click on to fit their beliefs about what users should click on. There’s nothing wrong with that process on the face of it — in fact, it’s probably what I would do if I were in their position and believed clickbait was bad.

The problem is that you can’t search for fit while simultaneously claiming to be a neutral actor just “letting users decide.” Facebook’s product decisions have an enormous impact on user behavior, which creates the data they “follow.” When scientists “follow the data,” they put a ton of effort into controlling for the impact their presence has on the data they collect. Facebook doesn’t. It is essentially a giant, global, uncontrolled experiment run by people who seem uninterested in even understanding what they’re really doing.

Facebook, Nation

That’s a terrifying thought, but I’ve found a way to make some sense of it. I’ve come to believe that Facebook and similar companies are better thought of as nation-states than as companies. I mean that quite literally. When I worked at another giant tech company, Google, I got more than a taste of the types of cultures they create.

Life at those companies is awesome, in the full sense of the word. So awesome — from smart people to free food to interesting, endless work — that it starts to subsume your life, especially if you’re young and it’s one of your first jobs. You start to feel an allegiance to the company that provides so much for you, and you start accepting more and more of the internal logic of the place. Don’t be evil. Our great mission is to organize the world’s information. Google+ is a good product (:trollface:).

At times during my tenure, I felt so aligned with Google that I found myself asking whether I had more allegiance to my company or to my country. I frequently ran a thought experiment: if Google asked me to do something illegal (but moral — hey, it’s my thought experiment) and promised to whisk me away to Paris and take care of me forever if I did, would I do it?

On day one of Intro to International Relations (which I studied in college for some reason), you’re taught not to assume that other entities use the same logic as you when they make decisions. It’s one of the foundational principles of the field. You’re also taught that the personalities of individual leaders within organizations matter a lot more than history books usually indicate.

I think both of these principles apply well to big companies like Facebook and Google.

To us on the outside, it is crystal clear that if Facebook can influence purchasing decisions, it can influence voting decisions. But using Facebook’s internal logic, they’re “just following the data” and trying to give individual users exactly what they want. They don’t believe they really have an impact on their users’ behavior. And they are so focused on individual user needs that I suspect they have very little concept of the larger effects of the system and, yes, society they’ve created.

So, while it makes sense that an advertiser using Facebook can convince an individual to spend money on something, it probably seems crazy to many people on the inside that the platform could have had any impact on a national election at all.

This logic is driven by the personalities of the people in charge. Zuckerberg, in particular, seems to sincerely care about individual user needs above all else — remember the infamous squirrel news comment? And Cox was so moved by criticism that (in my admittedly slanted view) had little evidentiary backing that his team has spent the last 4 years fighting clickbait while missing the growing problem of deliberate misinformation almost entirely. The changes they continue to implement haven’t just impacted so-called clickbait producers, of course. They’ve hurt the entire news industry and, as we’re now finding ourselves arguing, the foundations of democratic society itself.

There’s actually some good news buried here: Facebook can be moved. But, like a good international relations analyst, make sure you understand how a foreign entity is likely to respond before you act.
