
Influencer marketing, fascism, and the hypernormalization of culture

Everything is fine.
Published on May 27, 2021

The latest Harper’s has bad news for those of us who decided to pursue conventional career paths:

Whereas the average college grad rakes in $50,000 a year, an influencer with more than a million followers could net that much in a month. And at a moment when virtually every other industry is shrinking if not flatlining, the influencer marketing industry is projected to be worth $15 billion by 2022 and currently accounts for roughly 15 percent of the total global ad spend.

The story, which follows an English professor’s mid-2020 visit to a TikTok “collab house,” where influencers are given free room and board in exchange for making content for brands, covers a surprising amount of territory, from economics to the diminishing role of the academy to politics.

One particular section stands out, a discussion about whether these young influencers have any kind of real power. The boys are obviously (if not consciously) conflicted, in one breath asserting that the government is afraid of the voice and influence TikTok gives them, and in another explaining that they never get involved in political discussions, lest engaging in controversial topics make them less attractive to the brands who pay their bills.

The piece’s author, University of Wisconsin professor Barrett Swanson, reads this cognitive dissonance against Walter Benjamin’s 1935 essay The Work of Art in the Age of Mechanical Reproduction, which remains shockingly relevant some 85 years later.

Benjamin suggests that fascistic governments aim to maintain the status quo by providing citizens with the means to express themselves aesthetically without reforming their lives materially. Thus the aforementioned government that Brandon, one of the boys, thinks TikTokers have scared shitless actually, as Benjamin writes, “sees its salvation in granting expression to the masses—but on no account granting them rights.”

This reading feels mostly right, but the metaphor may be just a bit off. In this case, the fascistic regimes in question aren’t nation-states (indeed, one of the few things our own government can agree on is that big tech needs to be regulated), but the tech companies themselves.

As I argued way back in 2017, ad-tech companies operate a lot like fascist regimes, both in the way they organize and motivate their employees, and in the way they police their platforms and users. They’ve built companies of enormous economic value based on the perception that, by spending money on their platforms, brands can change the behavior of users. Influencers, as the very name implies, are merely the latest, and most direct, way that platforms are keeping that perception afloat.

The youths Swanson interviews demonstrably do have the ability to influence behavior, often en masse. Successful TikTok videos can quickly become memetic, racking up tens of millions of views and spawning thousands of copycats and news stories. And this has real cultural power. Or at least, companies large and small are betting it does. As Swanson notes at the beginning of the article, brands are willing to spend increasingly large portions of their ad budgets to harness this potential power.

And yet, as a Freakonomics episode from last year explored, there’s not a lot of good evidence that digital advertising, or even advertising as a whole, is particularly effective at changing behavior. At best, its influence on consumer behavior appears to be grossly exaggerated.

How can this be true when it’s also clear that social media platforms have a huge impact on global life and culture? There are several likely explanations, but I believe Benjamin offers the simplest and most compelling one. Like the fascist regimes of his day, social media platforms have built spaces for expression that operate on purely aesthetic terms, their terms. Within the carefully designed confines of the platforms, memes can go insanely viral and generate enormous amounts of attention, at least according to the metrics that the platforms themselves invented (and, in some cases, inflate). This energy can and does spill over into other media spaces like TV, but once it does, it usually loses its momentum. It simply cannot survive contact with more material media.

Despite that crucial shortcoming and the lack of real evidence, the idea that social platforms can influence behavior beyond their own boundaries has become reality for an ad industry already built on shaky epistemological ground. And digital ad platforms all too willingly feed this delusion because they have trillions of dollars at stake in maintaining the status quo.

On Freakonomics, Tim Hwang, a former Google employee who’s currently a research fellow at the Center for Security and Emerging Technology (CSET) at Georgetown University, likens it to a financial bubble:

One of my favorite arguments that people are using right now is, “Companies wouldn’t put money into this if it didn’t work. So, isn’t that proof that the ads actually work?”

Which is kind of this crazy circular mind maze, if you think about it. But I do think that, again, this is very parallel to the kinds of psychology that have driven market bubbles in the past.

Because the same illogic applies to both everyday users of social media and its paying customers, I believe the platforms have gone beyond creating a bubble, achieving a sort of “hypernormalisation.” Coined by anthropologist Alexei Yurchak to describe the dissonance of life in the late Soviet Union, when everyone could see the system failing but no one could imagine an alternative, and extended by BBC documentarian Adam Curtis, hypernormalisation is the self-fulfilling prophecy of accepting fakeness as normal for long enough that the facsimile becomes the reality:

There is a sense of everything being slightly unreal; that you fight a war that seems to cost you nothing and it has no consequences at home; that money seems to grow on trees; that goods come from China and don’t seem to cost you anything; that phones make you feel liberated but that maybe they’re manipulating you but you’re not quite sure. It’s all slightly odd and slightly corrupt.

Or, as Swanson puts it so eloquently in Harper’s:

The news has just broken that Oracle, an Austin-based company, was the final bidder for TikTok, ensuring that the app will continue to be available in the United States. Democracy is dying, the university is crumbling, and ash is raining from the sky. But for now, at least, the platform and the entire ecosystem it spawned has been saved. “To the best house ever!” the kids shout, and as their glasses begin clinking, my mind goes strangely quiet.

Who can really blame them? Somewhere along the line, what started as a way to find information, keep in touch with people and meet new ones, and share ideas became something more. Friendships, culture, movements, and careers blossomed from it. But so did hatred, violence, and destabilization. And yet, as with quantifying the impact of advertising, drawing a causal link between culture online and behavior offline is notoriously difficult. Sure, the January 6 insurrectionists organized much of their activity on Facebook, and right-wing messaging on the platform almost certainly reinforced their mistaken belief that the election was rigged. But the actual violence occurred on the ground, well away from the platform, and was stoked not by digital content but by an in-person rally held by the then-President of the United States.

The result is that we all know that the lives we live on social media have become slightly odd and corrupt. We know that they are both extremely important within their self-contained aesthetic spaces and essentially meaningless beyond them. And yet, like bureaucrats in the last days of the Soviet Union, because we can no longer imagine life without them — because many of us make our living off of them (or believe we do) — we pretend that everything is fine.

Connections

A Supplement to this Pub: A corrective to the disinfo corrective

Maybe the people who are trying to understand why conspiracies take root online are themselves part of a giant conspiracy to prop up the online advertising economy. Personally, I think the truth might be weirder and a bit less nihilistic.
