Opinion: Big Brother, Digitized

As technology evolves, concerns about data privacy on social media have become ever more pressing.

Kat Klinefelter/Snapchat

Surveillance capitalism poses a danger to online spheres and privacy throughout all social media.

Aris Pastor, Co-Editor-in-Chief

The first time I heard the concept behind BeReal, I was skeptical. I never did download the app, but the name itself seemed like an ironic joke—“being real” on a social media platform that’s meant to publicize users’ lives. 

For those unfamiliar with the app, the premise is that users are prompted once a day, at a random time, to take an unfiltered picture using both the front and back cameras to show the space around them. The picture is then shared with the user’s network of friends or with other BeReal users. 

Like so much of the social media used today, I found that BeReal seemed to file down the very concept of privacy. The app became yet another eye into private lives, another room for the panopticon of the internet to stare into. Social media is a deluge of information that reframes our perception through the lens of others’ judgment, and often, selfhood is the sacrifice one must make to fit within the social boundaries created by a sense of constant surveillance. 

As Izzy May from The Daily Northwestern wrote, “By publicizing our inner lives so frequently, people feel increasingly entitled to be a part of it. We plunge deeper into a cesspool of time stamps and photographs, spending hours studying other people’s little moments and losing sight of ourself in the expedition.”

However, the privacy concerns surrounding BeReal, as well as other social media like Instagram or Snapchat, are not limited to the social atmosphere these apps create. BeReal, for example, is allowed to “host, store, reproduce, modify, adapt, display, publish, edit, distribute and sublicense all or part of the Content… to conduct marketing, communication or commercial promotion activities of BeReal” for up to thirty years. Instagram, WhatsApp, and Facebook were all fined millions of dollars over privacy law violations in the past year. Even before Elon Musk bought Twitter, the app faced a $150 million civil penalty for violating its own privacy policy.

Which raises the question: what exactly motivated these breaches of privacy? 

The answer: advertising. 

Shoshana Zuboff, professor emerita at Harvard Business School and author of The Age of Surveillance Capitalism, has researched the exploitation of personal data by media corporations. Zuboff coined the term “surveillance capitalism,” defining it in her 2019 book as “the unilateral claiming of private human experience as free raw material for translation into behavioral data.”

She detailed the selling of such behavioral data—how long we spend on apps, what we look at, even biometric data like facial expressions or eye movements—to advertisers. Social media activity is a valuable resource. It reveals trends, fixations, and behaviors that companies can track and react to in their marketing campaigns. Algorithms have only grown more powerful, accelerated by the rush to digitization in the wake of the pandemic, and by now they are built to make money, whether directly, through advertising, or indirectly, by holding our attention for as long as possible. 

It’s become a meme how personalized the TikTok algorithm is, but in reality, it’s chilling. The TikTok recommendation algorithm demonstrates just how powerful social media’s predictive capabilities can be, and up to one billion users have fallen down the recommendation holes the app opens for them. 

“[F]rom insurance to automobiles to health, education, finance, every product described [is] as ‘smart’ and every service [is] described as ‘personalized,’” Zuboff told The Harvard Gazette. “By now it’s very difficult to participate effectively in society without interfacing with these same channels that are supply chains for surveillance capitalism’s data flows.”

Surveillance capitalism doesn’t limit itself to advertisement, either. Companies have taken note of online controversy and culture wars and harnessed them in their advertising, but social media has also grown to modify behavior in less obvious ways. Facebook’s emotional contagion experiments showed this behavioral shaping at work, as did augmented reality games like Pokémon Go. 

As John Naughton of The Guardian wrote, “It is no longer enough to automate information flows about us; the goal now is to automate us… As one data scientist explained to me, ‘We can engineer the context around a particular behavior and force change that way… We are learning how to write the music, and then we let the music make them dance.’”

The 2018 Facebook/Cambridge Analytica scandal is the perfect case study for how surveillance capitalism reaches beyond advertisement and into politics. In 2015, Cambridge Analytica (CA), a political consulting company that specialized in influencing voters, obtained access to 87 million Facebook users’ personal data. Aleksandr Kogan, a Cambridge University social psychologist, developed a model that mined the data from Facebook and predicted the personalities of the adult US citizens in the sample. This data was then passed to CA, led by Alexander Nix, which worked for Ted Cruz’s and, later, Donald Trump’s 2016 presidential campaigns. Steve Bannon, who went on to serve as Trump’s White House Chief Strategist for the first seven months of his presidency, sat on CA’s board, and Robert and Rebekah Mercer, major Republican Party donors, backed CA financially. 

This scandal reached beyond United States borders, too. In the United Kingdom, the pro-Brexit campaign allotted 40% of its budget, totaling £2.7 million (about $3.4 million), to targeted advertising. Data from CA was used to target ads at individuals, so that those deemed more likely to vote for Brexit were sent more fake news and customized advertisements. 

In conditioning public behavior via subliminal cues, rewards, and punishments, social media has become an undemocratic force, calling into question how much autonomy and human agency technology truly allows us. 

“Democracy is also eroded from without, as surveillance capitalism represents an unprecedented concentration of knowledge and the power that accrues to such knowledge,” Zuboff said. “It’s the knowledge that they have produced from information [that we gave them] that constitutes their competitive advantage, and they will never give that up. These knowledge asymmetries introduce wholly new axes of social inequality and injustice.”

There are laws that defend data privacy. For example, Snapchat’s parent company agreed to a $35 million settlement in a class-action lawsuit over biometric data it stored without permission. The users in that case were protected by the Illinois Biometric Information Privacy Act, which requires companies to disclose any biometric data they collect and store and to obtain users’ consent.

However, most people don’t bother to read privacy policies, and often, if you don’t agree to the terms and conditions set by social media applications, you will have limited or no access to the app. BeReal grants users the right to limit the processing of their personal data, but only after they submit a complaint so that the company can “balance [their] legitimate interests against your request.”

Creating laws that tighten privacy policies will be a long, hard battle. Corporations that rely on advertising, social media companies, and politicians all have their own reasons for keeping surveillance capitalism the way it is. 

“Demanding privacy from surveillance capitalists,” Zuboff told The Guardian, “or lobbying for an end to commercial surveillance on the internet is like asking old Henry Ford to make each Model T by hand. It’s like asking a giraffe to shorten its neck, or a cow to give up chewing. These demands are existential threats that violate the basic mechanisms of the entity’s survival.”

However, Zuboff does have a plan for shifting the law in the right direction. She encourages a mass change in public opinion to spark the development of new laws and regulatory institutions that address surveillance capitalism. Zuboff has also said that advocates for tighter privacy laws will need to find the right arena, or test case, in which to make their argument. In fact, she recently noted in a talk with Politico journalist Mark Scott that the Twitter debacle and Elon Musk’s recklessness may be exactly what advocates are looking for. 

“There are a lot of people now in the U.S. Senate and U.S. Congress who really have a trenchant understanding of the issues,” Zuboff said. “There is beginning to be a critical mass in Washington that not only wants muscular privacy laws, but also is looking more deeply at the whole landscape.”

_________________________________________________________

Editors’ note: All opinions expressed on The Uproar are a reflection solely of the beliefs of the bylined author and not the journalism program at NASH.  We continue to welcome school-appropriate comments and guest articles.