
Facebook: The State v. Social Media

What impact will data mining have on Facebook, on other industry giants and on privacy rights overall?

Sasja Beslik, head of sustainable finance for Nordea Bank’s $400 billion wealth management unit, says the bank, one of Europe’s largest financial institutions, won’t buy any more Facebook stock until the air clears around accusations that the popular social network for years gave away private user data to third parties without enforcing strict controls over how that information would be used.

“One-offs, fine. Usually that’s something that a company can manage in a responsible way,” Beslik told Bloomberg News. “But what we are worried about are the systemic issues.” 

Beslik is not alone in putting some distance between his money and the Menlo Park-based social media giant, which went public in 2012 and also owns another popular networking platform, Instagram. Recently, shares in Facebook lost almost 10 percent of their market value after the company was reportedly linked to a U.K. political consulting firm that improperly obtained information on millions of Facebook users in order to target American voters with propaganda during the U.S. presidential election in 2016. 

Lawmakers in the United States and the European Union are now looking into the relationship between Facebook and London-based Cambridge Analytica. Meanwhile, Cambridge Analytica’s board has suspended CEO Alexander Nix after an undercover media investigation showed him boasting about the company’s role in the campaign of President Donald Trump, which, according to Nix, tapped the data-mining firm’s research, analytics and strategy to sway public opinion.

In response to the accusations, Facebook CEO Mark Zuckerberg said his company is taking steps to address concerns raised by the reported breach, even though the company has long had terms of service for developers and privacy settings for account holders.

At the heart of this case, however, are not only Facebook’s internal controls but also the systemic technology that weaves the infrastructure of social media into the fabric of our lives and the modern Internet. Interconnections between social media platforms and third-party services let consumers share personal data, news articles or financial information, easing the flow of commerce between websites and end users.

These interconnections are built on what are called application programming interfaces, or APIs. Using APIs, app developers can create new tools that improve or deepen the user experience; in return, developers gain access to pools of consumers. While data-sharing APIs raised privacy concerns as early as 2005, their adoption grew quickly as companies moved more and more activity online, looking for ways to add value to the experience and, ultimately, to reduce transaction costs.
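To make the mechanics concrete, here is a minimal sketch, in Python, of what a third-party app’s API call typically looks like once a user grants it access. The host, endpoint, token and field names are hypothetical placeholders for illustration; they are not Facebook’s actual Graph API.

```python
# Hypothetical sketch: a third-party app reading profile data over a
# platform API after the user has consented. All names are placeholders.
import requests

API_BASE = "https://api.social-platform.example/v1"        # placeholder host
USER_TOKEN = "token-granted-when-the-user-installs-the-app"  # placeholder token


def fetch_profile(token: str) -> dict:
    """Return the basic profile fields the user agreed to share with the app."""
    resp = requests.get(
        f"{API_BASE}/me",
        params={"fields": "id,name,likes,location"},  # fields the app requests
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()


profile = fetch_profile(USER_TOKEN)
print(profile["name"], len(profile.get("likes", [])))
```

The point of the sketch is the exchange itself: the user gets a richer app experience, and the developer walks away with structured data about that user.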

It was an API feature called “friends permission” that enabled app developer Global Science Research (GSR) to retrieve Facebook data and give it to Cambridge Analytica in 2014, the same year Facebook decided to terminate the feature. GSR was headed by Cambridge University psychologist Aleksandr Kogan, who built a personality test for Facebook users. The test automatically downloaded the data of friends connected to people who took the test. While Kogan’s app only attracted 270,000 users, GSR reportedly exploited the permission feature to collect data on more than 50 million Facebook accounts.
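Some rough arithmetic shows how a friends-level permission turns a small base of consenting users into tens of millions of exposed profiles. The average friend count below is an assumption for illustration only; the figures taken from the reporting are the 270,000 quiz takers and the 50-million-plus affected accounts.

```python
# Back-of-the-envelope sketch of the "friends permission" multiplier.
direct_users = 270_000        # people who actually installed Kogan's quiz app (reported)
avg_friends_per_user = 200    # assumed average friend count, for illustration only

# Upper bound on reachable profiles, ignoring overlap between friend lists.
reachable_accounts = direct_users * avg_friends_per_user
print(f"{reachable_accounts:,} profiles reachable")  # 54,000,000
```

Even with heavy overlap between friend lists, the multiplier explains why the affected population dwarfed the app’s own audience.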

So what does this potential scandal mean for Facebook, the industry and privacy rights in general?

First, the scrutiny and risk of regulatory action could affect business-as-usual at other major networks such as Twitter and Microsoft’s LinkedIn, as well as large companies outside the tech sector that make user data available to external developers through APIs.

Second, as an industry flagship, Facebook will be forced to strengthen or revamp its internal controls to address claims that it willfully ignored warnings that user data was at risk of outside exploitation. To that end, Facebook has hired a digital forensics firm to conduct a comprehensive audit of its ties to Cambridge Analytica, two years after Facebook first learned of the reported data breach.

Most importantly, as Facebook’s global reach extends far beyond the United States to countries like India and Brazil, corporate responsibility, sustainability and transparency take center stage. Virtual operating systems such as Facebook and Instagram are new cultural platforms in cyberspace that shape human behavior. The earlier emphasis on growing the user base must now be reconciled with enforcing etiquette across global, mobile, yet still adolescent networks of interactivity. Internal controls should dovetail with a new industry bill of rights that balances the competing needs of safeguarding personal data, innovating technologically and complying with universal norms of civil society. The industry has to close ranks like never before.

Social media has matured to the point where its influence is woven into daily life in America. Operating systems serve as contemporary forms of trust and socialization. Gatekeepers like Facebook now have a semi-sovereign duty not only to promote the general welfare of virtual communities but also, to some extent, to enforce a code of ethics or rule of law for everyone who lives there.

Daniel Hicks is an adjunct faculty member in the University of Miami School of Communication.