Facebook COO Sells $23M in Shares, Company Declines to Attend House Hearing on “Social Media Filtering Practices”

Facebook’s chief operating officer, Sheryl Sandberg, sold $23 million of the company’s stock on Wednesday as governments in the EU move quickly to implement new online privacy laws that would significantly limit the social network’s advertising practices, and thus its income.

Sandberg is arguably one of the most powerful and influential women in technology. As Facebook’s COO and the head of the company’s advertising operations, she has been blasted by experts for her role in the Cambridge Analytica scandal, and she apologized profusely once news of her involvement was made public. However, despite usually being comfortable in the spotlight, Sandberg has retreated from center stage amid the legal probes Facebook is currently facing, resulting in Mark Zuckerberg’s solo appearance before Congress last week.

In a string of appearances scheduled before the congressional hearings, Sandberg, the social media site’s second-in-command, affirmed that Facebook’s main source of income is advertising. In other words, collecting data on its users is how and why the service remains free.

“The service [Facebook],” Sandberg reminded the public in an interview last Friday, “depends on your data.” Completely opting out of data-based targeted ads, she asserted, would have to be a paid option.

Experts have been quick to point out the aggressiveness of Facebook’s data collection practices, especially its use of shadow profiles, which collect data on people even if they don’t have an account with the social network. Before Congress, Mark Zuckerberg flatly denied any knowledge of shadow profiles, even though the practice has been well known since 2013, when the company’s data collection on non-users was revealed during a similar data-mishandling ordeal.

[Related: Facebook Dodges New EU Privacy Regulations]

On the question of responsibility for the misuse of data and of future privacy regulation, Sandberg has been almost overly apologetic. However, it is still unclear what steps the company has taken since the story first broke in March.

“We know that we did not do enough to protect people’s data. I’m really sorry for that,” she said. In a separate instance she apologized yet again, saying, “This was a huge breach of trust. People come to Facebook every day and they depend on us to protect their data, and I am so sorry that we let so many people down.” She could not promise that data is completely safe for now, adding that “we are going to find other things” and “there will always be bad actors.”

Sandberg would not comment on whether anyone at Facebook had lost their job over the scandal, saying, “We don’t talk about this publicly and we’re not going to; we don’t think it’s the right thing to do.” Hired in 2008, the former Google advertising chief joined the social network precisely to consolidate the company’s ad-based business model. Facebook’s then twenty-something Mark Zuckerberg, reclusive and struggling with investors, brought Sandberg on to be the mature face of the company.

Analysts are still in disagreement over the immediate financial future of Facebook, whose stock price took a sharp dip after the harrowing news about personal data leaks. On Wednesday, Sandberg sold 163,500 shares of Facebook stock for a total value of just over $23 million. Over the course of 2017, Sandberg sold $316 million worth of shares, with over half that amount sold in the first half of the year, according to CNBC. Sandberg has sold shares on a consistent basis over the past several years, yet the future of the company remains uncertain in light of dramatic changes and controversies.

A report from CNBC on April 10 highlighted a claim from Brian Wieser, a senior research analyst at Pivotal Research Group, that predicted a role shift for either Sandberg or Zuckerberg. “The company is not well managed,” said Wieser, who also claimed that “one of Zuckerberg or Sandberg will not be in the same jobs in 12 months’ time.”

Most recently, Facebook has seen a modest uptick in active users: it was reported April 25 that “Facebook’s daily active users in North America rose slightly last quarter to 185 million, a sign that the company’s News Feed algorithm tweaks and data privacy issues may not have deterred consumers.” This news may signal that the public is relenting on Facebook’s conduct. However, it is worth noting that Facebook has declined an invitation to testify at the upcoming House of Representatives hearing, “Examining Social Media Filtering Practices and their Effect on Free Speech,” which will discuss “what metrics social media platforms use to moderate content, how filtering decisions are made, and whether viewpoints have been silenced on some of the most popular and widely used platforms.”

Trust Lost: How Social Media Users’ Data Should Be Protected

Over the past few days, there has been public outrage over the way Facebook handles personal data. The recent Cambridge Analytica scandal brought this to light, but it should really come as no surprise: Facebook has been treating its users this way since it launched back in 2004.

What Happened with Cambridge Analytica

The story that broke over the past few days is really just a piece of a much larger issue with Facebook and how it handles personal data. To summarize, a developer named Aleksandr Kogan built an application in 2014 offering a personality quiz to Facebook users. About 270,000 users took the quiz, but in doing so they granted Kogan’s app access not only to their own Facebook data, but to the data of ALL of their Facebook friends as well, meaning the app now had data on 50 million users. Kogan then provided this data to Cambridge Analytica, which used it to create over 30 million psychographic profiles of potential voters.

Facebook is at fault for a major data breach because it failed to protect the personally identifiable information of its users.

1) Data Policy

Up until 2014, Facebook’s policy allowed an application developer to ask permission from Facebook users to access their data. However, it also allowed apps to collect that same data about ALL of that user’s friends on Facebook, without their consent. Facebook changed this policy in 2014 to ensure that apps could not collect data on users’ friends, but by that point, the damage had been done.

An ex-Facebook employee on the privacy team stated, “At a company that was deeply concerned about protecting its users, this situation would have been met with a robust effort to cut off developers who were making questionable use of data. But when I was at Facebook, the typical reaction I recall looked like this: try to put any negative press coverage to bed as quickly as possible, with no sincere efforts to put safeguards in place or to identify and stop abusive developers. When I proposed a deeper audit of developers’ use of Facebook’s data, one executive asked me, ‘Do you really want to see what you’ll find?’”

2) Data Collection

The crux of the issue lies in the data Facebook requires from new users in the first place: you are forced to provide your first name, last name, email or phone number, birthday, and gender. The reason is simple: this information is a marketer’s (or a politician’s) targeting dream, and also the key to Facebook’s revenue and entire business model.

Facebook, however, does share blame with its users, because when you sign up you agree to hand over rights regarding your personal information, so there is some level of consent. However, Facebook further deceives and confuses its users into thinking their information will not be disclosed by providing layers of permissions and privacy settings. If users were more aware of how much of their personal data was actually public (or being shared with governments), they would be far more reserved about giving it away. Your personal data on Facebook is not private, and Zuckerberg has known this since the beginning. He has attempted to apologize, but he has demonstrated no tangible actions to actually address the public concern.

How can you protect YOUR data?

1) Allow users to be anonymous if they want, untracked and free from surveillance and spying.

Anonymity means that site data is de-identified and not traceable to a person. Because the data cannot be tied to an individual, it can be public by default without putting anyone at risk.
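Minds has not published its de-identification code, so as a rough illustration of the general idea only: a salted one-way hash can replace a real identifier with a stable token, so activity can still be aggregated without revealing who produced it. The function and salt names below are hypothetical.

```python
import hashlib
import secrets

# A per-deployment secret salt; without it, an attacker could reverse
# the hashes by brute-forcing known usernames (a dictionary attack).
SALT = secrets.token_bytes(16)

def pseudonymize(user_id: str) -> str:
    """Replace a real identifier with a stable, non-reversible token."""
    return hashlib.sha256(SALT + user_id.encode()).hexdigest()[:16]

# The same user always maps to the same token, so usage statistics
# still work -- but the token reveals nothing about who the user is.
token_a = pseudonymize("alice@example.com")
token_b = pseudonymize("alice@example.com")
assert token_a == token_b
```

Real-world anonymization needs more than this (rare attribute combinations can still re-identify people), but the principle is the same: strip or transform anything traceable before data leaves the user's control.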

2) Maintain zero-knowledge on sensitive data.

This is essential to ensuring that users can chat freely with each other without concern that the conversation is being monitored by anyone, including Minds. All sensitive data on Minds is encrypted end to end, whether in motion or at rest, and original content remains the property of the user.
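This is not Minds’ actual implementation, but the end-to-end principle can be sketched in a few lines, assuming Python’s third-party `cryptography` library: each device derives the same session key through a Diffie-Hellman exchange, so the server only ever relays ciphertext it cannot read.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a keypair on their own device; only the
# public keys are ever sent through the server.
alice_sk = X25519PrivateKey.generate()
bob_sk = X25519PrivateKey.generate()

def session_key(own_sk, peer_pk) -> bytes:
    """Derive a shared symmetric key via X25519 Diffie-Hellman + HKDF."""
    shared = own_sk.exchange(peer_pk)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"chat-session").derive(shared)

# Alice encrypts with a key the server never sees.
key = session_key(alice_sk, bob_sk.public_key())
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"meet at noon", None)

# Bob derives the identical key independently and decrypts.
bob_key = session_key(bob_sk, alice_sk.public_key())
plaintext = ChaCha20Poly1305(bob_key).decrypt(nonce, ciphertext, None)
```

The zero-knowledge property falls out of the key exchange: because the symmetric key is derived on each device and never transmitted, the platform operator holds only ciphertext and cannot monitor the conversation even if compelled to.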

3) 100% free and open source for public accountability and inspection.

Unlike the top proprietary social networks, social media should be open source. You must be able to inspect the code and even help contribute to and build the network. This provides much-needed community ownership and transparency into what the platform is actually doing, as opposed to simply taking its word for it.

This debate exposes the paradox between transparency and privacy, both of which are core principles of Internet freedom pioneers. Facebook has walked into a deadly trap: pretending to give people privacy through layered permission levels and supposed ‘privacy’ settings while also exposing massive amounts of data without consent. It is handing over data to the highest bidders, and the user has lost all control.

Social networks must give that control back to the people. This is only the beginning of the privacy movement, and we must all join together for a better future for everyone.

About the author:
Bill Ottman is an American internet entrepreneur, freedom of information activist and hacker based in New York City, best known as the CEO and co-founder of Minds, an open-source social networking service. He is a graduate of the University of Vermont, and co-founded Minds with John Ottman and Mark Harding in 2011.