It seems that Facebook has finally run into a scandal shocking enough to make at least some people question the point of being on its social network. I’ve read quite a number of articles and op-eds this week decrying the company, its CEO and its business model. As I read these attacks, all I could think was: why is this latest breach of trust a surprise to anyone? What Facebook did with Cambridge Analytica is only one example of the complete lack of responsibility the company takes for its users’ personal data. There have been many other breaches of trust, and Mr. Zuckerberg has a long history of apologies for lapses in judgement that goes back to before Facebook was even started.
Along with the critiques of Facebook have come various opinions about how to fix the issues at the company. The suggestions range from technical changes to removing the CEO himself, a difficult but not impossible task. However, all of these remedies, even if implemented in their totality, cannot and will not fix this company. The reason is that Facebook’s very existence is based on collecting and sharing the personal information of its users, often in ways that even its founder and leadership team cannot imagine or control. This is nothing new, of course. Everyone knows the story of Facemash, the Hot-or-Not-style site that Zuckerberg created at Harvard and that almost got him expelled. Shortly after its launch, his classmates were livid, and the future CEO issued his first public apology. “I hope you understand,” he wrote, “this is not how I meant for things to go, and I apologize for any harm done as a result of my neglect to consider how quickly the site would spread and its consequences thereafter.”
The last words quoted were prophetic, of course: my neglect to consider how quickly the site would spread and its consequences thereafter. Almost every other major scandal at Facebook can be seen as a variation on this same lack of foresight and understanding of (or caring about) consequences. The routine is familiar by now. Facebook makes an “innovation” that results in harm to its users’ privacy or even to a nation’s institutions. At first the company denies that anything is wrong, until it is shown incontrovertible proof, after which it issues (sometimes faux-emotional) apologies and promises to do better. But it cannot do better, because doing so would mean abandoning the core value creation mechanism of the company. As one astute analyst noted:
The main issue cuts to the core of the company itself: Rather than “building global community,” as founder Mark Zuckerberg sees Facebook’s mission, it is “ripping apart the social fabric.” Those are the words of Chamath Palihapitiya, the company’s former vice president of user growth. He doesn’t allow his kids to use Facebook because he doesn’t want them to become slaves to “short-term, dopamine-driven feedback loops.”
That is the irony of Zuckerberg’s creation. Like so many others before him, the thing he set out to create has become its opposite. His platform for uniting communities has become the thing tearing them apart. As a New York Times piece recently stated:
Consumers’ seemingly benign activities — their likes — could be used to covertly categorize and influence their behavior. And not just by unknown third parties. Facebook itself has worked directly with presidential campaigns on ad targeting, describing its services in a company case study as “influencing voters.”
“People are upset that their data may have been used to secretly influence 2016 voters,” said Alessandro Acquisti, a professor of information technology and public policy at Carnegie Mellon University. “If your personal information can help sway elections, which affects everyone’s life and societal well-being, maybe privacy does matter after all.”
I wrote once before about something I call the Linn Effect, which is when “technology solves one problem but creates the same or more problems at the same time.” I believe that the Linn Effect is universal, and Facebook is yet another example of its manifestation. Eventually someone may create a social network that does not share Facebook’s fundamental design and business model flaws. Until that day comes, I will stay off Facebook, and so will my kids. No technical or policy changes will fix this company or platform. There will be more breaches and more problems. Anyone who stays on Facebook must ask whether any gain it provides is worth turning over their most private data to the world. For me, the answer is a resounding no.