“I believe we need a more active role for governments and regulators. By updating the rules for the internet, we can preserve what’s best about it — the freedom for people to express themselves and for entrepreneurs to build new things — while also protecting society from broader harms.”
Believe it or not, it was Mark Zuckerberg who wrote those words, calling for external oversight of Facebook and other social media giants.
In a column for the Washington Post last weekend, the Facebook founder wrote about the need for regulation in four areas: harmful content, election integrity, privacy and data portability. The latter is the idea that internet users should be able to transfer their data from one service to another, and not have it be locked down by a particular platform.
Many have questioned the motives behind Zuckerberg’s about-face, wondering if it’s a public relations stunt to shift perception following a series of crises and controversies involving the company.
Other critics, though, say the real story is in what the column leaves out.
While privacy violations, hate speech and election meddling are undeniably concerns when it comes to the influence Facebook has on our lives, some experts argue the root cause of those problems hasn’t been addressed — and won’t be through tighter internet regulation alone.
“We need to look at what’s not in this missive, which is market power,” said Prof. Dwayne Winseck of Carleton University’s School of Journalism and Communication.
‘Restraining dominant market power’
Winseck teaches a course on internet governance. He says with its behemoth scale and singular control over the data of its users, Facebook is a “dataopoly.”
A company with a monopoly in a traditional, non-digital industry is able to charge consumers higher prices for goods or services due to the lack of competition. In the case of a dataopoly, the results of that unrivalled power can be less privacy, degraded quality of service, and political and social consequences, writes Prof. Maurice Stucke, an antitrust expert at the University of Tennessee College of Law.
With more than two billion users who have few, if any, alternatives to the massive social network and its various platforms — which also include Instagram and WhatsApp — there is little incentive for Facebook to change the way it does business.
Winseck says this is clear in the company's "take-it-or-leave-it terms of service." Even if users are uncomfortable with some of Facebook's practices, they have no choice but to accept them if they want to use the social network.
And in addition to its enormous reach, Facebook is a closed system. It controls everything from the service offered to users, to the data those users generate, to the revenue that comes from selling ad space based on that data — ads that can precisely target users based on their behaviour on the platform.
Winseck says to truly resolve the issues raised by Zuckerberg, “we have to start talking about the idea of break-up and restraining dominant market power.”
A campaigner from a political pressure group protests at a meeting on fake news held by the British Parliament’s digital, culture, media and sport committee last November. (Toby Melville/Reuters)
This isn’t the first time Zuckerberg has appeared to publicly sidestep pressing issues.
When he testified before Congress following the Cambridge Analytica scandal, he emphasized the tools available to users to adjust their privacy settings and control how much of what they post can be seen by others. But that didn’t address the real issue being discussed, which was how all that user data is collected and collated and then used for marketing purposes.
So why has Zuckerberg suddenly changed his views on external regulation? It could be the fact that the scrutiny over data collection and misuse was just the beginning of a mounting pile of criticism. More recent controversies have included Facebook’s role in the spread of anti-vaccine disinformation, and its seeming inability to rein in white supremacist hate speech.
After the recent deadly mosque attacks in New Zealand — part of which the gunman live streamed on Facebook — there’s been increased pressure on the company to be more accountable for its role in the spread of harmful content.
As New Zealand Prime Minister Jacinda Ardern put it, “They are the publisher, not just the postman. There cannot be a case of all profit, no responsibility.”
‘Break up that monolith’
Up to now, Facebook’s response to criticism or crises has been largely reactive. To deal with hate speech and foreign political meddling, the company’s strategy has been to implement different internal measures. These often involve deploying more human oversight — not just relying on algorithms to catch harmful content, but having a team of people to intervene as well.
The trouble is — and this is true of all of the big tech companies, not just Facebook — without outside regulation, there’s no one really holding the company accountable. Any effort to remedy the concerns of users or the larger public is ultimately offset by what’s best for the company’s bottom line, says Frank Pasquale, a law professor at the University of Maryland and author of The Black Box Society.
While Pasquale applauds the sentiments expressed in Zuckerberg’s column, he’s waiting to see whether the company actually changes its ways. Does Zuckerberg’s call for improved privacy protection get put into action?
“He is saying all the right things about corporate social responsibility, while an army of Facebook lobbyists, lawyers and fixers continues to undermine regulation,” Pasquale said.
He says Facebook’s public credibility is at an all-time low.
“This is an effort to regain some goodwill.”
For any regulation to ultimately be successful, Pasquale says, what's needed is an "all-hands-on-deck approach," wherein laws that address privacy and harmful content are accompanied by antitrust enforcement.
“A firm encompassing Facebook, Instagram and WhatsApp is simply too big,” he said. “Break up that monolith, and then regulators will have a much better chance at effectively policing the resulting companies.”