Probably no. The outage, massive and unprecedented as it was, turned out to be a technical glitch. A horrible day for Facebook IT, to be sure, and it produced some epic tech-on-tech trolling, but will it really give fresh legs to calls to "break up Big Tech"? That feels less likely today than it did yesterday. For one thing, bad as the outage was, its main victims were small businesses that depend on Instagram and Facebook to reach customers. Yes, they lost a day's worth of orders, but they are a widely dispersed constituency of relatively small-fry economic players. What's more, Facebook's size and reach are in many ways precisely what they've signed up for. For them the outage is largely an IT screwup, not an antitrust issue.
Second, with the exception of Facebook itself, the outage didn't pose a major problem for the operations of large, powerful corporations with lots of lobbying power. If we'd seen a similar outage at a cloud services provider — say, a Microsoft or an Amazon Web Services — whose servers hum for some of the world's richest and most influential companies, we might be having a very different conversation right now. In the end, the Great Facebook Outage of 2021 could well be remembered as an embarrassing blip rather than a regulatory watershed. So long as it doesn't happen again.
Maybe yes. The Tuesday morning Capitol Hill testimony by former Facebook employee Frances Haugen was an altogether more serious challenge for the company. Over the weekend, Haugen revealed herself as the person who in recent weeks had leaked to the Wall Street Journal a damning trove of internal documents showing that Facebook was keenly aware of the harm that its products inflict on children but had chosen to place "profits over people." Facebook's own executives went to Capitol Hill to deny this charge last week, but they've already suspended development of two kid-focused products on their Messenger and Instagram platforms.
A critical point of Haugen's remarks before the Senate subcommittee on Tuesday concerned the need to tackle harmful content not by playing endless whack-a-mole with individual posts, but by regulating the algorithms that are tuned to maximize profit by serving up the most toxic and addictive content. If the Department of Transportation has oversight of the safety of our cars, she asked, why shouldn't the government have similar power over the safety of our algorithms?
Moves by lawmakers on that front would mark a significant step in tech regulation. Could there be bipartisan movement in that direction? Republican Senator John Thune has already sponsored bills to regulate algorithms, while his Democratic colleague Richard Blumenthal, who chaired the subcommittee hearing, pronounced it a "Big Tobacco moment" for Big Tech, invoking the regulatory kneecapping of the cigarette industry over the past 20 years.
Other near-term measures could include a strengthening of child privacy protections for social media. Facebook itself seems resigned to the likelihood of new rules: while the company cast doubt on Haugen's expertise, it did say it is "time for Congress to act" to create clearer rules of the road for the internet. (Naturally, Big Tech will seek to influence the writing of those rules.)
But how far will any new bipartisan bonhomie on regulating Silicon Valley go? The need to protect kids is something most agree on, but there are still big divides between Republicans and Democrats over broader issues of "Big Tech regulation." Democrats tend to focus on the need to police misinformation, squelch hate speech, and — particularly among progressive lawmakers — curb the market power of Big Tech players. Republicans, meanwhile, have focused chiefly on Silicon Valley's perceived liberal bias and stifling of free speech.
Bottom line. Haugen's testimony has concentrated lawmakers' minds, but it's unclear how far any new regulatory push will go. Big Tech has big problems, but also lots of users and deep, deep pockets.