
How to fix Facebook

Ian Bremmer's Quick Take:

Hey everybody and happy Monday. Back in the office, getting a little cool. So I've got my sweater going on. It's the first time I've had a sweater on. What do you do with that? Discussing fashion, as I talk to you about what is on my mind this week?

And what's on my mind this week, Facebook. Facebook is on my mind. It's a tough week for Facebook. There are all sorts of whistleblowers out there. There's testimony going on. There are calls for regulation. Everybody seems unhappy with them. Indeed, you even got the government relations types, Nick Clegg, who I've known for a long time, back when he was a policymaker in the UK, saying that the headlines are going to be rough, but we're going to get through it. But I will say, first of all, I'm kind of skeptical that any of this goes anywhere in terms of impact on how Facebook actually operates.

I mean, if there's anything that's a real threat to Facebook going forward, it's whether or not people themselves, the consumers, start to opt out. Is it a place for young people around the world, or is it really just folks like me who are engaged in posting and all of that kind of thing? But why do I feel that way? Well, one, because government does not in any way agree on what to do.

I mean, on the right, you have a lot of people who think the problem is about culture warriors. It's political incorrectness not being allowed. It's people on the right being taken off unnecessarily. So of course, that starts with Trump. It goes farther with others. It is certainly true that the most viewed pages on Facebook continue to be people like Ben Shapiro and Dan Bongino and Fox News. It's not consistent every day, but it is absolutely the majority. So it's hard to make that argument overall. But certainly, in terms of individual people who are getting canceled and folks who are seen as putting forward what is described as fake news and disinformation on Facebook, whether it's around the elections, or around white nationalism and supremacy, or even around vaccines and the pandemic response, there has been more sensitivity, both in the actions taken and in the responses, from the right than from the left.

On the other hand, on the left, you have people who are saying there's far too much power. This is bad for civil society. It's driving people towards extremes. It promoted all sorts of stop-the-steal behavior and promoted violence as a consequence. And so, it's a very different set of what... When the country is as tribal as it is, the responses to where you see a lot of that tribalism are very, very different.

Secondly, the company itself is of course not interested in taking on principal responsibility for solving the challenges of regulation. I mean, companies always say they'd rather functionally regulate themselves, but they don't want to have direct responsibility for that, because that implies direct accountability to the population. So in other words, this reminds me of what you used to hear from China 10, 20 years ago, which is, "hey, we're small. We are poor. We're weak. Don't look to us for the global solutions, look to the United States for the global solutions." Facebook is kind of doing the same thing. I mean, I don't know if you read Axios this morning, but Facebook's sponsoring it. And they're basically saying, hey, we want regulation. We want the government to tell us there are problems with the kind of news that's on our site and other sites. And we want the government to create new sets of rules that will apply across the entire internet, across all social media, that will determine how we should function, what kind of information we should post, what we should not post. They're asking for it. And in part they're asking for it because, as they know, they're not really going to get it, but in part they're asking for it because it means it's not their responsibility. If it screws up, it's on somebody else. It's on the US government. Furthermore, Facebook, like most of these AI-driven organizations, doesn't really know what its algos, what the algorithms, really do. And that's a problem of AI. You've got deep learning into massive amounts of big data. And you understand that it's getting you outcomes that drive more engagement, but you don't really know exactly what it's doing.

It's not telling you. I mean, you can figure out what patterns it's getting information from, but that's very different from having a human being sit down and explain: okay, here's why we're getting more engagement, here is the strategy and the logic behind it. I mean, when you are programming algorithms to look inside data to drive more engagement, you will get that outcome. And it's not an effort to polarize. It's not an effort not to polarize either. It's just an effort to drive more engagement. And if the companies themselves don't really know what the algos do, then it's very hard for them to say, "well, here's what we would do if we wanted to ensure that civil society was stronger." Because that's not what you're optimizing for. You're optimizing for the business model itself.

Then of course you have the point that it's the United States versus China in terms of the supremacy of different technological capabilities, of which Facebook is one. And if you're weakening Facebook, if you're breaking up Facebook, if you're regulating Facebook in a way that fundamentally subverts their business model, while in China, with many more citizens and far more data, because there's no real privacy and it's all consolidated in super apps, well, the Chinese companies are going to become more successful. They'll win. And if these companies are increasingly meant to be a big component of what national security means and how one competes on the global stage, the worst thing you can do is undermine American companies at the expense of their competitiveness vis-a-vis China.

So I think all of these things together are reasons why it is unlikely that we are going to see structural regulation that will meaningfully undermine the power of organizations like Facebook to have more and more influence over the areas in which they play. And in the case of Facebook, it really is the social interaction, information, and news that the average person on the platform around the world, some 3 billion people at this point, is digesting.

I do think that there are some... As Ian Bremmer here, there are some obvious fixes that would reduce the level of the problem. I mean, fix number one seems pretty clear to me: choosing not to run political ads of any sort would improve the level of information and the quality of discourse around US elections. That's number one. Secondly, systematically reducing the importance of domestic politics, or heck, even all politics, on the site, so that people who are going to Facebook are not being fed primarily that kind of information. That runs against my interest, frankly, but nonetheless, I think it would probably help.

And third, my favorite one, everyone should be verified. Every person that is on the site should actually be a real person, like on LinkedIn, for example. They have to sign in and verify who they are. If they break the terms of agreement, then that means that they lose that, and they can't just come up with another random anonymous account. Now, the problem with all three of these fixes is that they all would actually, in different ways, undermine the business model. You will make less money if you don't take political ads, if you don't run them. You will make less money if something that's very popular and drives a lot of engagement, like politics, is reduced in its prevalence on the site. You will make less money if you get rid of all of the anonymous accounts and the bots and the fake trolls, because they drive engagement, they drive a lot of engagement.

So there are very legitimate reasons that a shareholder-driven company would not take the steps to make those sorts of fixes. And there are also lots of reasons, which I mentioned before, why the US government will not put the kind of regulations in place that would lead to fixes like that. So what does that mean? What's going to happen? What's going to happen is that we are going to need to adapt to an environment where technology companies are increasingly powerful in various digital spaces.

This reminds me of when I first came up with the idea of the "G-Zero World," a world without global leadership, almost 10 years ago now, and immediately the response I got from people was, "okay, Ian, well, that's bad. So how do we stop it from happening?" And I was like, "well, what do you mean, how do we stop it from happening?" I'm telling you, I think it's going to happen, and it's going to happen because it's overdetermined: because the United States increasingly doesn't want to be the global policeman or the architect of global trade, for reasons that are deep and structural. And the Europeans are more divided themselves and less capable of, and less willing to, provide that kind of leadership in the absence of the US. And the Russians are in decline, but angry at the Americans and the Europeans; they want to further undermine those countries and their ability and willingness to provide that leadership. And China's becoming stronger, but they are not aligned with the political and economic models of the US and Europe.

So it's not how you stop the "G-Zero" from coming. It's given that the "G-Zero" is coming, what do you do about it? How do you respond to it? How do you adapt to it? It's like climate change. We started the COP process 27 years ago, and the people involved would not even talk about adaptation, because that was tantamount to surrender. If you said you were going to adapt to climate change, that meant that you were refusing, you were abdicating responsibility, in a world where we had to stop climate change. And yet, the reasons that climate change was not going to be stopped were so incredibly overdetermined, so entrenched among many actors across the entire world, that it should have been obvious that we were heading towards one, two, increasingly three degrees centigrade of warming. And it's a horrible thing for the environment, but we need to adapt to it. That doesn't mean we stop trying to mitigate the consequences themselves, of course, but you can't refuse to adapt. Adaptation has to be a big component of how you respond. And I think when we talk about Facebook, when we talk about technology companies more broadly, adaptation is increasingly a core part of the model, in part because it's happening a lot faster than climate change. And for reasons that I've argued, I really don't think it is sensible to presume that we're going to be able to fix this in the near future.

That's enough for me. Hope everyone's good. Talk to everyone soon.
