Podcast: Data Privacy Before and After a Pandemic

Transcript

Listen: Some economists have argued that data is the new oil, a precious commodity driving exponential growth of some of the biggest multinational corporations. This week, our guest says it could also be the new CO2, quietly changing the world in irreparable ways if not properly controlled. Marietje Schaake, former EU Parliament Member and international policy director of Stanford's Cyber Policy Center, argues that more regulation is necessary to curb unchecked use of consumer data. Taped just days before many US cities entered lockdown in the COVID-19 pandemic, the interview also examines early uses of tracking and surveillance in Singapore and China, and what those actions foreshadow for the US as the nation balances freedom and security.

Subscribe to the GZERO World Podcast on Apple Podcasts, Spotify, Stitcher, or your preferred podcast platform, to receive new episodes as soon as they're published.

TRANSCRIPT: Data Privacy Before and After a Pandemic

Marietje Schaake:

There is a price if you do not have checks on power, if you do not have the rule of law guiding throughout society, whether it's online or offline.

Ian Bremmer:

Hello and welcome to the GZERO World Podcast. Here you'll find extended versions of the interviews from my show on public television. I'm Ian Bremmer, and today I focus on data, privacy, and how coronavirus could exacerbate threats to them both. To help me is a woman Politico named one of the savviest legislators on digital issues, former EU parliament member Marietje Schaake. Let's begin.

Announcer:

This episode of the GZERO World Podcast was made possible by Lennar, America's largest and most innovative home builder, and the number one destination for foreign residential real estate investment in the US. Learn more at www.lennargzero.com. That's lennargzero.com.

Ian Bremmer:

Marietje Schaake is the international policy director of Stanford University's Cyber Policy Center. Welcome.

Marietje Schaake:

Thank you.

Ian Bremmer:

... to GZERO World. We're talking today about data and the important fight that's happening both inside societies and internationally. And to get into that, explain why it's become so fashionable to say data is the new oil. Explain what data means in terms of power and wealth in society today.

Marietje Schaake:

Well, data is a raw ingredient that you can do so many things with. You can deploy it to common causes, to the public interest, to the public good, to make, for example, government more accountable, or to create more transparency, more insight into processes that impact all people. But what we see more and more of is, of course, the hoarding and the harvesting and the collecting of data by big companies, which then use it for all kinds of purposes, which benefits a handful of companies but doesn't contribute to the common good. So it's a resource that you can really use to create all kinds of outcomes.

Ian Bremmer:

I think about globalization and the fact that that's created so much wealth, but has also created a lot of inequality. Is data doing the same thing?

Marietje Schaake:

Yes. Some people say data is the new oil. I've also heard data is the new CO2. And I think that's an interesting way to look at it. It's also causing a lot of damage.

Ian Bremmer:

So you mentioned that corporations are doing this hoarding. Are governments doing the hoarding too?

Marietje Schaake:

It depends. Intelligence services are obviously very interested in knowing more and more details about what people do and the ability to gain access to critical data for them at any moment that they wish. I think there's a huge variety in the way in which governments are using data, collecting data and deploying it for better governance as well, which is a way for better service delivery. But it much depends on the capabilities of governments, the level of innovation in countries, the vision that they have for their own service delivery. So I think you cannot really make a blanket statement about how governments use data. It also very much depends on their governance model, their philosophy. Are they a democracy where they believe that government serves the people and that government is accountable to the people? Or are we looking at authoritarian regimes that are using data for control, for top-down repression for their own interests?

Ian Bremmer:

So here in the United States, how would you grade the US government in terms of what they are and what they're not doing, both in terms of their orientation towards data, but also towards the corporations that have so much of it?

Marietje Schaake:

I would say generally, the belief in the United States that the private sector can solve most problems is very visible also in the way that government works. So governments rely on companies to provide elementary services, like the software that governments run on, and the services that they deploy are often also privatized. So I would say that there's little ability in the bureaucracy of the US government to actually work with data themselves. And the general trend of more dependence by governments on companies is tangible, from the development of critical infrastructure to the protection of that critical infrastructure.

Marietje Schaake:

Look at the whole question of elections. It's an election year. People are very focused on the resilience and the security of election systems. It doesn't just depend on governments; it also depends on the private companies that have been engaged. And the question, I think, is where is the responsibility and accountability when governments are providing services while they rely on private companies? And are there weaknesses in that system that are unacceptable and have to be addressed? And what are the systemic changes for the role of government when so much is outsourced and so much reliance is placed on private companies that are using citizens' data in the public interest, but with private profit earmarked?

Ian Bremmer:

So: deeply powerful companies in the United States, and a US government that is relying on those companies, more than it should be in your view, for the provision of critical services for the average citizen.

Marietje Schaake:

I think this is a global trend, but in the United States, there has been more of a tradition of trusting the markets to solve problems and much less of a focus, much less trust, in government to provide the best outcomes. And I think with technology and with the space that's been given in this country to big tech companies, that balance has tipped even more in the direction of the big tech companies. And I think it leads to all kinds of questions of accountability, responsibility, and also the knowledge that governments even have at this point about their critical infrastructure, about the services that they're providing. And also a question of the profits going to the private sector, but the cost, if there's a problem, if there is a data breach, if there is a service that needs to be deployed in the common interest, is this possible? Is there enough knowledge? Is there enough money, or is it only the price that's being paid by the public when there's a problem?

Ian Bremmer:

If we think that the US government has been providing an awful lot of trust to the private sector to get things done, and these private sector companies are very robust, now the largest in the US economy in many cases, the largest in the world, how have they done as caretakers of consumer data?

Marietje Schaake:

Depends, I think, who you're looking at.

Ian Bremmer:

I'm talking about the big ones. I'm talking about Google, Amazon, Facebook, Apple.

Marietje Schaake:

Yeah. So Google, Amazon, Facebook, they don't only collect data, they then deploy it for their own services. And one of the big challenges is that we, the public, and, since I work at a university now, researchers, but also regulators and parliamentarians, have very little detailed insight into what they actually do with the data. Or with data that they may purchase from data brokers, another player in the market that we don't talk about so much, but that I think we should shed much more light on.

Ian Bremmer:

The Acxioms, for example. These big, fairly unknown brands.

Marietje Schaake:

Exactly. Unknown brands.

Ian Bremmer:

Who exist to collect and sell data.

Marietje Schaake:

Exactly. And so when you look at-

Ian Bremmer:

Are they more concerning for you than the Facebooks and the Googles?

Marietje Schaake:

I think it's an ecosystem that we have to look at very critically. And if we look at the public interest, why would we expect that advertising companies that are in the business of selling ads and making money are going to serve the public interest? So I think we have to be very clear about what these companies really are. They may have narratives about an online public square, or connecting the world, or empowering people's voices.

Marietje Schaake:

But when you strip away all the PR messages and look at what the business model is, it is gathering data and selling it to advertisers. And because the systems are becoming more and more sophisticated, and because the power of these companies to not only sell ads but shape the entire information ecosystem is so significant, we have to look at their ability to move markets, or to move voters in massive numbers, to determine outcomes. Now, with public health concerns and questions around the coronavirus, what is their responsibility, and what is the impact when disinformation about the virus, scare stories, and conspiracies can rise to the top of search results or people's feeds on their social media platforms? What impact does it have on society? And I think these big companies are only beginning to come to terms with their role in relation to democracy, to public health, to basically providing the architecture of our information ecosystem.

Ian Bremmer:

Because those are incidental to the business model you just described, which is about collecting and selling data.

Marietje Schaake:

Yeah. Those are the outcomes that we've seen happening. We've seen outbreaks of measles that are medically inexplicable, but that may be explained by disinformation about the risks of vaccines going viral on these platforms. Whether it's Amazon, when books are recommended, or whether it's Google, when people search for cures for their children's diseases, or whether it's Facebook, where some of these theories can be sold as ads, not just to convince people, but maybe to sell things. If you want to sell tea and you attribute all kinds of magical powers to this tea, that it would cure diseases, make you much healthier, perhaps it just gains revenue. It can be as superficial as that. But I'm talking about the relation between the business models and the consequences for the public. And that is something that we need to be critical of.

Ian Bremmer:

So I think there are two separate issues here to disentangle. One is the question of what companies should be allowed to do with your data and what sort of regulations are required there. And the second is to what extent these companies are responsible for the things that happen to be on their platforms, placed by individuals and the like. Let's start with the first, because then we're getting at the entire architecture, including the brokers in between. If that's where they're making all of their money, and if you take it away, then those companies won't exist anymore, what is the appropriate approach? What kind of restrictions, if any, should be placed on a company's ability to sell data, and what kind of rights should individuals have over their data in relation to that?

Marietje Schaake:

Yeah. So I think that there should be much more empowerment of the individual and protection of sensitive data, when we look at the whole balance between individuals, the public interest, and companies. So in Europe, the right to privacy is a fundamental right. I think that's really important in understanding where European regulation comes from and how it differs from US regulations. In the US, privacy is more of a consumer issue. And so people are given more choice, let's say, here in the US, in terms of how they want to treat their own data, what they want to sign off-

Ian Bremmer:

So if they check a box that they don't actually read, then that-

Marietje Schaake:

Well, that's a global problem now, because of the power imbalance between an individual internet user who is presented with all these terms of service, which have been pored over by lawyers for hundreds of thousands of hours to make sure that the company has maximum legal protection, versus the individual. I think we should ask ourselves, is this a fair proposition? Do people really know what they're signing off on? Especially children, when they're presented with ticking a box, signing up for services. I think people have become quite aware of this with kids on TikTok.

Ian Bremmer:

Adults aren't doing it, so kids obviously aren't doing it. Yeah.

Marietje Schaake:

No. And some of these terms of service, if you're supposed to read them, it will take you hours and hours and hours. Most of us are not legally trained, and most of us don't have time for it. It's been made incredibly convenient to just tick a box, with very far-reaching consequences, also into the future. So the terms of service could be adjusted, they can be changed, data can be used in different ways than what was presented initially. So I think there has to be an empowerment of individuals, a protection of the rights of individuals, vis-à-vis these big companies. That balance of power has to go back to the individual, and to their role as citizens, not just as consumers. I think too much of this has become a matter of, well, if people want to give their data away, then that's their choice, and the consequences, well, tough luck. But if you look not only at the individual impact but the societal impact, I think there's a real urgency to restore the balance between public and private interests.

Ian Bremmer:

How would that change the actual experience that you and I might have if we were on Facebook? How would it be different if we were to put those rules into place?

Marietje Schaake:

We should look at restrictions on practices like micro-targeting. That's the use of certain data that can be gathered from your searches online, from the links that you click, from the friends that you have, from the places that you've been. And the assembling of all that information is very valuable for advertisers. And I think the question is, do we want to see advertising against sensitive categories like age or race or gender? Think about political ads, where we may want to ask this. Or, is it acceptable that if somebody has been searching for certain mental health challenges that they have faced, those search results can then be used to advertise medication or other kinds of services, like talk groups or suicide hotlines?

Ian Bremmer:

Or sell them to insurers?

Marietje Schaake:

All of that.

Ian Bremmer:

Who could choose to charge more.

Marietje Schaake:

All of that. So what kinds of categories of information do we want these ad platforms to be able to gather and sell to advertisers? Do we want to protect people in certain ways? Do we want to make sure that laws that apply offline, like non-discrimination, for example, are actually enforced online? I've seen research by academics that shows how easy it is to exclude certain ethnic groups or minority groups from advertisements, how you can target in a way that narrows it down to people's homes and blocks. So imagine how that can work out if you're trying to sway elections, or if you're trying to sell an apartment and you wish to sell it only to a certain category of people. Offline, if we walk around the corner here in San Francisco and try to say, "This apartment is for rent, but not for purple people," you would never get away with that. That's not allowed. And so I think we have to start applying the non-discrimination, fair competition, and human rights laws that we have offline also online.

Ian Bremmer:

Now, it certainly feels like the advances that are being made every day by these tech companies, the extraordinary explosion of data, the movement of chips into everything, the internet of things, is going a hell of a lot faster than any of this discussion.

Marietje Schaake:

It is.

Ian Bremmer:

So how much does that fill you with a sense, no matter what you say, of sort of inevitability, that this challenge is actually going to get a lot worse, that when it comes to people's data and the surveillance state, whether from a government or from corporations, the genie is kind of out of the bottle?

Marietje Schaake:

I agree that the genie is out of the bottle and that we have a lot of catching up to do, but I think it is so fundamentally important that I still believe it is worth it. And one solution is to look at those fundamental principles that should be protected, like, let's say, non-discrimination, and then to empower regulators to assess whether those principles are at stake. So in other words-

Ian Bremmer:

The question is not whether you think it's worth it, I know that. My question is, if you're being honest with yourself, do you think in five years, 10 years that we'll actually have more wood to chop, we're going to be farther behind on this issue than we are today?

Marietje Schaake:

Well, a few big steps can make a difference. So I think antitrust is a big topic to look at, and if there were more competition, then things could change quickly. We've seen that happen in the past as well. I think we should be mindful of what it means for small and medium-sized enterprises and how we continue to have a level playing field, opportunities for smaller companies and not just a few big actors that, with their armies of lawyers, will weather any storm. The GDPR, the General Data Protection Regulation that came into force on the EU side, was really sold politically as a good tool to push back against the big American tech corporations. But in fact, it's only made their position more robust. It's actually made them stronger, and it's made it harder for smaller companies to comply.

Marietje Schaake:

So we shouldn't assume that every regulation hurts big tech, even though that's a narrative that I often hear. I think we have to be very targeted about what principles we want to preserve. So do we think that there should be more independent oversight of all the different things that these companies are doing? I would say yes, and I think it's a very common element of a society. We do not trust car makers to say, "Guess what? Our cars are safe. They comply with CO2 rules. Your child is safe in our car. Trust us, we've made them." Nobody wants that. People want to see independent tests, they want to see labels, they want to have verification that there's actually compliance with environmental standards.

Ian Bremmer:

And there's literally none of that.

Marietje Schaake:

There's none of that. And so I would say we should begin to treat these companies the way we treat other industries, which is that they have to comply with the laws that exist, that there has to be independent oversight, and that if they violate the rules, there have to be proportionate consequences. Some of the fines that exist are actually so small compared to the extraordinary amounts of revenue and profit that these companies make that it doesn't even hurt them, even if it's a high fine compared to other fines that have been imposed.

Marietje Schaake:

So you have to really think about how the power of these companies relates to the public interest and the rule of law, and then make sure that those are better aligned, and it will be a catching up that will have to happen. But I see consumers asking more critical questions, I see talent asking more from the companies where they work in terms of values first, and in terms of not being a part of eroding democracy or supporting practices that are violating people's human rights.

Ian Bremmer:

Now, to get to the other piece of this: what about the responsibility of these companies for the information that is on their platforms? We've heard Mark Zuckerberg say things like, "Well, you don't go after phone companies for conversations that are held on their lines," making the argument that these are not the same thing as a newspaper, for example. Where do you come down on that? Because clearly, we're talking about platforms that have literally billions of people on them. Massive amounts of posting. AI has not developed to the extent that you would trust it in an editing or oversight function for a lot of them. What are the sensible ways to think about how to deal with the fake news problem, how to deal with the filter bubble forcing people into only seeing things that they already agree with, the measles issue that you already brought up, even the panic that can stem from coronavirus on the basis of things that people read and see? How much responsibility can possibly accrue to these massive companies for all of that information?

Marietje Schaake:

Well, so they are actually in a sort of double role here. On the one hand, this exemption from liability for content is sort of the lifeline for these big platform companies, because right now they are not responsible for what is on their platforms. On the other hand, under public pressure, there is a lot that they're already doing. So with the coronavirus, there have been a number of interventions. With terrorist content, there are a number of interventions, even if it's not, strictly speaking, illegal content that we're talking about, but what they call harmful or borderline content-

Ian Bremmer:

Because people will vote with their feet because consumers will be angry with them if they don't.

Marietje Schaake:

And politicians threaten. So politicians basically push the companies to do more. The problem that I see there is that, basically, the responsibility to regulate speech is being pushed onto these already very powerful platforms, which are, on the one hand, considered to be part of the problem because of the notion of virality. A message spreading within minutes to hundreds of thousands of people, sometimes millions of people, in a very short period of time is unique to these platforms. And so the idea that they have a responsibility in preventing the worst kind of content from spreading, I think, makes a lot of sense. But to say, on the one hand, you have exemptions from liability, and on the other hand, you are now responsible for solving the problems on your platform? It, again, avoids this notion of independent oversight, independent research, which I think is what we need a lot more of.

Ian Bremmer:

So we've talked a lot about the private sector and the role of governments in regulating. Before we close, I want to talk a little bit more about the public sector and internationally as well.

Marietje Schaake:

Yes.

Ian Bremmer:

So we've seen, for example, with coronavirus, an extraordinary response from the Singaporean government, which has probably done a better job of containing coronavirus in the early days than any other country out there. In part through extraordinary surveillance, through being able to track any given individual who has tested positive, so you can know where they were and who they are. Which, on the one hand, provides market stabilization and confidence among people who live there and travel there. And on the other hand, is really scary in terms of the lack of privacy of anybody who lives in that society and what that can mean. Obviously, again, these capabilities are just going to get stronger and stronger. How do you think about that?

Marietje Schaake:

So there's always a trade-off, you could say. If you want to have maximum security, there's often minimum freedom.

Ian Bremmer:

We just need to implant a chip, and you'll be fine. Yeah, exactly.

Marietje Schaake:

Well, you can also arrest everybody so nobody can run a red light, right? That's another way to look at it. But that's just not the kind of society, let's focus on liberal democracy for a moment, that I think it should look like. And I think we don't know all of the effects of a surveillance state. In China, where the coronavirus originated, I think there is a real case to be looked at: whether the censorship, the systematic lack of freedom of expression, and the concern that people have about speaking their minds may not actually have led to the spreading of the virus because-

Ian Bremmer:

In the early days.

Marietje Schaake:

Yeah. The concerned doctors were not heard, people did not feel free to speak out, and information about the virus was being censored, also online. To this day, discussions about the virus and how the authorities handled it continue to be censored and tracked. So I don't think the last word has been said about the benefits of a surveillance state when it comes to a virus and a pandemic. But in general, I think the question should really be: what kind of society do we want to live in?

Marietje Schaake:

And even in liberal democracies, where we tend to say freedom is the key, and security measures should not come at the expense of those freedoms, they should actually respect those freedoms and be bound by law, we have allowed a situation where a number of these companies and services are escaping the scrutiny and the eye of the law. And I think that that is a big problem.

Ian Bremmer:

There's absolutely a price. But clearly, the consolidation of data and the explosion of data have not just empowered private sector companies in the West, but also the governments in authoritarian countries that have the ability to use that data.

Marietje Schaake:

Oh, absolutely.

Ian Bremmer:

Again, in that regard, we can have a conversation about the Americans and the Europeans more effectively regulating, constraining corporations. But it's hard to have a conversation about what you do with the Chinese system, where the government is fully aligned to ensure that they use that data to nudge behavior into being more patriotic, both on the part of citizens as well as corporations that are acting there.

Marietje Schaake:

Yes. So the contrast is that in China, their governance model has been leading and the technology is the instrument. In democracies, there's been a reluctance to apply democratic governance to technology, and in some ways, the technology is leading governance. In any case, we don't have, let's say, a rules-based model for the digital world. Democracies are under pressure, human rights are under pressure, civil society is under pressure, and technology has not been a magical solution to strengthen democracy. So it has really shown us that we need to apply governance and the rule of law proactively. And if we want to have a chance vis-à-vis these Chinese top-down controlled models, we have to show how it works in a democratic way and make sure that our societies are resilient, whether it's in the online domains where people live or offline.

Ian Bremmer:

Marietje Schaake, very nice to see you.

Marietje Schaake:

Thank you.

Ian Bremmer:

That's it for today's edition of the GZERO World Podcast. Like what you've heard? I hope so. Come check us out at gzeromedia.com and sign up for our newsletter, Signal.

Announcer:

This episode of the GZERO World Podcast was made possible by Lennar, America's largest and most innovative home builder, and the number one destination for foreign residential real estate investment in the US. Learn more at www.lennargzero.com. That's lennargzero.com.

