One result of the law enforcement crackdown on pro-Trump Capitol rioters following the events of January 6 is that many right-wing extremists have left public social media platforms like Facebook and Twitter for encrypted apps like Telegram and Signal. But renowned tech journalist Kara Swisher isn't all that concerned. "The white supremacist stuff, it's like mold. They thrived in the light, actually." Now that these groups no longer have such public platforms, their recruiting power, Swisher argues, will be greatly diminished. Plus, she points out, they were already on those encrypted apps to begin with. Swisher's conversation with Ian Bremmer was part of the latest episode of GZERO World.
Renowned tech journalist Kara Swisher has no doubt that social media companies bear responsibility for the January 6th pro-Trump riots at the Capitol and will likely be complicit in the civil unrest that may continue well into Biden's presidency. It's no surprise, she argues, that the online rage that platforms like Facebook and Twitter intentionally foment translated into real-life violence. But if Silicon Valley's current role in our national discourse is untenable, how can the US government rein it in? That, it turns out, is a bit more complicated. Swisher joins Ian Bremmer on GZERO World.
Watch as Nicholas Thompson, editor-in-chief of WIRED, explains what's going on in technology news:
What is Parler? Why are people moving off Facebook to new social sites?
Parler is like Twitter, except it was set up very specifically to make it so that the owners of the site, the people who run it, would not censor your speech, or put another way, would not take action to remove hateful or harmful speech. It is a free speech social media platform that is primarily used by people on the political right. Why are people moving off Facebook to new social sites? I don't think that many are. People talk about moving off, but to the extent they are, it's because they feel like the sites are censoring them.
Nicholas Thompson, editor-in-chief of WIRED, discusses technology industry news today:
Do some of Facebook's best features, like the newsfeed algorithm or groups, make removing hate speech from the platform impossible?
No, they do not. But what they do is make it a lot easier for hate speech to spread. A fundamental problem with Facebook is that the incentives built into the newsfeed algorithm and the structure of groups make it harder for the company to remove hate speech.