New AI toys spark privacy concerns for kids
Taylor Owen, professor at the Max Bell School of Public Policy at McGill University and director of its Centre for Media, Technology & Democracy, looks at a new phenomenon in the AI industry: interactive toys powered by AI. Their interactivity, however, comes with a host of privacy concerns, and according to Owen, the concerns don't end there.
So, it's that time of year where I start thinking, admittedly far too late, about my holiday shopping. And because I have a ten-year-old child, this means that I am seeing a lot of ads for new kids’ toys. Kids have had interactive toys for decades. Remember Tickle Me Elmo?
But now these interactive toys are being powered by AI. For example, for $1,500, you can buy your kid a Moxie robot. "My name is Moxie. I am a new robot. What is your name?" Moxie is sort of like a robotic best friend. When your kid talks to it, Moxie records those conversations and then uses technology powered by OpenAI to analyze those interactions and react back.
Embodied, the company that makes Moxie, says that this helps kids regulate their emotions, provides them with companionship, and boosts their self-esteem. All of which sounds great, but toys like this should also give us pause. Let me explain. A toy like this comes with a whole host of privacy concerns. Moxie records video and audio of your child and then analyzes that data to create facial expression and user image data.
Now, they say they don't store the audio and video recordings, but they do keep the metadata about your child's facial expressions and how they're interacting with the toy. Embodied says it's ultimately parents' responsibility to ensure that their child isn't giving out personal data. But I don't know, that seems unlikely for a toy that's designed to be your child's digital best friend.
These types of privacy concerns, of course, aren't new. Home assistants like Amazon Alexa and other smart appliances also record and mine your data. And big tech companies aren't likely to move away from this kind of practice, as data collection is essential to their market power. It's pretty clear we're extending this collection practice into the lives of our children.
While privacy concerns with toys like these are well-established, there's another issue that I think requires some thought. How will toys like these affect childhood development? There's a chance these toys could become a powerful tool in helping kids learn and grow. Embodied claims that 71% of the kids that use Moxie saw improved social skills. But this also represents a pretty radical new frontier in childhood development.
What happens when kids are being socialized with robots instead of with other kids? It's often said that AI is going to transform our society, but this may not be a binary event. Sometimes the effect of AI is going to creep into our lives slowly. Kids' toys, slowly but surely becoming agents, may be one way this happens.
I'm Taylor Owen and thanks for watching.
Europe's challenge to Facebook; Amazon home drones
Watch as Nicholas Thompson, editor-in-chief of WIRED, explains what's going on in technology news:
Would Facebook actually leave Europe? What's the deal?
The deal is that Europe has told Facebook it can no longer transfer data back and forth between the United States and Europe, because it's not secure from US Intelligence agencies. Facebook has said, "If we can't transfer data back and forth, we can't operate in Europe." My instinct, this will get resolved. There's too much at stake for both sides and there are all kinds of possible compromises.
An Amazon home drone. Why would I need that and are you concerned about privacy?
Amazon has just announced a new drone that flies a camera from room to room in your apartment or home, looking for disturbances. Why would you need it? Maybe you're really worried about a burglar, or about a raccoon. Why should you be scared about privacy? Because it will be filming all your stuff and maybe linking it to your Amazon account. My concern about it? Look, it's cool technology, but I'd much rather get a dog.
Technological Revolution & Surveillance in the COVID-19 Era
Are we in the middle of a technological revolution?
Yes? I feel like a technological revolution should feel more empowering and exciting. It should feel like something good as opposed to something catastrophic. But if you define it as a moment when there's a lot of technological change that will last for years or decades, yes. Think about the way that health, education, and working from home are going to change. There are lots of inventions right now because of coronavirus that will stick with us.
With the need for increased surveillance, will microchipping become a thing?
Microchipping is where you put a little microchip inside your body: you can use it to scan yourself in, you can embed data in it, you can use near-field communication. But no, it's not going to become a thing, because you can do all that with your phone. Put the microchip in your phone. Carry the phone in your pocket, or put it in your watch. Putting it in your skin is unnecessary and kind of gross.
Marietje Schaake on Digital Data Rights
Marietje Schaake, former member of EU Parliament and international policy director of the Cyber Policy Center at Stanford University, discusses the regulation and oversight required to ensure that offline rights are protected in cyberspace as well, including the avoidance of microtargeting based on race, gender, or health status. In an interview with Ian Bremmer for GZERO World, she argues that fair competition, non-discrimination, and adherence to human rights laws are uneven and lacking in the online world.
Who is responsible for protecting personal and sensitive data? Who is liable? Do already powerful tech platforms have too much power?
Surveillance vs privacy during the COVID-19 pandemic
In an interview with Ian Bremmer for GZERO World, Marietje Schaake, former member of EU Parliament and international policy director of the Cyber Policy Center at Stanford University, discusses the tradeoff between security and freedom when it comes to data surveillance. In a wide-ranging conversation about data and big tech, taped just days before cities entered lockdown in the United States, Schaake addresses early steps taken in Singapore and China to curb the spread of COVID-19 using tracking tools.
The complete discussion is part of the latest episode of GZERO World which airs on US public television. Check local listings.