The Internet of Toys raises new privacy and security concerns for families
If you're a parent buying a talking toy for your kids, you probably wouldn't want a hacker using it to talk to them alone in their bedrooms. Nor would you want hackers using their toys to collect sensitive personal information about them.
Yet that's the risk parents must consider – and may not even be aware of – with the rise of the Internet of Toys.
Dedicated hackers – both ethical and nefarious – have proven they can take advantage of internet-connected toys that don't have adequate cybersecurity measures in place. That has raised new concerns from security and privacy advocates, who say toymakers and tech companies need to do more to ensure that kids are properly protected now that Wi-Fi-enabled toys are common playthings.
While there's a long way to go, awareness of the security risks is a first step. Passcode, along with the Family Online Safety Institute and the Future of Privacy Forum, hosted an event this week to discuss how to better protect kids and increasingly connected homes. You can watch the full video here; in the meantime, here are five things we learned:
1. If your toy is hackable, your home may be, too.
“The power of connected devices is also, in some ways, their greatest weakness,” says Julie Brill, a partner at law firm Hogan Lovells who was until recently a Federal Trade Commissioner.
Devices can use Wi-Fi to “talk” to each other, Ms. Brill points out, but those networks are only as strong as their weakest links. If hackers can get access to a toy, they could leverage it to compromise an entire network of connected devices in a person’s home. To help solve this problem in the future, she says, it’s possible people’s homes could have a type of “command center” in which consumers can see how their devices interconnect – and set their own privacy preferences.
2. Toys travel with kids. So do the privacy risks.
Parents might say they’d never personally choose to buy a certain toy if it were too risky from a security or privacy standpoint. However, notes Emily McReynolds, a program director at the University of Washington’s Tech Policy Lab, children bring toys to other people’s houses. So even the most privacy-conscious parent might find a connected toy on their home Wi-Fi network, or interacting with and recording their child’s conversations, even if they never approved it.
The intimate access toys have to kids’ lives, and their portable nature, raises a whole host of questions about notification and consent, Ms. McReynolds says. “How do we help notify the parents of the second house, or the third house?” she says. “And where do you go for more information?”
3. Some experts want the government to consider minimum security requirements.
Josh Corman, director of the Atlantic Council's Cyber Statecraft Initiative, wants to see some sort of regulatory requirements that companies must implement to make their products more secure. As he puts it: "Some minimum hygiene things."
After all, he says, people aren't going to be experts in this stuff. But they shouldn't have to be.
"I don’t know how a commercial airline works or what questions to ask before I get on one. I just know I can trust it. Because it’s not a voluntary standard for minimum safety flight checks for aviation," he says. "There are some things in culture that are not optional.
“And I think our default posture has been, let’s not interfere in the free market of the software industry. The one thing you’re not liable for on the planet is software. There’s no software liability laws.... With privacy there’s been some strides there and I’m really interested to see if we can piggyback off some of those.”
4. There are some security-savvy connected toymakers taking precautions. Others may not know how.
Donald Coolidge, chief executive officer of Elemental Path, says his company – which manufactures the talking dinosaur Dino – takes security and privacy concerns seriously. Elemental Path encrypts information flowing both to and from the toy, he says, noting “that’s something other companies don’t do.” That said, “there’s always going to be ways to get into something,” Mr. Coolidge says. That’s why his company anonymizes the data, stores it in multiple places, and has opened its doors to ethical hackers to test its product.
Many companies also want to do the right thing when it comes to security, says Dona Fraser, vice president of the ESRB Privacy Certified (EPC) program, which helps companies comply with data privacy laws. However, she notes, “whether they know what the right thing is, is another question.”
5. Privacy policies need to be transparent, especially for parents.
In a physical store, says Ms. Fraser, parents may be more concerned about whether their children should have another toy than about the privacy implications. That’s why privacy policies need to be as clear as possible.
“When you’re dealing with households like mine, where I have a niece and nephew who come out of the womb swiping right and left, to the grandfather who thinks a live stream is a wild river, there’s a huge gap in families where you have kids teaching adults,” Ms. Fraser says. “And they’re not teaching them about privacy; they’re teaching them how to use a device.”