The growth of the Internet of Things (IoT) isn’t limited to self-driving cars and home security systems. AI toys can already be found in over one-third of households with children, and demand for IoT products is expected to double by 2021. Children’s toy developers have integrated AI into products to interact with children on deeper and more engaging levels. The toys range from Star Wars BB-8 droids and Barbie dolls to the new Amazon Echo Dot for kids, a virtual home assistant aimed at a younger audience. Incorporating AI into a toy can increase its educational potential by allowing for more personalized content. However, the product’s AI must collect information about its user in order to talk, sing, dance, or play on a personal level with a child. How companies collect, store, and share this data matters from both an ethical and a regulatory standpoint.
In a humorous incident, a six-year-old used her family’s Echo Dot to ask for a dollhouse and cookies. To the surprise of her parents, a large dollhouse worth $160 and four pounds of sugar cookies arrived at her Texas home several days later. While this incident was relatively amusing, it reminded consumers how much information smart devices collect from everyone in a home. A child’s personal information that a toy collects to optimize the user’s experience could potentially be sold to third parties or even hacked. Not only is it important from a legal standpoint to carefully decide and disclose what information is collected about a child, but it can also affect the relationship between the purchaser and the company.
Regulations for AI Toys
Enforcing rigorous data security and ensuring a product does not collect more data than a parent would be comfortable with are a few ways a company can build trust with consumers. A study by the FTC shows that children often cannot distinguish when they are or aren’t being recorded, so it becomes the responsibility of the manufacturer and the parents who purchase such toys to safeguard a child’s privacy. Companies like Mattel and VTech have suffered backlash over data privacy issues and hacks involving IoT toys. This year, the FTC fined VTech for failing to properly secure children’s data and for violating privacy laws. Mattel’s “Hello Barbie” did poorly on the market and received negative reviews because parents found the product too intrusive.
In 2013, the FTC updated COPPA (the Children’s Online Privacy Protection Act) rule to address the internet’s growth and its changing issues. COPPA applies to children under the age of 13 for any product or service that connects to the internet. Under this rule, the following are true:
- Organizations need to outline their policies about how the data they collect will be used.
- Parents must be able to “prevent further use or online collection of a child’s personal information.”
- Personal information includes visual and audio recordings, name and address, contact information, and “persistent identifiers” used to track a child’s identity online. This type of data may be kept only as long as the toy needs it to function, and then it must be deleted as a preventive measure.
The Current Landscape of AI Toys
Mattel decided not to release a virtual assistant for children called “Aristotle” due to concerns about privacy and child development; 15,000 people signed a petition calling for the company to stop its development. Pediatrician Jennifer Radesky raised concerns about products like Aristotle becoming a child’s primary caregiver in place of another person. Aristotle was designed to serve as a virtual assistant whose functions would change throughout a child’s development, from soothing a crying baby to helping a middle school student complete their homework. Sen. Edward J. Markey and Rep. Joe Barton had both asked the company for more details about how information from the device would be stored.
Another concern parents may have about bringing an IoT toy into their home is what, specifically, the toy could be saying to their child. Creators of Amazon’s Echo Dot for kids plan to incorporate branded partnerships into the program’s functions. Some worry that these partnerships with Disney and Nickelodeon could blur the line between content and advertising for a population that may not be able to understand the difference.
In addition to data collection, privacy, and advertising messages, consumers may also worry about how increased interactions with technology could shape a child’s cognitive and emotional development. The creators of Cozmo, a robot toy for children, said their goal was to have children “create a deep emotional connection” with the product. Companies should be careful that a toy’s program does not inadvertently manipulate a child emotionally or stunt his or her real empathetic growth. The good news is that a study by the FTC shows that children still would rather play with a real friend than with an AI toy.
Even after considering concerns about child development, privacy, and data collection, companies can still make engaging and fun AI toys for children. CogniToys’ “Dino” serves as an example of an AI product for children done right. By incorporating IBM’s Watson AI, the toy becomes smart enough to have educational value. CogniToys says that the data the toy collects is used only internally, and the product comes with an online portal where parents can check account settings, learn how the toy is being played with, and even reset the toy and delete all the information it has collected. While there are potential issues to address when developing an AI toy for children, it’s certainly possible to create a product that children will enjoy and parents will want to purchase.
To learn more about Clarkston’s perspective on emerging use of AI in consumer products companies and how it will impact your operations, subscribe to our insights below or contact us.
Coauthor and contributions by Sabrina Zirkle