How Our Children are Growing Up with AI

A 2019 study conducted by DataChildFutures found that 46% of participating Italian households had AI-powered speakers, while 40% of toys were connected to the internet. More recent research suggests that by 2023 more than 275 million intelligent voice assistants, such as Amazon Echo or Google Home, will be installed in homes worldwide.

As younger generations grow up interacting with AI-enabled devices, more consideration should be given to the impact of this technology on children, their rights, and wellbeing.


Unlocking the potential of AI

AI-powered learning tools and approaches are often regarded as critical drivers of innovation in the education sector. Frequently recognized for its ability to improve the quality of learning and teaching, AI is being used to monitor students' level of knowledge and learning habits, such as rereading and task prioritization, and ultimately to provide a personalized approach to learning.

Knewton is one example of AI-enabled learning software that identifies knowledge gaps and curates educational content in line with user needs. Algorithms are also behind Microsoft's Presentation Translator, which provides real-time translation in 60 different languages as a presentation is being delivered. This software helps increase access to learning, in particular for students with a hearing impairment. AI is also increasingly used, though not always successfully, to automate grading and feedback.
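To make the idea of "identifying knowledge gaps" concrete, here is a minimal, purely illustrative sketch of mastery-based content selection in Python. It is not Knewton's actual algorithm; the topic names, the mastery threshold, and the smoothed success-rate heuristic are assumptions invented for the example.

```python
# Toy illustration of mastery-based content selection, the general idea behind
# adaptive learning tools. Illustrative only; not any vendor's real algorithm.

from dataclasses import dataclass


@dataclass
class TopicProgress:
    topic: str
    attempts: int
    correct: int

    @property
    def mastery(self) -> float:
        """Estimated mastery as a smoothed success rate (Laplace smoothing)."""
        return (self.correct + 1) / (self.attempts + 2)


def next_topic(progress: list[TopicProgress], threshold: float = 0.7) -> str:
    """Recommend the weakest topic below the mastery threshold,
    falling back to the overall weakest topic once all are above it."""
    gaps = [p for p in progress if p.mastery < threshold]
    weakest = min(gaps or progress, key=lambda p: p.mastery)
    return weakest.topic


if __name__ == "__main__":
    student = [
        TopicProgress("fractions", attempts=10, correct=4),
        TopicProgress("decimals", attempts=8, correct=7),
        TopicProgress("percentages", attempts=2, correct=0),
    ]
    print(next_topic(student))  # -> "percentages" (largest knowledge gap)
```

Real adaptive-learning systems use far richer models of student knowledge, but the underlying loop is similar: estimate what a learner has mastered, then route them to the material where the estimated gap is largest.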

With such broad potential for use in the education system, forecasts by Global Market Insights suggest that the market value of AI in education will reach $20 billion by 2027.

In addition to education, AI is also advancing children's health. In recent years, research on the role of AI in the early detection of autism, of signs of depression in children's speech, and of rare genetic disorders has made headlines. There are also growing examples of AI being deployed to protect children by identifying online predators and practices such as grooming and child exploitation.

Challenges to child welfare

Despite these positive applications, hesitation towards the technology persists in some regions. A 2019 survey conducted by IEEE revealed that only 43% of US and 33% of UK millennial parents would be comfortable leaving their children in the care of an AI-powered nurse during hospitalization. Millennial parents in China, India, and Brazil were more receptive: 88%, 83%, and 63%, respectively, said they would be comfortable with a virtual nurse caring for their child in hospital. The survey reported similar patterns for the use of AI-powered robots in pediatric surgery.

Skepticism about the widespread use of AI also surfaces in discussions of children's privacy and safety. Children's information, including sensitive and biometric data, is captured and processed by intelligent devices such as virtual assistants and smart toys. In the wrong hands, such data could put children's safety at risk.

In 2017, for example, CloudPets teddy bears were withdrawn from shelves amid security fears, after a data breach exposed private information including photos and more than two million recordings of children's voice messages.

Serious concerns have also been raised over the use of children's data, such as juvenile records, in AI systems that predict future criminal behavior and recidivism. Beyond the threat to privacy, civil society representatives and activists have warned of possible discrimination, bias, and unfair treatment.

The path towards child-centred AI

To ensure that AI is child-centred, decision-makers and tech innovators must prioritize children's rights and wellbeing when designing and developing AI systems. UNICEF and OHCHR have been particularly vocal in this regard. As part of its AI for Children project, UNICEF has worked closely with the World Economic Forum to develop policy guidance on artificial intelligence for children featuring a set of recommendations for building AI policies and systems that, among other things, uphold children's rights to privacy and data protection.

As part of its Generation AI initiative and conversations on global standards for children and AI, the World Economic Forum is also spearheading the "Smart Toys Awards" to maximize the learning opportunities offered by smart toys and minimize the risks they pose to children's safety.

Estimates suggest that 65% of children in primary school today will go on to work in jobs that have not yet been created. From a practical standpoint, AI should be incorporated into school curricula to equip future generations with coding skills and adequate AI training. At the same time, children should be taught to think critically about the technology and to form their own judgements about its threats and opportunities. Such efforts should be inclusive of all children and therefore should seek to bridge the digital literacy gap between the Global North and Global South.

More global action will be needed to ensure that children's best interests are reflected in national and international policies and in the design and development of AI technologies. There is no doubt that artificial intelligence will change the way children interact with their surroundings, including their learning, play, and development environments. It is our responsibility to ensure that this change becomes a force for good.
