I’ve recently become the chauffeur for my son and his group of friends as they go to a weekly gaming afternoon/hangout at one boy’s house. It’s clear that my role as the driver is to be invisible – they talk and goof around with each other in the car as if I’m not there, and if I do interject in their conversation, there’s a moment when they all freeze, confused as to where this voice from above came from, before ignoring it and carrying on. I’m there to hover on the outside, not to get involved.
This year, it may not just be Santa Claus who sees your kids when they’re sleeping and knows when they’re awake: one of the hottest trends this season is so-called “smart toys”, which use the Internet to hold artificially intelligent conversations with kids while they play. Last year’s Hello Barbie, one of the first toys to use this technology, was found to have a number of major security flaws – including automatically connecting the mobile device to which it was tethered to any Wi-Fi network with “Barbie” in its name. Now two more toys, a doll called My Friend Cayla and the i-Que Intelligent Robot, have been found to collect data in ways that are far more worrying.
In 2015, MediaSmarts and PREVNet conducted a study of Canadian students – funded by TELUS – to find out how to give youth better advice and support when they witness cyberbullying. That research, Young Canadians’ Experiences with Online Bullying, aimed to discover three things: what are the barriers to witness intervention in cyberbullying? What incentives can increase the likelihood of witness intervention? And which interventions are more or less likely to have a positive outcome?
Over the last week our world has been invaded: cute cartoon creatures can now be found lurking in parks, restaurants, museums, and even people’s houses. If you haven’t seen them, it’s because they’re only visible on a smartphone screen, and only if you’re playing the new game “Pokémon Go”.