Algorithmic Awareness: Conversations with Young Canadians about Artificial Intelligence and Privacy

MediaSmarts conducted focus groups with youth ages 13 to 17 to gain insight into how young Canadians understand the relationships between artificial intelligence (AI), algorithms, privacy, and data protection. Participants played a game prototype designed by MediaSmarts’ education team, and a scaffolded learning experience allowed for in-depth discussion after each of the three phases of gameplay. These conversations highlight that while youth understand and appreciate the benefits of recommendation algorithms, they are troubled by algorithmic data collection and data sharing practices. This research is a call for more algorithmic literacy tools and resources that will give youth the knowledge they need to protect themselves and their information in digital spaces.

Key Findings and Recommendations

This research project, funded by the Office of the Privacy Commissioner of Canada, created space for young people to learn more about AI and algorithms and their repercussions for privacy rights. This project also allowed MediaSmarts to design a youth-friendly educational game to help build awareness and a meaningful understanding of data collection and sharing practices. This research is important because insufficient knowledge of AI and algorithms contributes to exclusion from online spaces, tech-facilitated discrimination, exposure to harmful content, and various privacy risks.

Algorithms Among Us

  • Participants highlighted what they see as a concerning tension: that while their connected devices can provide them with the world at their fingertips, algorithms often narrow the type of content they see. Youth are concerned about content saturation (seeing too much of the same thing) that contributes to a false sense of social consensus. Additionally, young people are aware of the impacts of algorithms in their online environments and are often frustrated by a sense of powerlessness to change this algorithmic architecture.

You get sick and tired of the content; it’s [the algorithm] like, ‘Oh, you like this? Take every single bit of information!’ (Conor, 15)

  • Youth demonstrated familiarity with recommendation spirals, describing them as ‘mindless scrolling,’ ‘boredom,’ ‘rabbit holes,’ and ‘focus thieves.’ They are aware that algorithmic pre-selection pushes them towards more passive uses of the internet and are annoyed by specific optimization strategies like clickbait content. Young people don’t like being ‘duped,’ ‘scammed,’ or ‘manipulated’ by online platforms or content creators.
  • Participants were skeptical of an algorithm’s ability to present them with accurate information, especially when searching for information for a school project, since recommendation algorithms tend to surface trending content rather than reliable or trustworthy information.

Under the Algorithm’s Lens

  • Youth are acutely aware of the value of personal data for online businesses, especially as everything “boils down to advertising” (Nathaniel, 16). While most participants had few reservations about the use of personal information by algorithms recommending relevant entertainment and leisure content, they were concerned about ‘creepy’ and ‘invasive’ corporate surveillance strategies.
  • Young people disliked the idea that their online information was being ‘lumped’ into categories of aggregate data to train algorithms and machine learning models without their knowledge and, more importantly, without their consent.
  • Participants strongly disliked data brokering, which they felt was ‘scummy,’ ‘unfair,’ ‘wrong,’ and ‘unethical.’ Youth commented on the potential social and political implications of brokering, especially since users were almost always unaware of these processes. They were clear that selling their data without their knowledge and meaningful consent was a ‘violation of their privacy’ (Erin, 17).

I don’t think it’s fair because we haven’t given them [platforms] permission to take our data, but then they [platforms] are the ones usually benefitting from it. (Rayleigh, 13)

The (Algorithmic) Ties that Bind Us

  • Many youth were not aware of proxy data and how attributes like race, gender, sexual orientation, or health status can be inferred from other data and used to build more complete data profiles for online businesses. After learning about these machine learning processes, participants described them as ‘kinda weird,’ ‘creepy,’ ‘strange,’ ‘disappointing,’ and even ‘evil.’

I didn’t realize how much the [platform] could connect all of that just with machine learning and AI. (Andrew, 15)

  • Many participants were familiar with the concept of bias in relation to digital technology, algorithms, and AI. Youth were aware of how biased data translates to “wrong answers [or] the wrong information” (Sahil, 15) and how this can have repercussions for both online businesses and users. Some even warned platforms to ‘be careful’ about relying on stereotypes or generalizations.

You have to be careful about that because if you’re stereotyping something like gender or a race and showing them certain ads just because you think they will enjoy it because they are white or something, you have to be careful around that. (Charlotte, 16)

  • In describing these troubling algorithmic assumptions, participants used language like ‘upsetting,’ ‘dangerous,’ ‘terrible,’ and ‘unfortunate.’ They questioned the fairness of these practices and were concerned for people who are already placed at risk of experiencing racism, marginalization, or discrimination.

I guess there could be particular places where people might already face a lot of discrimination [and to] have a computer doing that too is kind of brutal. (Hailey, 16)

  • Finally, participants acknowledged that algorithms, or algorithmic systems, should not be left to their own devices, and called on developers and platforms to be more aware of the consequences of relying too heavily on this technology.

The algorithm isn’t human. It doesn’t understand emotions or what is right or wrong – it only understands what it’s being given. (Sahil, 15)

Recommendations

The recommendations from this qualitative research project echo what participants told us about the need for more awareness, transparency, protection, control, and engagement.

AWARENESS

  • Youth expressed a need and desire for more robust algorithmic literacy tools and resources to better understand how algorithms work and the impact of artificial intelligence and machine learning on their lives. We recommend new algorithmic literacy curricula tailored to the unique needs of children and youth to encourage critical thinking skills, raise awareness about their privacy rights, and empower young Canadians to take control of their personal information.

Yeah. I definitely wish that we could understand more what is going on because technology is kind of taking over our lives, but we don’t know about it. (Nicole, 14)

TRANSPARENCY

  • Youth want more information about how their personal data is collected, stored, and brokered, and they called for more transparency from online businesses. We recommend enhancing algorithmic transparency through clear and accessible data collection and privacy policies.

PROTECTION

  • Youth want more protection online, especially when it comes to platforms sharing or selling their data profiles. They want to mitigate the future unintended consequences of AI and data sharing practices. We recommend that online businesses and policymakers consider data erasure policies.

CONTROL

  • Youth asked for more reporting features so they can hold platforms accountable by taking action when they notice harmful or discriminatory content. They also asked for more control over their data and want to decide when to share personal information. We recommend ongoing and more meaningful consent processes and solutions identified by youth in previous MediaSmarts research.

I think it’s very important for people to have the choice to learn why they are seeing certain types of ads. It gives the user more freedom over what they see or more control [over] what they see on their pages. (Ethan, 17)

ENGAGEMENT

  • Participants appreciated the time, space, and opportunity to talk more intentionally about AI, algorithms, and the digital privacy issues that directly impact their day-to-day lives. We recommend future research projects that continue to build our knowledge of algorithmic literacy levels and engage with children and youth in a way that positions them as experts.

This research was made possible by a financial contribution from the Office of the Privacy Commissioner of Canada’s Contributions Program.