Resources for Teachers
This is the third lesson in the Critical Thinking Across the Curriculum series, though it can also be delivered independently. In it, students learn how we can be persuaded by emotional appeals as well as by arguments. After identifying emotionally charged words, they find them in an article and analyze their persuasive effect. Students study a public service announcement to examine how images and story can be emotionally persuasive, then watch a pair of videos to compare how they use emotional persuasion. They then conduct a red teaming exercise to identify the possible risks or drawbacks of using emotional appeals and ways of mitigating them. Finally, they create their own persuasive work using emotionally charged language, images and music.
The Critical Thinking Across the Curriculum workshop provides teachers with a comprehensive approach to fostering critical thinking skills at all grade levels, with particular attention to addressing online misinformation and online hate through teaching critical thinking. The workshop looks at exactly what critical thinking is and why we need to teach it; explores how we can teach students how to think critically, including looking at different ways of approaching it for different parts of the curriculum; and finally, looks at how we can get students – and ourselves – to switch to using better shortcuts to critical thinking.
In this lesson, students learn about algorithms and AI: how they work, how they impact our lives on the internet, and the ethical considerations they raise. The lesson begins with a class discussion on algorithms. Students will discuss how AIs reinforce real-world biases, the difficulties in identifying how AIs make decisions, what information algorithms use to make choices, and how that information impacts the types of decisions AIs make. Finally, students will demonstrate their knowledge by researching and designing an infographic on a field that uses algorithms to make decisions. This lesson aims to build critical thinking skills by examining how AI algorithms work, investigating the biases and impacts of AI decision-making, and reflecting on the implications for students' own lives.

Resilience Through DigitalSmarts: Digital Literacy for Survivors of Technology-Facilitated Violence is designed to provide digital media literacy resources for practitioners in the violence against women (VAW) sector and survivors of family violence – specifically those who have experienced or are experiencing technology-facilitated violence and abuse.

This lesson package is designed to be modular, allowing teachers to choose activities that are most relevant to their students. The lesson includes: an opening “minds on” activity that introduces essential concepts of election-related misinformation, helps students retrieve prior knowledge, and shows the relevance of the topic; several activities which teachers can choose from based on the needs and context of their classes; and a closing activity that introduces students to different strategies for verifying election-related information, including the idea of turning to a best single source (in this case, Elections BC). Students then learn and practice engaging in active citizenship by responding to election-related disinformation.

In this lesson, students learn that video games are unlike other media because they are interactive, allowing players to do things and make choices. They then explore the idea of affordances and defaults by considering the “video game verbs” that different games allow players to perform. They consider the commercial, technical, and genre reasons why some verbs are more often possible than others and then create a simple design for a video game in which players are able to do a wider variety of things.

In this lesson, students consider the ways in which social media may prompt them to compare themselves with others, and the impacts that can have on body image and self-esteem. They analyze how the features, algorithms and culture of the social networks they use may affect them and produce “paper prototypes” of redesigned social media apps that promote more healthful use. Finally, students reflect on how they can change how they use the existing apps to be more like their redesigned versions.

Everywhere we turn, we’re hearing about artificial intelligence (AI). We already know AI is all around us – algorithms are suggesting what to watch and tools like ChatGPT and Midjourney are being used to generate the content we’re seeing. But how many of us actually understand what algorithms even are? And if you’re a parent, guardian or teacher, are you prepared to teach youth how to use AI responsibly?

In this lesson, students learn about media as a source of information, and how this information is presented from a particular point of view.

In this lesson, students learn how their personal information is key to how most of the “free” apps and platforms they use make money. They learn practical strategies and tools for managing their privacy and plan how these can be used to limit which audiences have access to their personal information.