Reporting Platforms and Changing the Vibe

Written by Dr. Samantha McAleese

Here at MediaSmarts, we’ve just wrapped up another research project called Reporting Platforms: Young Canadians Evaluate Efforts to Counter Disinformation. This project created space for youth from across Canada to examine and assess reporting processes on popular social media apps (like Instagram, TikTok, Twitter, and YouTube). We wanted to learn more about how young people feel about current efforts to counter misinformation and disinformation and what solutions they might have to address this particular online harm. To do this, we facilitated three focus groups with 36 participants ages 16 to 29, talked to them about how often they see misinformation and disinformation online and what they do about it, and asked them to comment on whether they trust platforms to keep them safe and informed while scrolling and sharing.  

Science evolves! And so do (and should) health policy and the scientific consensus

Studies have shown that communicating the scientific consensus on a topic can be a helpful strategy in the fight against misinformation. For example, a 2015 study found that “emphasizing the medical consensus about (childhood) vaccine safety is likely to be an effective pro-vaccine message.”

Introducing #ForYou: A Game About Algorithms

Few words have had as meteoric a rise as “algorithm.” Once familiar only to mathematicians and computer scientists, today algorithms are the subject of warnings from scholars and activists, protested by students whose future lives and careers are increasingly determined by algorithmic decision-making, personified and catered to by would-be YouTube stars, and seen as the almost magical element vital to the success of newer platforms such as TikTok.

Algorithmic Awareness: Conversations with Young Canadians about Artificial Intelligence and Privacy

This blog post was written by Samantha McAleese and three youth participants, Sahil, Erin, and Kate (pseudonyms used to maintain anonymity).

Reflections on Conducting Qualitative Research During a Pandemic

What comes to mind when you hear the word algorithm? Can you explain how machine learning works? Do you have any privacy or data collection concerns regarding the increased reliance on artificial intelligence? These are just a few of the questions that we asked young people in our recent qualitative research project Algorithmic Awareness: Conversations with Young Canadians about Artificial Intelligence and Privacy. From November 2020 to January 2021, we facilitated eight focus groups with 22 youth ages 13 to 17, where we combined game-based learning with discussion and reflection to gain insight into how young Canadians understand the relationship between artificial intelligence, algorithms, and privacy.

Young Canadians Speak Out: A Qualitative Research Report on Privacy and Consent

Informed Youth Promote Clarity for All

It’s something we’ve all done before: scrolled past a wall of text to click “I Agree” with no idea what we’ve agreed to. Then, when we’re using the platform, messages like “We’ve made some changes to our Terms and Conditions” simply remind us that we probably didn’t read them in the first place. Our world is increasingly shaped by the data being collected about us. For young people in particular, this can lead to serious and unexpected consequences that could affect their entire lives.

How we can empower youth to push back against hate online

In its early days, the internet was often spoken of as a free marketplace of ideas, where everyone’s views and thoughts could be shared and could compete on an equal footing. Today it’s an essential tool for accessing information and services, but its value as a vehicle for civic engagement and debate has in many ways declined.