Addressing AI in the Classroom: Tips for Teachers

Teachers can play a critical role in educating their students about AI, even if they aren’t experts on it. By addressing AI in ways that are designed to build students’ skills and teach them about the pitfalls of relying too much on it, teachers can help build the next generation of citizens who are empowered with the skills necessary to succeed in a world infused with AI technology. 

Teachers don’t have to feel nervous about AI in the classroom – but we do need to know more about it: just a third of teachers say they’ve been given training on how students can use AI responsibly.[1]

We also know that students want to learn more about AI and chatbots and are looking for clearer guidelines on how they can use them more effectively and ethically.

While it is important for teachers and students to understand how AI works, and the issues that come with using it, the utility of AI depends on its user as much as the AI itself. As much as we have to acknowledge its limitations and implications, AI can be a useful tool if it is used critically and responsibly.

While AI tools can be helpful for learning, they can also decrease learning effectiveness depending on how they’re used. If students just copy and paste together AI-provided results, not only will they not learn anything but it will hurt their independence and autonomy.

Self-directed learning involves setting learning goals and plans independently, managing and evaluating one’s own learning processes, and taking responsibility for the learning outcomes. AI can help students learn as well as to set learning goals and plans, and evaluate progress.

Using generative AI, students can engage in one-on-one learning sessions. While we have to teach students the limitations of AI, it can play a valuable rule by allowing students to do things like:

  • Brainstorming topics for an assignment[2]

  • Suggesting improvements to a piece of student writing[3]

  • Refine research questions and narrow or expand topics[4]

  • Explaining a concept to them[5]

  • Making changes to a text (e.g. “rewrite to be more concise,” “rewrite so a small child could understand it”) and then analyzing the differences[6]

  • Writing a resume or business letter[7]

  • Play “devil’s advocate” or help them spot errors in their reasoning and question their assumptions[8]

  • Role-play situations like job interviews

In courses where the mechanics of writing are not being graded, AI can also be used to level the playing field for English as a Second Language students[9] or students with some disabilities.[10] Image generators can also be used to make custom artwork for student projects in courses where they are not being assessed on their artistic skills.

Here are some specific ideas for how you can use AI in the classroom:

Improving Questioning Skills: 

In the AI era, having the ability to think critically and creatively is more important than possessing a large amount of knowledge, making the skill of asking questions increasingly valuable. While the internet is used for searching, using generative AI requires asking questions, which can help train the skill of questioning. 

The ability to ask questions depends on how deeply and diversely one can inquire, so it is essential for students to increase both the depth and breadth of questions they can pose. One method of using generative AI for questioning is to start with a single question and then follow up with additional questions step-by-step in a chain-like manner.

An example of asking progressively deeper questions is the following:

1) 'What is the lifespan of a cat?'
2) 'Is there a difference in lifespan between house cats and wild cats?'
3) 'What are the significant reasons for the lifespan differences?'
4) 'Among these reasons, which is the most significant?'

An example of asking progressively broader is the following:

1) 'What is the lifespan of a cat?'
2) 'What is the lifespan of a dog?'
3) 'In the feline and canine animal families combined, which species has the longest lifespan?'
4) 'Which mammal has the longest lifespan?'

By asking generative AI these types of questions, students can develop their ability to think both deeply and broadly.

You can make writing prompts for AI a classroom exercise. Like search engine queries, most prompts don’t include enough information to take full advantage of the tool. Create a class account for a chatbot (you can use an anonymous email service like Protonmail or SharkLasers to register) and have students write a prompt that includes:

  • The chatbot’s role (this could be a specific one, such as taking on the personality or point of view of a historical figure or a fictional character, or a broad role such as a mentor or research assistant);

  • The purpose of the prompt (e.g. to give feedback, to share experiences, to play devil’s advocate, et cetera);

  • The desired qualities of the response (e.g. helpful, challenging, encouraging, et cetera); 

  • Any constraints or limitations on the response (things the chatbot should not do, e.g. “Do not refer to anything that happened after 1800,” “Do not give me the answer directly.”); and

  • The intended audience (the grade level, the subject, language level, et cetera.)[11]

These can be used at the beginning of a conversation with a chatbot as well as a single response.

Enhancing Debate Skills: Students can improve their debating skills by using generative AI to take on pro and con positions on specific topics. Students can conduct mock debates by presenting arguments from each perspective and offering counterarguments. This practice can help enhance their debating abilities.

Language Learning: Traditionally, language learning has relied on methods such as paper books, video lectures, and phone-based language learning. However, recently, useful language learning apps equipped with AI technology have emerged, enabling personalized learning. Generative AI is capable of conversation, making it useful for learning any language through dialogue. Additionally, the increasing number of programs that let users interact using their voices greatly aids in practicing speaking skills.

Using AI as a Creative Tool:

  • Writing: Students can use generative AI to write various types of content such as novels, poems, and blog posts. AI can help with everything from brainstorming topics and themes to improving structure and grammar, as well as correcting typos. Using generative AI as an assistant – not a crutch – can enhance your writing skills.

  • Image, Graphic Creation, and Design: Students can use simple text descriptions with generative AI to create images and modify them into desired styles and formats.

  • Music Composition: Generative AI has made composing, arranging, and performing music significantly more accessible.

  • Video Production and Editing: AI can be used for various aspects of video production, including audio, subtitles, editing, and graphics. Technological advancements have reached a point where entering text can result in a complete video making it easier to express yourself!

Generative AI can be particularly useful when students are making media in a course that is not about making media, such as if they want to make a poster in History class or put a song to music in English class. It’s important to be clear with students what parts of an exercise or assignment can be done with AI and which cannot. Like a calculator, what students are allowed to do with AI will depend on their grade levels: in the same way, you can let them use it to automate skills they have already mastered.[12] 

Thinking Critically About AI

Students should also be encouraged to think critically about AI by exploring issues like stereotyping, misinformation, privacy, parasociality and ethics.

“We need to find some sort of balance between ‘AI is going to rule the world’ and ‘AI is going to end the world.’ But that will be impossible to find without using AI in the classroom and talking about it at school.” Isabella Iturrate, 12th-grader[13]

Identifying Misinformation:

Generative AI can produce both misinformation and intentional disinformation. Unfortunately, just warning people about the risk of AI disinformation doesn’t help them spot it. In fact, it makes them more likely to think that real content is fake.[14]

When students are considering whether something might be a deepfake, encourage them not to rely on evidence inside an image or video itself, like  an image including humans with extra fingers: image generators’ ability to correct these is improving quickly. We also tend to be more skeptical of things we don’t want to believe are true, and almost any any photo or video will have some elements that let us dismiss it as a deepfake if we want to.

“Telling people to be skeptical is only the starting point. The harder question is how to decide what to trust. If we can’t rely on the content itself, the trustworthiness of the source becomes much more important.”[15]

Instead, teach students the information sorting techniques taught in MediaSmarts’ Break the Fake program: 

For example, using a reverse image search like Tineye can tell you quickly where a photo first appeared. That may not tell you if it’s a deepfake, but if it didn’t come from a reliable source that’s a reason to think it’s not real.

Looking for what other people say about a source (for instance, whether it has a Wikipedia entry, or if it’s linked to by reliable sources) tells you more than what a source says about itself.

Consulting fact-checkers and sources that you already know to be reliable (like legitimate news outlets) can also help sort fact from AI-fiction.

Help students understand how AI works and how that informs our understanding of how they “know” and “decide” things.[16]

Have students contrast generative AI with other sources of online and offline information. For example, what are the similarities and differences between a chatbot and a search engine? How is it different from Wikipedia? In each case, consider the processes by which information is constructed and whether we can consider it a “source.”

Have students ask a chatbot a question on a topic you’ve covered in class, then have them rate the response.[17] How accurate and representative is it? Does it give equally good answers when you change the prompt? (Many chatbots will correct common misconceptions if asked about them directly, but repeat them if the prompt is not specifically about them. For instance, a chatbot asked whether people in the Middle Ages thought the world was flat correctly identified that as a myth, but listed sailing off the edge of the Earth as an example of something Columbus’s sailors might have worried about.)

Personal ethics:

Students need to understand that intimate deepfakes aren’t “victimless” and do harm to the people portrayed. MediaSmarts’ research found that youth who share sexts without the original sender’s consent justify what they’re doing through moral disengagement mechanisms like denying the harm of what they’re doing. The lesson There’s No Excuse addresses this. The same goes for “prank” images or videos of people doing embarrassing things.

Academic ethics:

Help your students understand what plagiarism is and how it applies to AI.[18] If your school doesn’t already have a policy about using AI, have your class develop one together. Make a chart with examples of acceptable and unacceptable uses so they understand the difference.[19]

Don’t rely on AI detection or assessment tools. If you do use them, make them only one part of the academic honesty process and be aware that they frequently make mistakes.[20]

Have students document and show their process. One option is to have them use an online word processor that tracks changes made to the draft. By checking the version history, you can see if it was written over time or pasted in all at once (as it would be if it had been written by a chatbot.)[21]

Think about how AI can be integrated into process steps, instead of used to make finished work.[22]

If you allow students to use AI in their work, have them include their prompts and link to the results.[23]

Use a broader variety of assessment tools, including in-class writing and media-making.[24] Focus on having students demonstrate understanding and applied skills instead of rote learning.[25]

Teach students to cite and reference their sources. Explain that just like a search engine or an encyclopedia, chatbots should be seen as links to sources rather than sources themselves.[26] Have students double-check the sources and citations in any result they get from a chatbot, to make sure that they really exist and that they are accurately reflected in the text. 


 


[1] Dwyer, M., & Laird E. (2024) Up in the Air: Educators Juggling the Potential of Generative AI with Detection, Discipline and Distrust. Center for Democracy & Technology.

[2] Frank, D., & Johnson J.K. (2024) Working Alongside, Not Against, AI Writing Tools in the Composition Classroom: a Dialectical Retrospective. In Teaching and Generative AI. https://uen.pressbooks.pub/teachingandgenerativeai/front-matter/forweword/ 

[3] Frank, D., & Johnson J.K. (2024) Working Alongside, Not Against, AI Writing Tools in the Composition Classroom: a Dialectical Retrospective. In Teaching and Generative AI. https://uen.pressbooks.pub/teachingandgenerativeai/front-matter/forweword/ 

[4] James, A. B., & Filgo, E. H. (2023). Where does ChatGPT fit into the Framework for Information Literacy? The possibilities and problems of AI in library instruction. College & Research Libraries News, 84(9), 334.

[5] Spector, C. (2023) What do AI chatbots really mean for students and cheating? Stanford Graduate School of Education. https://ed.stanford.edu/news/what-do-ai-chatbots-really-mean-students-and-cheating 

[6] Frank, D., & Johnson J.K. (2024) Working Alongside, Not Against, AI Writing Tools in the Composition Classroom: a Dialectical Retrospective. In Teaching and Generative AI. https://uen.pressbooks.pub/teachingandgenerativeai/front-matter/forweword/

[7] Ferreira, J. (2023) ChatGPT a 'time-saver' for parents, teachers as Canadians share how they're using the AI tool. CTV News. https://www.ctvnews.ca/sci-tech/chatgpt-a-time-saver-for-parents-teachers-as-canadians-share-how-they-re-using-the-ai-tool-1.6419330 

[8] Costello, T. H., Pennycook, G., & Rand, D. G. (2024). Durably reducing conspiracy beliefs through dialogues with AI.

[9] Gupta, A., Atef, Y., Mills, A., & Bali, M. (2024). Assistant, Parrot, or Colonizing Loudspeaker? ChatGPT Metaphors for Developing Critical AI Literacies. arXiv preprint arXiv:2401.08711.

[10] Heidt, A. (2024) ‘Without these tools, I’d be lost’: how generative AI aids in accessibility. Nature. https://www.nature.com/articles/d41586-024-01003-w 

[11] Mollick, E., & Mollick, L. (2023). Assigning AI: Seven approaches for students, with prompts. arXiv preprint arXiv:2306.10052.

[12] Gupta, A., Atef, Y., Mills, A., & Bali, M. (2024). Assistant, Parrot, or Colonizing Loudspeaker? ChatGPT Metaphors for Developing Critical AI Literacies. arXiv preprint arXiv:2401.08711.

[13] Singer, N. (2023) Chatbot Hype or Harm? Teens Push to Broaden A.I. Literacy. The New York Times.

[14] Ternovski, J., Kalla, J., & Aronow, P. (2022). The negative consequences of informing voters about deepfakes: evidence from two survey experiments. Journal of Online Trust and Safety, 1(2).

[15] Kapoor, S., & Narayanan A. (2023) How to Prepare for the Deluge of Generative AI on Social Media. Knight First Amendment Institute. https://knightcolumbia.org/content/how-to-prepare-for-the-deluge-of-generative-ai-on-social-media 

[16] Frank, D., & Johnson J.K. (2024) Working Alongside, Not Against, AI Writing Tools in the Composition Classroom: a Dialectical Retrospective. In Teaching and Generative AI. https://uen.pressbooks.pub/teachingandgenerativeai/front-matter/forweword/ 

[17] Heaven, W.D. (2023) ChatGPT is going to change education not destroy it. Technology Review. https://www.technologyreview.com/2023/04/06/1071059/chatgpt-change-not-destroy-education-openai/ 

[18] Tight, M. (2023). Challenging cheating in higher education: a review of research and practice. Assessment & Evaluation in Higher Education, 1-13.

[20] Rudolph, J., Tan, S., & Tan, S. (2023). ChatGPT: Bullshit spewer or the end of traditional assessments in higher education? Journal of Applied Learning and Teaching, 6(1). https://doi.org/10.37074/jalt.2023.6.1.9

[21] Ford, D. (2024) Tracking the Draft: An Academic Integrity Policy for Cheating with AI. Faculty Focus. https://www.facultyfocus.com/articles/teaching-with-technology-articles/tracking-the-draft-an-academic-integrity-policy-for-cheating-with-ai/ 

[22] Berdanier, C. G. P., & Alley, M. (2023). We still need to teach engineers to write in the era of ChatGPT. Journal of Engineering Education, 112(3), 583–586. https://doi.org/10.1002/jee.20541

[23] Ferreira, J. (2023) ChatGPT a 'time-saver' for parents, teachers as Canadians share how they're using the AI tool. CTV News. https://www.ctvnews.ca/sci-tech/chatgpt-a-time-saver-for-parents-teachers-as-canadians-share-how-they-re-using-the-ai-tool-1.6419330 

[24] Vaidhyanathan, S. (2023) My students are using AI to cheat. Here’s why it’s a teachable moment. The Guardian.

[25] Tight, M. (2023). Challenging cheating in higher education: a review of research and practice. Assessment & Evaluation in Higher Education, 1-13.

[26] Eaton, S. E. (2023). Postplagiarism: transdisciplinary ethics and integrity in the age of artificial intelligence and neurotechnology. International Journal for Educational Integrity, 19(1), 23.

 

Meta

 

 

Disclaimer: Meta provides financial support to MediaSmarts. This tip sheet has been developed in collaboration between Meta and MediaSmarts. MediaSmarts does not endorse any commercial entity, product or service. No endorsement is implied.