To prevent cheating, educators should focus on teaching students how to use AI responsibly rather than banning it. This includes showing them how to use AI for brainstorming, research support, or editing—not just copying answers. Teachers can design assignments that require personal reflection, class discussions, or step-by-step processes so it’s clear when work is original. Tools like AI detection software can help monitor misuse, but they can be inaccurate, and building a culture of integrity and learning is even more powerful. Furthermore, why would we not want students to use these tools to help them? The key is not to seek opportunities to "catch" students using AI. In my opinion, it would be smart for students to use AI as an assistant to enhance their work, and our job is to teach them how to use it responsibly and ethically.
Khan Academy is one tool I have used with students. It is wonderful! It adjusts to each student's skill level, analyzing their answers to present appropriately leveled skills. It also reteaches from their incorrect responses, much like a tutor. Then it gives them another problem, similar to the one they were misunderstanding before, so students have another opportunity to apply their new understanding. I have recently begun exploring ChatGPT and Canva, and I am simply amazed at the variety and specificity that these AI supports provide for teachers and students.


