[Image: Classroom in session. Generated by DALL-E 3]

Creating AI-Resistant Assignments, Activities, and Assessments

Instructors can take several proactive steps to design assignments, activities, and assessments that are more resistant to academic dishonesty in the age of AI, and of Generative AI tools in particular. Here are some strategies you might want to try:

1.    “Going Medieval”: Reintroducing traditional pen-and-paper assessments for certain assignments or exams can significantly reduce both the temptation and the ability to use AI dishonestly. Handwritten work requires students to articulate their thoughts and demonstrate their understanding without the immediate assistance of digital tools for content generation or editing, placing the emphasis on comprehension and clear communication. While not applicable to all types of assessments, especially those that require digital skills, pen-and-paper tasks can be particularly effective for testing foundational knowledge, mathematical calculations, conceptual diagrams, or in-class essays. This approach also encourages students to develop their ideas independently and can be combined with oral exams or in-person presentations to further support academic integrity.
2.    Emphasize Originality and Critical Thinking: Design assignments that require students to apply concepts to new situations, solve complex problems, or integrate multiple sources of information. These tasks are harder to outsource to AI and emphasize students' unique perspectives and critical thinking skills.
3.    Use Personalization: Tailor assignments and questions to individual students or current events. Personalization makes it difficult for students to find existing AI-generated answers or content that matches their specific assignment criteria.
4.    Incorporate Reflective Components: Ask students to include a reflection on their learning process, the challenges they faced, and how they overcame them. This not only encourages deeper engagement but also makes it more challenging to use AI tools dishonestly.
5.    Implement Process-Based Assessments: Break larger assignments into stages (proposal, outline, draft, final submission) and evaluate each stage. This approach allows instructors to monitor progress and understand students' thought processes, making it harder for students to completely rely on AI.
6.    Adopt Oral Examinations or Presentations: Oral exams or presentations require students to demonstrate their knowledge and thinking in real-time, making it difficult for them to rely on AI-generated content without a deep understanding of the material.
7.    Utilize Open-Book, Open-Note Exams: Design assessments that allow students to use resources but test their ability to apply, analyze, and synthesize information rather than recall facts. This can reduce the temptation to use AI inappropriately.
8.    Incorporate Peer Review: Have students review and provide feedback on each other’s work. This not only exposes students to different perspectives but also makes it more difficult to pass off AI-generated content as their own, especially if they have to defend their work.
9.    Leverage Technology for Verification: Use plagiarism detection software that has been updated to recognize AI-generated content. Although such tools are not foolproof (they carry a significant risk of both false positives and false negatives), they may serve as a deterrent and help identify potential misuse of AI. At this point in their development, however, they should not be used as the sole evidence of academic dishonesty.
10.    Test Assignments with Generative AI: Before finalizing your assignments, use Generative AI tools to attempt them yourself (see the sketch after this list). This step helps identify whether an assignment can be completed easily by AI, bypassing deep understanding or genuine student engagement. Assignments that withstand AI completion typically require nuanced thinking, personal insights, or complex synthesis of information, making them more resistant to digital shortcuts and emphasizing students’ original contributions. Testing ensures the assignments truly assess critical thinking and comprehension rather than just the ability to retrieve information.
11.    Educate About the Ethical Use of AI: Teach students about the ethical considerations of using AI tools, including what constitutes legitimate use versus academic dishonesty. Clear guidelines can help students navigate the use of AI responsibly.
12.    Promote a Culture of Integrity: Beyond specific assignments or technologies, fostering an educational environment that values honesty, integrity, and hard work can discourage academic dishonesty. This includes having open discussions about the challenges and implications of AI in academia.
13.    Experiment with Cheating-Resistant Assessments: Design assessments that are unique and context-specific, making it challenging for third-party services to provide relevant assistance. This might include complex, course-specific projects or problems that require a deep understanding of the subject matter.
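
To make item 10 concrete, here is a minimal sketch of how an instructor (or an instructional technologist helping them) might batch-test several assignment prompts against a generative model. It assumes the OpenAI Python client with an API key in the environment; the model name and sample prompts are placeholders to swap for your own assignments. Pasting a prompt directly into a chatbot's web interface accomplishes the same check without any code.

# A sketch only: batch-test assignment prompts with a generative model.
# Assumes the OpenAI Python client (openai >= 1.0) and OPENAI_API_KEY set
# in the environment; the model name and prompts below are placeholders.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

assignment_prompts = [
    "Compare this week's two assigned readings and relate them to our class discussion.",
    "Reflect on how your initial hypothesis changed during the lab.",
]

for prompt in assignment_prompts:
    response = client.chat.completions.create(
        model="gpt-4o",  # substitute whichever model your students can access
        messages=[{"role": "user", "content": prompt}],
    )
    # Print the model's attempt so you can judge how convincingly it
    # completes the assignment without any course-specific knowledge.
    print("--- Prompt ---")
    print(prompt)
    print("--- AI attempt ---")
    print(response.choices[0].message.content)
    print()

If the model produces a plausible submission, that is a signal to add personalization, process stages, or reflective components from the list above before assigning the task.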
 

Using these strategies, you can create a learning environment that not only discourages academic dishonesty but also encourages genuine learning, critical thinking, and ethical engagement with AI technologies.

 

Detailing AI Usage


If you are going to allow students to use Generative AI when completing assignments, you may wish to have them detail that usage: How did they use it? Why did they use it where they did? What prompts did they use? Having students document their AI usage in this way can offer a range of benefits, both educationally and ethically. Here's a breakdown of some key advantages:

1.    Transparency: Detailing AI usage makes the work process visible to educators, allowing them to understand how the work was completed and the extent of the students' engagement with the material.
2.    Learning Reflection: Detailing AI usage encourages students to reflect on their learning process, considering not just the final answer but also how they arrived at it. This metacognitive process can deepen understanding and encourage self-regulated learning.
3.    Academic Integrity: It upholds principles of academic integrity. Students are being honest about their sources and methods, which is crucial in maintaining trust and fairness in the educational environment.
4.    Assessment Accuracy: When educators are aware of AI usage, they can more accurately assess a student's comprehension and abilities, tailoring support and instruction to the student’s actual needs.
5.    Skill Development: By tracking AI usage, students can be more conscious of the skills they are developing and those they are outsourcing to AI, such as critical thinking, problem-solving, and creativity.
6.    Ethical Awareness: It instills a sense of ethical responsibility in students regarding the use of technology. Understanding the boundaries of AI assistance is crucial as they enter a workforce where AI tools are commonplace.
7.    Collaboration Skills: It could encourage collaboration between students and AI, as detailing usage requires understanding how AI works and how to integrate it effectively into their workflow.
8.    Pedagogical Insights: For educators, understanding how students use AI can provide insights into how AI can be integrated into the curriculum to enhance learning outcomes.
9.    Digital Literacy: Students learn to be discerning in their use of digital tools. Detailing AI usage means they need to understand the capabilities and limitations of these tools, contributing to their digital literacy.
10.    Responsibility and Control: It helps students maintain control over their learning process, using AI as a tool rather than a crutch, ensuring that the learning objectives of the assignment are still met.

In essence, detailing AI usage in assignments helps ensure that the educational focus remains on student growth and learning while also fostering a culture of honesty, accountability, and ethical use of technology.