AI Considerations for Faculty

In this guide you will find information and guidelines to help you make informed decisions about navigating AI tools. Engaging with generative AI tools means applying a thoughtful, critical, and ethical lens: determining whether these tools will benefit your assignments and assessments, and considering how your students may independently be engaging with them in their learning, whether productively or in ways that may challenge their work's academic integrity.

Building literacy in generative AI, whether you choose to use it or not, includes addressing ethics, privacy, and equity with intention.

AI policy considerations

Currently there is no university-wide policy on AI use for students or instructors. Please refer to the Academic Integrity policy.

It is your responsibility to determine how or if you allow your students to use these tools in your class. If you choose to prohibit the use of generative AI tools, you should state this policy clearly to your students, being specific about what "AI use" encompasses.

If you choose to allow your students to use generative AI, you should clearly communicate your expectations for how students can engage with these tools. Keep in mind that the expectations for AI use may vary from assessment to assessment. A helpful tool for communicating appropriate AI use for your course or assignments is the 5-level AI Assessment Scale (AIAS). The five levels are:

  • No AI - The assessment is completed entirely without AI assistance in a controlled environment, ensuring that students rely solely on their existing knowledge, understanding, and skills.
    • Sample AI statement - You must not use AI at any point during the assessment. You must demonstrate your core skills and knowledge.
  • AI Planning - AI may be used for pre-task activities such as brainstorming, outlining, and initial research. This level focuses on the effective use of AI for planning, synthesis, and ideation, but assessments should emphasize the ability to develop and refine these ideas independently.
    • Sample AI statement - You may use AI for planning, idea development, and research. Your final submission should show how you have developed and refined these ideas.
  • AI Collaboration - AI may be used to help complete the task, including idea generation, drafting, feedback, and refinement. Students should critically evaluate and modify the AI-suggested outputs, demonstrating their understanding.
    • Sample AI statement - You may use AI to assist with specific tasks such as drafting text, refining and evaluating your work. You must critically evaluate and modify any AI-generated content you use.
  • Full AI - AI may be used to complete any elements of the task, with students directing AI to achieve the assessment goals. Assessments at this level may also require engagement with AI to achieve goals and solve problems.
    • Sample AI statement - You may use AI extensively throughout your work, either as you wish or as specifically directed in your assessment. Focus on directing AI to achieve your goals while demonstrating your critical thinking.
  • AI Exploration - AI is used creatively to enhance problem-solving, generate novel insights, or develop innovative solutions to solve problems. Students and educators co-design assessments to explore unique AI applications within the field of study.
    • Sample AI statement - You should use AI creatively to solve the task, potentially co-designing new approaches with your instructor.

Each level identifies and permits different ways students can use AI appropriately and meaningfully to support their learning.

Addressing generative AI and academic integrity with students

It is your responsibility to clearly and thoroughly communicate your expectations to your students, regardless of whether you choose to allow the use of generative AI in your course. We recommend including a statement in your syllabus or with each assignment that outlines acceptable use of AI. When creating your statement(s) on AI, you may consider these questions:

  • Will I allow the use of generative AI in my course?
  • What are my expectations for students who use generative AI? For what purposes can students use these tools? Can they use them for brainstorming? Proofreading? Composing text?
  • How do I define appropriate and ethical usage for generative AI? What are my parameters?
  • What constitutes academic misconduct within my course with respect to the use of generative AI?
  • How will I ensure students are aware of any applicable privacy policies?
  • How will I require students to disclose and/or cite their use of generative AI? Will I create my own guidelines or have them follow existing citation guidelines (MLA, APA)?
  • How will I ensure my students understand their responsibility for AI-generated content?
  • Are there assignments where my expectations differ from the guidelines presented at the beginning of the semester? If so, how will I convey this to students?
  • If I am using AI in my course, how will I disclose this to students?

Using authentic assessment to reduce AI reliance

Authentic assessment can reduce student reliance on AI tools (like ChatGPT or Copilot) through tasks designed to make AI less useful - or at least, less able to do all the work. Authentic assessment helps build transferable skills like engagement, critical thinking, and the kind of judgment students need in professional settings. Authentic assessments don't pretend that we live in a pre-AI world - this shift can help students better understand where and how they may use AI as a tool.

Personalization makes copying harder

Assignments grounded in students’ experiences or local context are difficult for AI to complete convincingly. Example: “Analyze a policy in your local school district using course principles” is harder to outsource than “Define educational equity.”

Emphasize process over product

Ask for drafts, outlines, or learning journals as part of the grading. Students can't just plug in a prompt and turn in a final piece - they must show how they got there. This discourages last-minute, AI-generated submissions.

Incorporate multi-modal or project-based tasks

Creating an infographic, presentation, or tutorial video requires synthesis and originality beyond what AI alone can generate. These tasks promote higher-order thinking - analysis, synthesis, creativity.

Real-world relevance requires decision-making

Authentic tasks often require judgment calls and justifications that reflect student learning. Incorporating real-world relevance acknowledges the existence of AI. AI might help, but the final product still depends on the student's input. Example: “Design a health intervention for your community, explain your design choices, and anticipate limitations.”

Frequent, low-stakes, iterative tasks

Breaking a larger assignment into smaller chunks over time (with feedback) builds accountability and reduces the temptation to rely on AI shortcuts. This also allows students to demonstrate their thinking and progress for each phase of the process.

References and resources