Teaching and Learning with AI

AI is reshaping how students learn and how instructors design learning experiences. At WP, the goal is not to ban or embrace AI uncritically, but to align AI use with learning outcomes, academic integrity, and equity.

Start with learning outcomes

Decide what students should be able to do independently, what they can do with tools, and how you will assess authentic learning. AI can support practice and feedback, but it can also short-circuit learning when used as a substitute for thinking.

Course design patterns (practical options)

  • Disclosure-based assignments: Students may use AI, but must disclose how and where it was used (tool, prompts, outputs, edits).
  • Process-focused assessment: Grade outlines, drafts, reflections, and revision history—not just final products.
  • Critical AI literacy: Have students analyze AI outputs for errors, bias, missing perspectives, or weak reasoning.
  • AI as a tutor, not an author: Use AI for practice quizzes, explanations, and feedback, while students produce original work.
  • Compare-and-improve: Students write a response, then use AI to critique it, then revise—and explain the changes.
  • Data/claim verification: Students must verify claims with credible sources and document verification steps.

Syllabus and assignment language (templates)

Consider including clear guidance in your syllabus about when AI is permitted, when it is restricted, and what disclosure is required. Below are example policy stances you can adapt to your course and departmental norms:

  • Prohibited: AI tools may not be used for this assignment/coursework except where explicitly allowed.
  • Restricted: AI tools may be used for brainstorming, outlining, or grammar assistance, but not for generating final submitted content.
  • Permitted with disclosure: AI tools may be used, but students must include a disclosure statement and attach prompts/outputs as an appendix.
  • Required: AI tools must be used for specific steps (e.g., critique), with reflection on limitations and verification.

Academic integrity in an AI world

Academic integrity policies still apply. AI changes the mechanics of cheating and the boundaries of collaboration, so clarity matters: define what counts as unauthorized assistance, what counts as authorship, and what counts as citation/attribution.

Equity, accessibility, and inclusion

  • Assume uneven access: Some students will have paid platforms, more experience, and better devices. Design assignments that don’t privilege tool access alone.
  • Support accessibility: AI can help with drafting and comprehension, but ensure accommodations follow approved processes.
  • Avoid surveillance overreach: Choose assessment methods that respect privacy and avoid unnecessary data collection. Treat AI-detection tools with skepticism; they have not proven consistently reliable, and any use of them must include human review and judgment.
  • Teach the meta-skill: Help students learn when not to trust AI and how to verify its outputs.

Faculty development and support

Faculty development will include workshops, peer exchange of assignment designs, and shared repositories of syllabus language, rubrics, and AI-aware assessment ideas. Documenting teaching experiments (what worked, what didn't) helps the institution learn collectively.