Innovation Center for Teaching & Learning

Instructional Design Help

Tim Lockman
Instructional Designer

For help with course design, creating course content, and professional development, please contact Tim at tlockman@kish.edu or schedule an appointment below. 

Ethics and Policies for AI Use

Guidance for Kish Faculty

Research

Survey Source:

Inside Higher Ed, 8/29/25 (under heading: "Using Generative AI for Coursework")

Overall student use of generative AI in the past year:
  • Have used generative AI for coursework: 85%
  • Have not used generative AI for coursework: 21%
Overall findings from the survey:
  • Half of students who use AI for coursework say it’s having mixed effects on their critical thinking abilities
  • A quarter report it’s helping them learn better
  • Nearly all respondents "want their institutions to address academic integrity concerns—albeit via a proactive approach rather than a punitive one"
Students used generative AI for coursework in these ways:
  • Brainstorming ideas: 55%
  • Asking it questions like a tutor: 50%
  • Studying for exams or quizzes: 46%
  • Editing writing or checking work: 44%
  • Using it like an advanced search engine: 42%
  • Generating summaries: 38%
  • Outlining papers: 31%
  • Generating citations: 26%
  • Completing assignments or coding work: 25%
  • Writing free responses or essays: 19%

First Steps

When addressing AI use in your courses, what are your first steps? Before anything else, the ICTL advises doing three things: 

  1. Write an AI use policy for each of your courses
  2. Update your academic integrity policy
  3. Communicate your policies to students

Here are some tips on how to do this:

Describe acceptable use of AI in your course
  • Review these Sample Syllabus Statements from Kish's Academic Standards Committee. Use or adapt these for your own syllabi. 
  • Review the Board of Trustees Policy Manual's Statement on AI Use (2nd page): Consider the College’s guidelines for Kish employees using generative AI. Requirements include thinking critically about output, maintaining confidentiality, and disclosing the use of AI without misrepresenting ownership (sources must be cited). Should we hold students to the same standards as college employees?
Include consequences for inappropriate use of AI
  • Review and update your Academic Integrity Policy: Be sure that your syllabus's academic integrity policy clearly defines acceptable use of AI tools in your course and lists consequences for noncompliance.
  • You decide what acceptable use means in your course: Which student uses of AI are appropriate, and which are inappropriate? For example:
    • “Grade me”: Student uploads assignment rubric and their first attempt, letting AI “grade” them as practice for a test
    • AI helps them brainstorm topics
    • AI corrects grammar, spelling, tone, and other writing mechanics
    • AI writes or drafts an outline for them
    • “Be the Human”: AI writes a draft of the assignment; students write a response critiquing the output
    • How do you want students to cite or acknowledge AI-generated material?
    • For more potential uses, please see the survey results under "Research" above
Talk to your students about AI and integrity!

Dr. Tricia Bertram Gallant is co-author of The Opposite of Cheating: Teaching for Integrity in the Age of AI. When she spoke at Kish's Fall 2025 Inservice, she commented that students are "thirsty to talk about ethics," and in her book she gives some tips for having these conversations.

  1. Communicate about integrity early and often, using syllabus policy statements and in-class conversations (Opposite of Cheating, p. 55).
  2. Consider placing integrity reminders ("nudges") within assignment and assessment prompts (Opposite of Cheating, pp. 45-51).
  3. Tell stories of consequences former students have faced, or use examples from pop culture. Stories linger and are more meaningful than directives; they engage students’ emotions and help them identify with the subject of the story (Opposite of Cheating, pp. 96-97).
  4. For more ideas and examples, please see Chapter 2, "Communicating Integrity," in The Opposite of Cheating. If you need a copy of the book, please contact Instructional Designer Tim Lockman at tlockman@kish.edu.

Guidance from Quality Matters™

Quality Matters (QM) is a faculty-driven online course quality program. Kish is an institutional member of QM. 

QM's basic AI awareness steps are listed below. "SRS" stands for Specific Review Standard; SRSs are part of the QM HE Rubric Workbook.

For Basic AI Awareness and Communication (All Courses)

  • 🔴 SRS 1.2: Provide comprehensive AI integration overview in course introduction
  • 🔴 SRS 1.3: Provide communication protocols for AI
  • 🔴 SRS 1.4: Clearly state AI policies and academic integrity expectations
  • 🔴 SRS 6.4: Inform students about data privacy implications of AI tools
  • 🔴 SRS 7.1: Ensure technical support covers required AI tools

For more detailed recommendations, please see the Quality Matters AI-Integration Rubric linked below. This rubric recommends actions on AI use that align with QM’s 8 General Standards and are based on how AI will be used in your course.

Quality Matters AI-Integration Rubric

This rubric is a self-review tool and is presented here in two parts: one document for Critical (most important) updates and one for Priority (highly recommended) updates.

If you need help with Quality Matters recommendations, or if you need a copy of the QM HE Rubric Workbook, please contact Instructional Designer Tim Lockman at tlockman@kish.edu.