
Developing an AI Course Policy in the Age of ChatGPT

ChatGPT’s impact on higher education has been evident since its release late last year: while it may not usher in the sort of revolution that some of the hype suggests, it is certainly changing the way students work and learn. The rise of ChatGPT prompts teachers to reflect on how best to facilitate student learning in today’s classroom. While different teachers may disagree about how the tool can best be used, its presence must be addressed one way or another, lest it become the elephant in the room.
An obvious starting point is crafting a course policy for the use of ChatGPT and other AI tools. Besides addressing a technological exigency, developing a course policy for AI is also a chance to further engage and motivate students. Here are a few key considerations for developing a course policy in the age of AI:
Level of permissiveness
It’s worth remembering that ChatGPT is just one AI-powered tool among many, even though it gets most of the press. Grammarly has long been used (and sometimes recommended) by students to improve their grammar, and it uses AI to make its recommendations. Canva is another popular AI-powered software platform that helps users create layouts for documents of all kinds. AI tools have been built into Microsoft 365 products such as Word, PowerPoint and Excel—and their capabilities will expand. These are just a few examples in a deepening sea.
Which tools do you allow students to use on coursework, and when? Some teachers may require students to use AI tools to complete their work, while others may make certain tools optional, and others may prohibit all such tools outright. The choice should reflect the course’s expected learning outcomes and the teacher’s pedagogical approach.
Whatever the decision, you should communicate it to students as clearly as possible. One pain point for students at present is ambiguity about which software is and is not allowed, and which uses of software constitute academic integrity violations. This is an easy issue for instructors to resolve.
Transparency
Sometimes policies are communicated as a decree from on high. We don’t need to justify our choices, though perhaps we should: space and time permitting, more transparency is better. You should strive, for example, to explain why a tool is required, optional or prohibited. Students may be less likely to use a tool outside the allowed framework (that is, cheat) if they understand your pedagogical reasoning.
AI Literacy
An AI policy on your syllabus can be an opportunity to explain the nature—and the dangers—of tools students may have been using without much reflection. For example, ChatGPT’s fluent outputs suggest authentic communication with an “intelligent” agent, when in reality they are the products of a mathematical algorithm that replicates the biases of its training data without critical thinking or moral judgment. As future professionals and citizens of an AI-saturated society, students need to go beyond a “user experience” approach and understand that AI tools require expert supervision and critical discernment.
Acknowledgement
Instructors may or may not ask students to acknowledge which AI tools they used in their work. At one end of the spectrum, a teacher may require students to append a short reflection about why they used the tool and how they feel it improved their work. At the opposite end of the spectrum, perhaps there’s no need to mention the tool at all. In other cases, perhaps it’s enough to include the tool in the reference list along with other sources.
The appropriate way to acknowledge AI tools will certainly evolve over time. After all, we typically don’t expect students to reflect on or even disclose their use of spell-checkers and Google, though these obviously help them in their work. That said, there may be certain courses where even these technologies should be disclosed and reflected upon, depending on the course’s expected learning outcomes.
Tone
The tone of the classroom policy, like the tone of the syllabus generally, should also be considered. Is your policy written in stern, legalistic language, or is it friendly and warm? While each teacher has their own style, research does suggest that friendly and explanatory policies support student learning outcomes better.
Faculty Workload
Policy choices often affect not only the student’s workload but the teacher’s, too, and instructors should consider workload when devising any course policy. For instance, checking sources more rigorously, redesigning assignments, or running student work through AI detection tools (which have proven notoriously unreliable) all create additional labor for you.
Person in Focus
Typically, course policies focus on the student: what the student must do, and what will happen if they don’t do it. But, given the potentially transformative nature of AI, it may be worth communicating your policy more fully to include yourself and other stakeholders. What do you commit to do for your students? How is AI technology changing your approach? How does this relate to student learning? And how might you reflect upon all this with your students?
Collaboration
Lastly, consider who will develop the course policy. Traditionally, instructors (following institutional guidance) construct the course policies, and students are then expected to adhere to them. But the growing student-centered learning movement suggests that there may be value in involving students in the process of developing course policies, designing assignments, and selecting topics of study.
In this light, a policy regarding AI technologies could be developed in collaboration with, or even entirely by, students. Again, this choice will depend on a course’s particular expected outcomes. In the case of AI, there may be special value in involving students in the policy development process: they are using the technologies already, and they have not only opinions and experience but also valuable insights as to what works best and why.
Developing a course policy regarding AI may seem like just another thing on your to-do list, but it is an opportunity to reflect on your teaching practice and deepen student learning. Here are two documents to get you started: Drexel University's policy on academic integrity pertaining to artificial intelligence and a compilation of academic classroom policies for generative AI tools from across academic institutions and disciplines (curated by Lance Eaton).