[Image: AI-generated face overlapped with programming code]

Do you need a Syllabus Statement for ChatGPT?

Over the last few months, Generative AI (GenAI) has dominated many conversations about teaching, and ChatGPT in particular comes up very often. There has been so much new information about ChatGPT every day that for a while I tried to ignore it, burying my head in the sand. If you are like me and you are getting ready to teach a class this fall, this is a great time to take a different approach. Instead of running away from it, take some time to read this Guide from the Hub for Teaching and Learning Resources on GenAI at UM-Dearborn.

This is not new technology, and there is a lot of excitement lately around ChatGPT and other large language models (LLMs); what is new is that they are far more accessible. Although it is expensive to run, ChatGPT is free to use at the moment, which encourages students to use it, especially now that UM has its own version in house: UM-GPT.

As with any technology, there are some misconceptions. While it is impressive that LLMs can generate text that might pass as written by humans, ChatGPT and other chatbots can give you false information. These errors are called artificial hallucinations, or simply hallucinations. ChatGPT can blatantly fabricate information about things and about people.

Ian Bogost describes some of these problems in his article "ChatGPT Is Dumber Than You Think: Treat it like a toy, not a tool."

Some institutions have completely banned GenAI for fear that students will cheat, while others let faculty decide. At UM-Dearborn, allowing (or forbidding) the use of GenAI is up to the instructor. However, banning a tool is one thing; enforcing that ban is another. If you are going to ban a tool, you need a realistic way to enforce the rule.

The first question to ask yourself is: how good are the tools that promise to tell you whether students used ChatGPT? The answer is not so simple. Some tools claim they can detect the use of ChatGPT, but they can produce false positives, so we should be careful. A false positive means you will accuse a student of cheating when they did not use ChatGPT. One thing you definitely should not do is ask ChatGPT itself whether your students got help from ChatGPT, because it can give you the wrong answer. Perhaps you are familiar with the story about a professor who failed an entire class after accusing the students of using ChatGPT when they had not.

So what should you do about ChatGPT? Are faculty supposed to ignore it? Ignoring it is definitely not a solution. Take a position on the use of GenAI before your first class. If you decide to treat it like a toy and allow the use of ChatGPT, give your students some guidelines. For example: What are appropriate uses of ChatGPT in your class? What can your students do with ChatGPT that might help them get started on an assignment?

You can prepare a syllabus statement about the use of GenAI. In addition, it is very important to discuss your position with your students. Putting it in writing is a good idea, but it is not a substitute for talking with them about it. Communicate with your students about academic integrity and ChatGPT on the first day of class. Talk to them about the importance of learning what you teach, and continue these conversations throughout the semester.

Make an appointment with a Hub instructional designer if you want to continue the conversation or if you are interested in tips for modifying your assignments. You can schedule a time that works for you using our Calendly Scheduler.

Many thanks to my colleague Autumm Caines who introduced me to ChatGPT and GenAI.

Image by Gerd Altmann from Pixabay.