Estimated reading time: 8 minutes
Note: This post is part of our recently initiated Digital Detox series, which is designed to be in conversation with the Digital Detox resources at Middlebury College and Thompson Rivers University.
Teaching online (especially during a pandemic) can sometimes feel lonely. During past teaching experiences, I got bogged down by some of the uncertainty of online teaching, with endless questions floating around in my head: Are my students watching the videos I painstakingly script, record, and caption? Are they even helpful? Do students ever click the links in my posts and announcements, and are those extra resources of interest to them? How long does it take my students to do their readings and to complete assignments? As an instructional designer, I’ve also consulted with many faculty members who have these same questions.
To be concerned about the student experience in a course is a mark of a considerate teacher, and taking steps to better understand how students use and interact with course materials and whether these materials are useful to them is absolutely worthwhile. This post is concerned with some of the problems associated with collecting this information by tracking students’ online behavior. Users of Canvas, our campus’s Learning Management System (LMS), may know that there is a built-in feature called “Analytics” that tracks where students click and spend time on course sites.
But what can clicks really tell us about student learning? Brenna Clarke Gray from Thompson Rivers University recently got me thinking about this question in her Digital Detox post about the LMS. We need to critically examine what these tools can tell us and what they can’t, and seriously consider how using them can “change classroom dynamics for the worse,” as Brenna puts it. I’d like to take a “harm reduction” approach to this conversation, drawing on ideas from substance use education and sex education. If we are dealing with a potentially harmful or risky activity, it makes sense to equip people with knowledge about that activity rather than assume that they will do it safely or responsibly, or that they will choose to abstain from it entirely (think of “abstinence-only” sex education; as Autumm wrote in our first Digital Detox post, this is not an abstinence-only conversation).

Relying on the Canvas analytics feature (or any other learning technology) to gauge student engagement, participation, or interest is a risk because it equates a short list of online behaviors with complex elements of a learning experience. Instructors’ interpretation of these metrics can introduce bias (“Student A is working harder because they spend more time on the course site than Student B”). Classroom climate and access to learning can also suffer if instructors require students to document their efforts in a way that the analytics feature can register (“Please view this lecture video in the Canvas page so that I can check that you watched it”). These kinds of requirements can reinforce harmful power dynamics between students and teachers, making students feel as though they are constantly being watched.
These potential harms warrant at least a detailed investigation of how Canvas analytics works and a discussion of how it should be used responsibly, if at all. What follows is a non-exhaustive inventory of the Canvas analytics metrics, some thoughts on how they are best interpreted, and some ideas about the merits of abstaining from using analytics altogether. All of the information below is drawn from the Canvas instructor guides for New Analytics.
What Canvas analytics can tell us
- When students log into the course site, and the total time the window is open on their computer. Note that this does not indicate time actively engaged with materials, only that the page was open.
- How many times a student views course pages. Again, these do not correspond to activities such as reading or watching, only to clicks. By viewing the detailed analytics data, you may be able to see how long a given page was open, but this may simply be idle time.
- When students submit assignments, and how often these are “on time” according to the due date for the assignment. Note that an assignment counts as late if it is submitted even one minute past the due date.
- How often students engage in certain activities that Canvas calls “participations.” These include submitting an assignment, posting a new comment to a discussion, and starting or submitting a quiz. (A rough sketch of how these counts surface through the Canvas API follows this list.)
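To make concrete just how thin these metrics are, here is a minimal sketch of pulling them programmatically. To my knowledge the New Analytics dashboard itself does not expose a public API, so this uses Canvas’s older course analytics endpoints, which report similar click-based counts; the domain, token, and course ID below are placeholders, not real values.

```python
import requests

# Placeholders -- substitute your institution's Canvas domain,
# a personal API token, and a course ID you actually teach.
CANVAS_BASE = "https://canvas.example.edu/api/v1"
TOKEN = "YOUR_API_TOKEN"
COURSE_ID = 12345

resp = requests.get(
    f"{CANVAS_BASE}/courses/{COURSE_ID}/analytics/student_summaries",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
summaries = resp.json()  # real code would also follow pagination Link headers

# Everything returned is a count of clicks or submissions: "page_views,"
# "participations," and a tardiness breakdown in which "late" means even
# one minute past the due date. Nothing here measures attention or learning.
first = summaries[0]
print(first["page_views"], first["participations"])
print(first["tardiness_breakdown"])  # e.g., counts of on_time / late / missing
```

Seeing the raw data this way makes the point of the next section almost self-evident: the record for each student is a handful of integers, nothing more.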
What Canvas analytics cannot tell us
- How much students study or work on course activities. Students can download PDFs or videos to read or watch offline rather than viewing them on Canvas, in which case analytics will not register any time spent reading or watching. Canvas logs the number of hours spent on the course site or on individual pages, but that can include idle time (e.g., a page left open on a monitor that no one is looking at).
- How much students “participate.” As we saw above, Canvas uses the term “participations” to refer to a short list of actions that students may take while interacting with a course site (as do other LMSs). However, class participation is a much broader and more complex category that instructors consider in assessment, self-reflection, and grading. Many instructors even base part of a student’s grade on “class participation,” which has some pros and cons. Canvas will show you a “participation” metric for an individual student, but don’t mistake it for an easy way to assign participation grades. As always, any instructor who wants to consider student participation in assessment or their own reflection will have to observe the relevant types of participation themselves.
- Whether students are engaged in or enjoying the course. Clicks and views do not necessarily translate to engagement or satisfaction with the course. The only way to know this information is to ask students how the course is going for them! One way to do this might be to take advantage of the Hub’s midterm feedback consultations.
Paths forward: Disclosure, abstinence, and how to really check in with students
Since this is not an “abstinence-only” conversation, I recognize that some readers will still want to use analytics for one reason or another, even in light of the information above. Brenna reminds us in her post that it is crucial to be transparent with students about how you use the Canvas analytics. I see this transparency as two-pronged: students should know what instructors can see, and also what instructors actually plan to look at over the course of the semester. For example, you could tell your students that you have the ability to see all of the metrics mentioned above, but that you only plan to use that information to check that all students have been able to log into the course during the first week of class. If you plan to use the analytics information in other ways, particularly with respect to assessment, it is important to disclose that to students early on (but beware of how this may influence student behavior). Another small way to reduce potential harms and biases is to view data only in aggregate and never at the level of the individual student (one way to do this is sketched below).
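If you do look at analytics, one way to honor that aggregate-only rule is to script the reduction yourself, so you never browse individual students’ rows in the dashboard at all. This is a sketch under the same assumptions as the earlier example (the older course analytics endpoint and placeholder credentials), not a definitive workflow:

```python
import statistics
import requests

# Same placeholder values as in the earlier sketch.
CANVAS_BASE = "https://canvas.example.edu/api/v1"
TOKEN = "YOUR_API_TOKEN"
COURSE_ID = 12345

resp = requests.get(
    f"{CANVAS_BASE}/courses/{COURSE_ID}/analytics/student_summaries",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

# Reduce to course-level numbers immediately and discard the per-student
# records, so no individual student is ever singled out or inspected.
views = [s["page_views"] for s in resp.json()]
print(f"Students enrolled: {len(views)}")
print(f"Median page views per student: {statistics.median(views)}")
# Zero page views is a rough proxy for "hasn't logged in yet" -- useful
# for the first-week access check mentioned above, without naming anyone.
print(f"Students yet to log in: {sum(v == 0 for v in views)}")
```

The design choice here is that the per-student records exist only for the instant it takes to collapse them into course-level summaries; what you end up reading is a handful of whole-class numbers.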
I also want to suggest that there may be some very significant benefits to abstaining from using course analytics data altogether. We will probably still have questions about how our students are experiencing the course, but we can always gather this information by talking to students directly, either on a one-on-one basis or through a survey or other instrument. If our goal is truly to understand students’ experience with the course, gathering feedback and responding to it thoughtfully is a way to get the information we need and also build trust with students. Another reason for abstaining is that student concerns about privacy are already high, with many students learning at home with limited private time and space. Proactively letting students know that you won’t be using course analytics may remove a source of anxiety while they are already learning under highly stressful conditions. No technology, including a Learning Management System, is neutral. As the analytics tool continues to evolve with new Canvas updates, there may be new concerns about student privacy, and new more or less responsible ways of using the data that it provides.
I hope to continue the conversation in future Digital Detox posts about “harm reduction” with our Canvas LMS features and how we can work to build trusting and transparent digital learning environments. A small number of us who have signed on to the UM-D Digital Detox theme are going to gather virtually at 9am ET on Friday, February 12th to discuss further. Register for the Zoom session if you would like to join!
Photo by Lianhao Qu on Unsplash
Reading time estimation from https://niram.org/read/
Sarah Silverman is an Instructional Designer in the Hub for Teaching and Learning Resources at the University of Michigan – Dearborn. You can read more about Sarah on her author page.