
Recording Available: Bonnie Stewart Keynote

On February 15th Bonnie Stewart gave the closing keynote presentation for our Digital Education Days – a collaboration between University of Michigan – Dearborn, University of Michigan – Flint, and Henry Ford College. I had the honor of introducing Bonnie and of writing up this blog post announcing the recording of the event.

I hope you find some time to watch Bonnie’s whole talk, but if not (or if you are just more in the mood for reading than watching) I wanted to give a little bit of a recap/reflection.

Bonnie took us back to the height of the MOOC craze to reflect on some of the hype. Claims that MOOCs would take over and that there would soon be only 10 universities in the world proved to be false. But some elements of that time have (unfortunately) stuck with us. The ideas that higher education is ripe for disruption, that there is a business model somewhere/somehow tied to that disruption, and that Silicon Valley has a place in that disruption are still alive and well.

The pandemic amplified this idea that digital tools in education could be potential profit spaces. A big part of those business models is rooted in datafication through the technological systems that are supposed to serve education. All of our digital tools – whether provided by the university or not – leave data trails. Data trails can help optimize learning, but rarely in isolation, and things are often more complex than the sales materials suggest. Bonnie outlined several problems with datafied systems.

1 – Often these systems tell us things we already know. Here Bonnie had a great image of some bananas in a grocery store where the sign reads “Boneless Yellow Bananas” and lists the price. You don’t need to say the bananas are boneless – most people know that. Similarly, even in a large course, engaged professors don’t need a report on how long a student stayed on a page to know whether that student is engaged. Professors know this because they see the evidence (or lack thereof) in the student’s work.

2 – The benefits of education are not just in things that we can measure, and when we allow systems to measure learning we run the risk of only valuing the things that the system deems important. Here Bonnie put up a tweet from children’s author Michael Rosen (@MichaelRosenYes) which read: “First they said they needed data about the children to find out what they were learning. Then they said they needed data about the children to make sure they were learning. Then the children only learned what could be turned into data. Then the children became data.”

3 – Plainly put, the third reason is that a lot of these systems are racist. Bonnie gives the example of Colin Madland, a Ph.D. candidate who works with faculty to help them with their courses. Colin tweeted about a case where a faculty member was having trouble with Zoom’s virtual backgrounds. Every time the professor turned on the virtual background, it would erase their head! After some digging, it was determined that the algorithm could not recognize the faculty member’s face because it had been trained to recognize only white faces. The case was made worse when Colin reported it on Twitter: no matter where he placed his own white face in the image demonstrating the issue, Twitter’s mobile app would crop the preview to highlight only his face – something called cropping bias. The whole incident is outlined in this TechCrunch article if you would like to dig deeper.

Bonnie goes on to outline her own work asking how much faculty know or understand about these systems. In 2020 she and her co-researcher Erica Lyons issued a broad survey asking faculty some basic questions about data practices and knowledge. For instance: how often do you read the terms of service (TOS) on technologies you use for your courses, and do you know where the servers for your institution’s LMS are located? The answers are about what you would expect – 65% of respondents did not know what country their LMS’s servers are located in, and fewer than 10% said they read the TOS 90% of the time or more before using a technology in their courses. (Note: Bonnie is really clear here that this is NOT the fault of the faculty and that these documents are not meant to be read by those of us who are not lawyers.)

Bonnie ends with a call for sector-wide data ethics practices. She points out that there was a time when we did not have formal research ethics, and we saw deeply problematic designs such as the Milgram experiments and the Stanford Prison Experiment. It was not until research institutions across the board began to take ethics seriously that we started to see closer scrutiny.

My little summary here really does not do the whole talk justice, and I highly recommend you take the time to watch the video.