Correcting Automatic Transcripts of Curricular Video Recordings at UICOM

Around 2019, some of our international students at the University of Illinois College of Medicine (UICOM) met with me to talk about the automatic transcripts created for all of our video content in our video storage system, Echo360. We had turned on the transcripts feature in spring 2019 as a pilot to determine the accuracy of the built-in automatic speech recognition (ASR) in the program and figure out how to best provide students with additional ways to consume the videos.

What we discovered quickly was that the accuracy was sometimes very poor. After discussions with Echo360 representatives, we found out that there is no built-in dictionary for medical, pharmacologic, and certain higher-level scientific terms. For example, if a faculty member said ‘hypoallergenic’, the system might not hear it correctly, and the transcript would show ‘hyper-allergenic’, which is incorrect.

Initially, some of these brave medical students tried to correct the transcripts themselves and soon discovered just how long it can take. We determined that a better option, until Echo360 adds these dictionaries to its system, was to hire non-pre-med students to edit the transcripts. I worked with my colleague Dr. Elizabeth Balderas to interview and hire five students to do this work. This is still a work in progress, and we hope that eventually the company will incorporate a medical dictionary.

Here is an example of how the system incorrectly heard the term ECG from a faculty member who was speaking, transcribing it as ‘eggs’. After listening to this part of the lecture, it was very clear he said ECG, not eggs.
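At its core, the students' editing work amounts to finding known mis-hearings like this one and replacing them with the term the speaker actually used. A minimal Python sketch of that substitution step is below; the correction map and function name are illustrative assumptions, not part of Echo360 or our actual tooling.

```python
import re

# Hand-maintained map of known ASR mis-hearings -> correct terms.
# These entries are examples drawn from the errors described above.
CORRECTIONS = {
    "hyper-allergenic": "hypoallergenic",
    "eggs": "ECG",  # only safe after a human confirms the audio context
}

def correct_transcript(text: str) -> str:
    """Replace known mis-heard phrases, matching whole words case-insensitively."""
    for wrong, right in CORRECTIONS.items():
        pattern = r"\b" + re.escape(wrong) + r"\b"
        text = re.sub(pattern, right, text, flags=re.IGNORECASE)
    return text

print(correct_transcript("Read the hyper-allergenic label, then check the eggs"))
```

In practice, an editor still has to listen to the recording before accepting each replacement: a word like "eggs" can be a genuine part of a lecture, which is why this remains a human-review task rather than a blind find-and-replace.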

It is imperative that we correct these transcripts. There are students who have learning disabilities or other accommodations. Sometimes we have faculty with thick accents because English is not their first language, and we have students whose first language is also not English. We need to be able to provide the best education to all students.

Ally Implementation for Accessibility

Ally is an accessibility product that we are using at the University of Illinois Chicago (UIC) to ensure our content is accessible to all people, regardless of disability. A pilot was implemented in fall 2019 and included instructional designers from all colleges, along with faculty and students, to test it out. Ally uses machine learning algorithms to check uploaded documents and makes suggestions for how to improve them.

Here is an example of what instructors and instructional designers see when they use Ally to check their documents.

The odometer-style gauge to the left of each item reflects its level of accessibility. Green means a document is accessible, while orange and red mean it is not.

In this example, Ally tells us that the uploaded PowerPoint slide deck is missing descriptions for many of its images. The red box around the image prompts the instructor or instructional designer to take action and provide a description.

Ally was fully implemented in 2020 after the lengthy 2019 pilot period and has been instrumental in ensuring our materials are accessible. It requires continuous faculty development, especially as enhancements are released, as well as close collaboration with coordinators, since they are usually the ones uploading documents.

Here is some information about the Ally File Transformer: