AI Technology Assisting Deaf Students

I read a great case study this morning out of the Rochester Institute of Technology in New York, which has 1,500 deaf or hearing-impaired students and is now using Microsoft Translator to provide real-time captioning and translation to support the American Sign Language (ASL) interpreters in lectures.

Read the entire case study here

I’ve blogged about this before here and here, and also wrote an article about the impact of Artificial Intelligence (AI) in the classroom more generally for Interface Magazine that you can read here.

This case study is really timely, as I was in a school in Wellington yesterday working with a teacher who has a profoundly deaf student in her class. Being able to use something like Translator will certainly add another layer of information for the student in the absence of a full-time sign language interpreter.

The conversation in the classroom yesterday, and the case study above, reminds me again of Microsoft’s commitment to accessibility. The Seeing AI app is another great example of this that I was able to share with a teacher yesterday who has a blind student in her class:


Joseph Adjei, a first-year deaf student from Ghana, loves Microsoft Translator

In my experience, Microsoft Translator works best when the presenter or speaker wears a microphone close to their mouth, which gives the most accurate detection of their speech. The built-in mic on a laptop generally picks up too much ambient noise, which reduces the accuracy and quality of the transcription, and consequently of the translation as well if that feature is being used.


If you’re using this in your classes, I’d love to hear how the experience is going, so drop a note in the comments below.

