I read a great case study this morning out of the Rochester Institute of Technology in New York, which has 1,500 deaf and hard-of-hearing students and is now using Microsoft Translator to provide real-time captioning and translation to support the American Sign Language (ASL) interpreters in lectures.
I’ve blogged about this before here and here, and also wrote an article about the impact of Artificial Intelligence (AI) more generally in classrooms for Interface Magazine, which you can read here.
This case study is really timely: I was in a school in Wellington yesterday working with a teacher who has a profoundly deaf student in her class, and the ability to use something like Translator will certainly add another layer of information for that student in the absence of a full-time sign language interpreter.
The conversation in the classroom yesterday, and the case study above, remind me again of Microsoft’s commitment to accessibility. The Seeing AI app is another great example of this, and one I was able to share with another teacher yesterday who has a blind student in her class:
In my experience, Microsoft Translator works best when the presenter or speaker wears a microphone close to their mouth, giving the most accurate detection of their speech. The default built-in mic on a laptop generally picks up too much ambient noise, which can reduce the accuracy and quality of the transcription, and consequently of the translation as well if that feature is being used.
If you’re using this in your classes, I’d love to hear how the experience is going, so drop a note in the comments below.