What’s that you say? Present with captions in Google Slides
Years ago in a Long Island doctor’s office, four-year-old Laura was fitted with her first pair of hearing aids, customized to compensate for her specific hearing loss. However, they didn’t work very well, particularly in noisy environments, so she eventually stopped wearing them.
A few years later, on a school bus in Bethesda, MD, nine-year-old Abigail sat next to a classmate who taught her how to communicate using American Sign Language. In high school, she worked in a biology lab at the National Eye Institute, where she researched retinitis pigmentosa, a genetic disorder that causes loss of vision.
Flash forward to today, where we, Laura and Abigail, work at Google, building products with accessibility features that help billions of users across the globe. We first met through the accessibility community at MIT, where we studied computer science with the hope of using our technical skills to make a difference in people’s lives.
During our time at university, Abigail built a solution that helped a blind man use his touch-screen oven, led a team that enabled blind individuals to sign legal documents independently, and co-founded an assistive technology hackathon. Laura researched a new signal processing algorithm for hearing aids in noisy environments, built an app that let residents of a neurological disease care facility call for help in a more accessible way, and worked on a hands-free page turner for individuals unable to use their arms. This work not only showed us the impact technology can have for people with accessibility needs, but also motivated us to focus our careers in this area after we graduated.
When we landed at Google, we both independently joined the G Suite accessibility team. As part of this team, we’ve improved screen reader, Braille and screen magnifier support on Google Docs, Sheets and Slides, and we have represented the Google Accessibility team at external conferences. We’re also involved with the American Sign Language community at Google, which promotes inclusivity among all Googlers through shared language.
Recently, an internal hackathon led us to a project that is deeply personal. Having observed that presentations can be difficult for individuals who are deaf or hard of hearing to follow, we teamed up to add automated closed captions to G Suite’s presentation tool, Google Slides.
This work has moved from a passion project to our full-time job, and today we’re officially launching automated closed captions in Google Slides. The feature will gradually roll out to all Slides users starting this week.
How it works
The closed captions feature is available when presenting in Google Slides. It uses your computer’s microphone to detect your spoken presentation, then transcribes what you say—in real time—as captions on the slides you’re presenting. When you begin presenting, click the “CC” button in the navigation box (or use the shortcut Ctrl + Shift + c on Chrome OS and Windows, or ⌘ + Shift + c on Mac).
As you start speaking into your device’s microphone, automated captions appear in real time at the bottom of the screen for your audience to see. The feature works for a single presenter speaking U.S. English on a laptop or desktop computer in the Chrome browser, and we’re looking to expand it to more countries and languages over time. The captions are powered by machine learning, so their accuracy can vary with the speaker’s accent, voice modulation, and intonation. We’re continuing to work on improving caption quality.
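Slides’ own captioning pipeline isn’t public, but Chrome exposes the same kind of streaming speech-to-text through the Web Speech API, which gives a feel for how real-time captions like these can work. The sketch below is our own illustration, not the Slides implementation; names like `startCaptions` and `displayEl` are ours, and the browser wiring only runs in Chrome.

```javascript
// Combine finalized transcript segments with the current interim (partial)
// hypothesis into one caption string. Pure function, usable anywhere.
function buildCaption(finalSegments, interimSegment) {
  return (finalSegments.join(' ') + ' ' + interimSegment).trim();
}

// Browser-only wiring (Chrome exposes the API as webkitSpeechRecognition).
// displayEl is any DOM element used as the caption bar.
function startCaptions(displayEl) {
  const recognition = new webkitSpeechRecognition();
  recognition.continuous = true;      // keep listening across pauses
  recognition.interimResults = true;  // stream partial results as you speak
  recognition.lang = 'en-US';         // the feature launched in U.S. English

  const finalSegments = [];
  recognition.onresult = (event) => {
    let interim = '';
    for (let i = event.resultIndex; i < event.results.length; i++) {
      const text = event.results[i][0].transcript;
      if (event.results[i].isFinal) {
        finalSegments.push(text);     // locked-in transcription
      } else {
        interim += text;              // still being revised by the recognizer
      }
    }
    displayEl.textContent = buildCaption(finalSegments, interim);
  };
  recognition.start();
}
```

The interim/final split is what makes live captions feel responsive: partial guesses are shown immediately and silently corrected once the recognizer commits to a final transcript.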
Closed captioning in Slides can help audience members like Laura who are deaf or hard of hearing, but it can also be useful for audience members without hearing loss who are listening in noisy auditoriums or rooms with poor sound systems. Closed captioning can also help when the presenter is speaking a non-native language or is not projecting their voice. The fact that the feature was built primarily for accessibility purposes but is also helpful to all users shows the value of incorporating accessibility into product design for everyone.
You might think that the experiences we had growing up are the reasons we were inspired to work on accessibility at Google. That’s partly true. But we really got into this work for its potential to improve the lives of people with disabilities, for the interesting technologies and design constraints, and because of our desire to use our skills to make the world a better place. We’re excited to contribute to that effort with closed captions in Google Slides, and we’re eager to share it with you. Visit our help center to learn more.