Building the inclusive workplace we imagine, together

Today marks Global Accessibility Awareness Day, one of the days we get to celebrate the progress our customers and partners have made to make their workplaces more inclusive, and then look ahead to what more we can do as a community to empower everyone in the workplace.

We celebrate because our customers are empowering their employees both with the accessible technology built into Microsoft 365, and with the inclusive cultural practices that make people love coming to work. We are comparing notes and learning from them as we also build that same inclusive culture at Microsoft.

Here in the U.K., we recently celebrated a milestone on our journey: recognition from the U.K. Government as a Disability Confident Leader. This status is awarded to organizations that commit to diversity and inclusion and encourage their suppliers and vendors to do the same. Our team has worked tirelessly to put processes in place that both create and sustain a diverse and inclusive culture: attracting and recruiting people with disabilities via our global Inclusive Hiring Program; training managers to understand the needs of those with visible and non-visible disabilities; assessing people for roles more flexibly so those with disabilities have the best opportunity to show their skills; adjusting workplaces to include sign language interpreters; and ensuring all staff have access to disability equality awareness training.

Looking ahead

We also look ahead to our big vision—an accessible and inclusive workplace for everyone—and what more we can do as a company and as a community to make it a reality. Today we’re excited to announce that live captions and subtitles in PowerPoint are rolling out now, and will soon be generally available to Microsoft 365 and Office 365 subscribers worldwide for Windows, Mac, and the web. We also look forward to the coming release of other new inclusive technologies built into Microsoft 365, like live captions and subtitles in Teams Meetings.

Present inclusively with live captions and subtitles in PowerPoint—We know how powerful a great presentation can be—whether it inspires us or aligns us to a common goal. Now, with support for 12 spoken languages and 60+ on-screen caption and subtitle languages, people who are deaf or hard of hearing can be included in these important team building moments. Additionally, with an increasingly global and remote set of collaborators, those who speak a different language from the presenter, or are listening in from a loud environment, can also more easily be included.

Transform the meeting experience for people with disabilities—We also know the critical role meetings play in how we work, and recently announced that live captions and subtitles will also be available in Teams Meetings. These capabilities are coming soon as a preview in English and complement the captioning and transcription features already generally available for recorded Teams meetings and live events in Stream, Teams, and Yammer. Whether in a 1-1 with your manager, or a company-wide all hands, everyone should feel included when the team gets together to meet, including people who are deaf or hard of hearing.

We constantly release new features and improvements to make our products not just compliant with the latest standards, but empowering for all users, with and without disabilities. We encourage you to read all about these features in the Microsoft Accessibility Features Sway.

Building the inclusive workplace together

Many of our customers are committed to making this vision of an inclusive workplace a reality and are partnering with us to make it happen. Last month the Federal Government of Canada chose Microsoft as a partner in their effort to create a more modern and accessible Public Service. The Honourable Carla Qualtrough, Minister of Public Services and Procurement and Accessibility, said, “Equipping our public servants with accessible, reliable, and innovative technologies will unleash the potential of our world-class public service and result in better service delivery for all Canadians.” Here at Microsoft we agree—only when we represent the diversity we see in the world internally can we build the most innovative technology and serve our customers as they ought to be served.

We also see that Rogers Communications, a leading Canadian communications and media company, shares our vision of a more inclusive workplace. Rogers is doing everything from transforming their physical workplace to be more collaborative and inclusive, to using the accessible technologies built into the Microsoft 365 applications their employees can use every day. Best of all, we’re helping and learning from each other along the way—our teams work regularly with Rogers to understand how our technology can better support their commitment to building an inclusive workplace, and Rogers’ Persons with Disability Diversity group works with us to learn how we embed Inclusive Design principles into our products and our culture. Read the full blog, published today, to learn more about how Rogers is building an accessible and inclusive culture to benefit employees, customers, and the broader community.

Join us!

We have so much more to do—as an organization, an employer, a leader, and a follower—in this journey towards an accessible and inclusive workplace, and we hope you’ll join us. Visit the Microsoft Accessibility site to learn more about our approach. Share your learnings with #LearningTogether and #GAAD and continue the conversation with @MSFTEnable on Twitter.


10 ways Microsoft tools can help you build a classroom that works for every student

In today’s classroom, diversity is the new normal. Teachers don their superhero capes every day, going to extraordinary lengths to reach every one of their students, from creating inclusive curriculum in core subjects like reading, writing, and math, to enabling every student to have a voice. We’re honoring their work, and highlighting some tools to help, in this month’s episode of What’s New in EDU.

At Microsoft Education, we work to support teachers in their mission to create an inclusive classroom for all students.  Here are 10 ways our tools support learning across unique needs and abilities.

  1. Understand word meanings more easily and improve vocabulary

Seeing a word and attaching meaning to it involves a number of cognitive processes. We’re trying to support students learning to make those connections with Picture Dictionary and Read Aloud in Immersive Reader. Select a word and Picture Dictionary will show you a descriptive image, even providing multiple images for words with more than one meaning. Read Aloud connects the text to students with visual impairments and helps with pronunciation practice. Providing visual and audio inputs gives all students, and especially students with dyslexia, the multi-sensory experiences they need to ingrain that word into their vocabulary.

Try this: Next time you give a vocabulary quiz, try providing the list of vocabulary words in OneNote. Show students that they can click Immersive Reader, then click the vocabulary word to see a picture of what the word means and have it read aloud.

  2. Make it easier to focus on reading

With the multitude of media that surrounds students, it’s not always easy to prevent distractions online and across devices. Immersive Reader’s flexible text sizing, line focus, and background color options make any document, notebook, or web page focus-friendly. This is particularly helpful for students with ADD and ADHD, as well as for students with dyslexia.

Try this: Next time you assign reading to be done from a device, show students how to select Immersive Reader in OneNote, make the font bigger, and select line focus mode. Learn more about Learning Tools like Immersive Reader!

  3. Improve pronunciation of longer words

We know a time-tested tactic is breaking up words into syllables and sounding them out. Now, students have a tool that will do so automatically, helping them to nail the pronunciation. Students can even check their pronunciation by selecting Read Aloud and seeing how close they were. This is particularly helpful for students with dyslexia who often have trouble matching letters to sounds.

Try this: Next time you assign reading to be done at home, instruct students to break the words into syllables in Immersive Reader or, if they can’t remember how to pronounce them, to use Read Aloud.  

  4. Understand grammar and sentence structure more quickly

Understanding parts of speech is critical for developing reading fluency. Immersive Reader can help by labeling or highlighting nouns, verbs, adjectives, and adverbs. This supports all students, especially those with dyslexia, as they develop their ability to find patterns in words.

Try this: Next time you assign grammar practice, let students know they can check their work by selecting the Parts of Speech toggles in Immersive Reader.

  5. Empower students to improve the quality of their writing

When you spend a long time writing, you want to make sure the final work is polished. Read Aloud in Immersive Reader allows you to have the document you’ve written read out loud, so you can more easily catch mistakes. Editor in Word helps students identify misspellings, provides synonyms for those misspelled words, and offers the option to have the suggested spelling correction and synonyms read aloud. This all helps students with dysgraphia who have a hard time reviewing their own written work.

Try this: During the editing and revising process, encourage students to use Read Aloud to listen to their work read back to them. This will help them identify revisions and improve their writing!

  6. Make it easier to start writing, and kick writer’s block

We’ve all stared down an empty page in fear wondering how we’re going to fill it with beautiful writing. With Dictate, in OneNote and Word, students can have their speech turned to on-screen text. This is especially helpful for students with dysgraphia who struggle with writing.

Try this: When students are having trouble getting started, encourage them to turn on Dictate, then brainstorm out loud. Just getting some ideas and words on the page will build momentum and help them conquer the blank page! Check out more ideas for utilizing Dictate in the classroom!

  7. Break down the language barrier

Students can use all the same tools above when they learn their first language—and then when they learn a second language! With document and word translation in Immersive Reader, you could start with a text in Spanish and translate either individual words or the entire document into English. This is helpful for students with dyslexia who are learning new languages, and for ESL learners, who can match the words they know in their first language with their second language more easily than ever before using sounds, pictures and text.

Try this: When you assign passages for reading, put a copy in OneNote and show students they can translate either by word or document in Immersive Reader.

  8. Help students read, understand steps, and show their work in math

Math is all about showing your thought process and the steps you took to get to the answer. Math Solver shows students the steps to solve a math problem, giving a clear model for how to show your work. The Immersive Reader can also read the math equation notation, as well as the step-by-step instructions in Math Solver, aloud for students. This helps students with dyscalculia break down math problems and learn what to do with similar problems next time.

Try this: If a student is having trouble with a particular type of problem, encourage them to use the Math Solver to insert the steps into their OneNote page. They can reference the steps as they work on similar problems, helping them follow the same solution process but applying it to new equations.

  9. Present to students, parents, and your colleagues inclusively

When you give a presentation to students, parents, or other teachers (or when teaching students to present), make sure to turn on live captions and subtitles in PowerPoint. Live captions help students with hearing impairments, or those who speak other languages outside the classroom, to follow along with the presentation.

Meeting remotely? Connect with parents or colleagues online in a Teams meeting, and turn on live captions to make sure no one misses a moment, whether it’s a global PLC meeting or an online parent conference.

Try this: Use PowerPoint live captions and subtitles during your next parent-teacher conference. Those rooms get packed, and parents will appreciate being able to see captions. They can even download the Microsoft Translator app and translate the captions into the language they use most often.

  10. Build student empathy with Minecraft: Education Edition

Minecraft: Education Edition offers several features that support inclusive learning, from classroom multiplayer for better collaboration, to customizable game settings including a text-to-speech user interface. As New York City special educator and STEM coach Sean Arnold writes in this EdSurge article, “chat features are enabled with speech-to-text functionality, which lets struggling readers and writers participate with the community at their own pace.” Minecraft: Education Edition gives students with physical and intellectual disabilities the opportunity to be creative, explore without fear of failure, and feel a sense of autonomy in the classroom. Arnold explains, “my students were no longer confined to wheelchairs or leg braces; they could walk, create and even fly. It’s a world where they are free from ridicule, free from their real-world struggles and free to create a world that they desire.”

We know that better student outcomes, teacher time, school budgets, and IT staff workloads are top of mind for every school district and school leader. That’s why we partnered with Forrester Consulting to do a total economic impact analysis around Microsoft assistive technologies for education. Informed by interviews across four Microsoft 365 (M365) districts using our accessibility tools, the findings pointed to three key benefits: improved student learning, reduced cost and effort, and saved time and increased effectiveness.

This report is available to download and share in your district. We also have a deeper dive into the data available on our Tech Community blog. With the tools built into the M365 accessible platform, you can help improve learning outcomes for every student while also saving real dollars in your school budget.

Eager to explore Microsoft accessibility tools in your own classroom? Get started with Office 365 Education for free!

And don’t miss next Tuesday’s #MSFTEduChat TweetMeet, where we’ll be discussing inclusive classrooms and accessibility with a global community of educators.


Noise-cancelling headphones, smart glasses: how technology is making museums more accessible

Museums are places for people to immerse themselves in culture, as well as learn, create, share and interact.

Being accessible — designed for everyone — is one way museums can maximize that role, and a growing number are working hard to do just that to serve the more than one billion people worldwide who experience some form of disability.

Here is how technology is helping museums get closer to the communities they serve.

Noise-cancelling headphones

We don’t all experience the world in the same way — everyone is different. People with autism, for example, may find certain situations cause a sensory overload.

New York’s Intrepid Sea, Air & Space Museum offers noise-cancelling headphones for people who might have auditory over-stimulation. This museum also helps parents of children with sensory processing disabilities plan their visits by emailing them images and illustrations in advance.

Museums in Chicago (including the Shedd Aquarium, the Field Museum and the Chicago Children’s Museum) also help visitors plan their trips through an app that highlights exhibitions that are sensory friendly.

[Subscribe to Microsoft On The Issues for more on the topics that matter most.]

Audio descriptions

Statue at El Prado Museum

Tactile displays and audio descriptions can help bring museum experiences to life.

The Smithsonian museums in Washington, D.C., are giving visitors who are blind or with low vision a rich and rewarding experience through their smartphones or smart glasses. Using a video-streaming service, users are connected to an “agent” who provides a bespoke, detailed description of their surroundings.

The use of Braille descriptions has become increasingly common in museums around the world, and one Spanish institution has improved upon that. Madrid’s Prado Museum has made parts of its collection tactile, allowing visitors to be hands-on with the exhibitions.

The Louvre in Paris, and the Guggenheim Museum and the Metropolitan Museum of Art in New York, have all established tactile tours, where visitors can touch the art on display or touch casts of well-known works.

Hearing loops

Field Museum of Natural History

Tools such as hearing loops — also known as audio induction loops — use wireless signals to transmit audio directly to someone’s hearing aid and can be used in a variety of settings, including museum exhibitions. The Met in New York is just one example of this.

Another New York museum, the Whitney Museum of American Art, has been trying something different. It has developed a series of vlogs, or video blogs, with messages, explanations and exhibition information in sign language.

As well as opening up the museum’s content to visitors with hearing loss and deafness, the museum, on its website, says it hopes to “create a communications laboratory to expand the ASL vocabulary of contemporary art terms,” referring to American Sign Language.

The Dutch Rijksmuseum believes everyone should be able to access information on the art in their own language. It recently launched a video tour in Dutch Sign Language integrated into its app. The tour was developed in close collaboration with, and by, deaf entrepreneurs.

Immersive experiences

Rocket Garden at Kennedy Space Center

A few years ago, the Pokémon Go craze took off, introducing many people to the possibilities of augmented reality. By creating immersive experiences, AR and other technologies are being used to reimagine the way visitors relate to museums and historic sites.

You can take an AR tour of Pompeii, where a headset will put you right in the heart of the vibrant Roman city that was destroyed by the catastrophic eruption of Mount Vesuvius in A.D. 79.

Visitors to Bone Hall, in the Smithsonian National Museum of Natural History in Washington, D.C., meanwhile, can use AR to view the exhibits in a new light, seeing the skeletons appear as living creatures.

The Petersen Automotive Museum in Los Angeles is using technology to bring cars from Hollywood to life with a mixed reality exhibition using Microsoft’s HoloLens technology. The “Worlds Reimagined” experience explores classic and futuristic cars from films and video games, including “Back to the Future” and the video game franchise “Halo.”

Other museums are using this technology to bring new experiences to their patrons, including the Intrepid Sea, Air and Space Museum in New York with “Defying Gravity,” and the Museum of Flight’s mobile VR experiences in Washington state. The Musée des Plans-Reliefs in Paris used AI to create a digital twin of the historic Mont-Saint-Michel, which had to be captured from every angle.

In 1962, President John F. Kennedy captured the Space Race zeitgeist, when he said: “We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard.” The Kennedy Space Center in Florida uses immersive technologies to recapture that energy, excitement and enthusiasm. At its “Heroes & Legends” exhibition, visitors can experience spacewalks, look inside space capsules and feel close to the action.

By bringing the past to life in a way that adds richness and depth, and, of course, accessibility, technology is helping museums reach a wider audience.

For more on these innovations and on accessibility initiatives at Microsoft, visit Microsoft On The Issues and follow @MSFTIssues on Twitter.


Inclusive classrooms and accessibility—join the global #MSFTEduChat TweetMeet on May 21

Announcing the May 21 TweetMeet on ‘Inclusive classrooms and accessibility.’

Change starts with awareness. The third Thursday of May is Global Accessibility Awareness Day (GAAD). The purpose of GAAD is to get everyone talking, thinking and learning about people with different abilities and their accessibility and inclusion in a digital world.

Our mission is to empower every student on the planet to achieve more, which stems from the belief that every student deserves the opportunity to fulfill their potential.

In this edition of our monthly global and multilingual Twitter conversations, we’ll discuss ways in which educators around the world make inclusion and accessibility an integrated part of their classrooms.

Keep reading for detailed information about this TweetMeet.

Language tracks and SuperSway

We offer seven simultaneous language tracks this month: English, French, Spanish, Italian, Polish, Swedish and Vietnamese. The new SuperSway offers a TweetMeet Invitation in each of these languages.

For each language track, we have one or more hosts to post the translated questions and respond to educators. As always, we’re super grateful to all current and former hosts who are collaborating closely to provide this service.

The #TweetMeetXX hashtags for non-English languages should be used together with #MSFTEduChat so that everyone can find the conversations in their own language. For example, Spanish-speaking participants should use both #TweetMeetES and #MSFTEduChat. English-speaking educators may use #MSFTEduChat on its own.

TweetMeet Fan? Show it off on your Twitter profile!

Every month more and more people discover the unique flow and characteristics of the TweetMeet events and become excited to participate.

Show your passion for the TweetMeets right from your own Twitter page by uploading this month’s #MSFTEduChat Twitter Header Photo to the top of your own Twitter profile.

In the same file folder, the Twitter Header Photo is available in many other languages and time zones.

Looking back on the April TweetMeet on ‘Teaching Happiness’

The April #MSFTEduChat TweetMeet inspired educators around the world to share ideas, insights and resources. We captured highlights from this Twitter conversation in this @MicrosoftEDU Twitter Moment.

Why join the #MSFTEduChat TweetMeets?

TweetMeets are monthly recurring Twitter conversations about themes relevant to educators, facilitated by Microsoft Education. The purpose of these events is to help professionals in education to learn from each other and inspire their students while they are preparing for their future. The TweetMeets also nurture personal learning networks among educators from across the globe.

We’re grateful to have a support group made up exclusively of former TweetMeet hosts, who volunteer to translate communication and check the quality of our questions and promotional materials. They also help identify the best candidates for future events, provide relevant resources, promote the events among their networks and, in general, cheer everybody on.

When and how can I join?

Join us Tuesday, May 21 from 10 a.m. to 11 a.m. PDT on Twitter using the hashtags #MSFTEduChat, #inclusion, #accessibility and #MicrosoftEDU (which you can always use to stay in touch with us). Be sure to double-check your own local event time. You can find the event time for 215 countries with this time zone announcer.

Our next recommendation is to set up the Twitter dashboard TweetDeck and add a column for the hashtag #MSFTEduChat. If you are new to TweetDeck, check out this brief TweetDeck video tutorial by Marjolein Hoekstra.

When a tweet appears that you want to respond to, press the retweet button and type your comment. The great news is that Twitter now supports adding images, animated GIFs and videos to your comment retweets.

Additional tips are offered in this animated GIF that you’re most welcome to share with newcomers:

Too busy to join at event time? No problem!

From our monthly surveys we know that you may be in class at event time, busy doing other things, or even asleep. No problem! All educators are welcome to join any time after the event. Simply look at the questions below and respond at a day and time that suits you best.

You can also schedule your tweets in advance. In that case, be sure to include the entire question in your tweet and mention the hashtag #MSFTEduChat so that everyone knows which question and conversation you are responding to.

The exact question timings are in this helpful graphic:

Resources to help prepare for the TweetMeet

Microsoft Education offers a wide range of tools, professional-development courses and learning paths about inclusion and accessibility. These resources are tailored for educators and they are all free. Good places to start are:

Microsoft Inclusive Ultimate portal

Microsoft Accessibility overview in Sway format, live-embedded:

Wakelet is a useful web service to bookmark, curate and annotate resources, images, tweets and other content. Mike Tholfsen just created this Wakelet Collection as a handy reference. It currently has 40+ pointers:

Inclusive Classrooms Wakelet Collection, live-embedded:

Discussion Questions

A great way to prepare for the TweetMeet is by taking a close look at the discussion questions. Watch the animated GIF with all the questions:


Meet the 14 hosts for this month’s TweetMeet! They are all passionate about #inclusion and #accessibility and very eager to engage with you.

Check out all the hosts, see what they are tweeting about and consider following them:

List of host names and their profiles

  • Catherine Dourmousi @CatDourmousi (EFL teacher, Hellenic American Union examiner for Michigan University English-language exams, Microsoft Certified Educator, author, supporter of empathy, mindfulness, and growth mindset in teaching – Athens, Greece)
  • Elisabetta Nanni @Bettananni (Music teacher and teacher trainer about ICT, MIE Expert, eTwinning Ambassador with expertise in Microsoft Learning Tools – Trento, Italy)
  • Elsbeth Seymour @TeachinEls (MIE Expert, Secondary Special Ed Teacher, using a passion for tech & gaming to connect, support and facilitate learning for neurodivergent students – California, USA)
  • Fabrice Marrou @FabMarrou (French and History teacher in a vocational school, former Microsoft Learning Consultant, ICT trainer – Perpignan, France)
  • Huong Quynh @Quynhth9 (EFL teacher, teacher trainer, passion for exploring ICT in language education – Hanoi, Vietnam)
  • Iwona Cugier @icugier (Teacher and trainer with a focus on ICT and programming, passionate about digitalization in education – Leszno, Poland)
  • Joe Brazier @ManvDadHood (Former Special Educator and EdTech Integration Trainer, Business Strategy Lead at Microsoft focused on the K12 Modern Classroom Experience and Inclusive Education – Kirkland WA, USA)
  • José Carlos Sancho @72Joseca (History teacher, MIE Expert, teacher trainer, ICT coordinator at FEC (Fundación Educación Católica) and passionate about Microsoft Learning Tools – Zaragoza, Spain)
  • Kelli Suding @ksuding (Indiana statewide PATINS specialist of autism, Specific Learning Disabilities (SLD), Chrome accessibility, Accessible Educational Materials (AEM) & assistive technology integration – Indianapolis IN, USA)
  • Martin Howe @Martin_Howe (Teacher with passion for helping students with special needs to develop and reach their goals, preferably using digital tools – Borlänge, Sweden)
  • Mike Marotta @mmatp (Inclusive-tech evangelist, 2017 ISTE Inclusive Learning Network Outstanding Educator, Raspberry Pi Certified Educator, co-moderator of the #ATchat weekly Twitter chat – New Jersey, USA)
  • Rachel Berger @rachelmberger (Decoding Dyslexia Minnesota President, educational advocate for students with LD, accessibility & AT evangelist, Microsoft Learning Tools specialist, company founder of I Am Dyslexia  – Minneapolis, MN USA)
  • Shelley Ardis @Shelleypa (Technology Director at Florida School for the Deaf and the Blind; previously, a statewide consultant supporting schools serving Deaf students – St. Augustine, Florida, USA)
  • Tiffany Thompson @digischolars (Senior Instructional Technology Specialist, Microsoft Master Trainer, Accessibility, Diversity & Inclusion Consultant, Surface Expert, Online Workshop Facilitator – Brooklyn MD, USA)


Our hosts are thrilled for the upcoming TweetMeet. Each of them wants to invite you to the event in their own way.

Next month’s event: Microsoft Teams

The theme of next month’s TweetMeet on June 18 will be Microsoft Teams. We’re looking forward to this event and hope you’ll spread the word!

What are #MSFTEduChat TweetMeets?

Every month Microsoft Education organizes social events on Twitter targeted at educators globally. The hashtag we use is #MSFTEduChat. A team of topic specialists and international MIE Expert teachers prepare and host these TweetMeets together. Our team of educator hosts first crafts several questions around a certain topic. Then, before the event, they share these questions on social media. Combined with a range of resources, a blog post and background information about the events, this allows all participants to prepare fully. Afterwards we make an archive available of the most notable tweets and resources shared during the event.

TweetChat expert Madalyn Sklar recently published this helpful introductory guide:
Your Complete Guide to Twitter Chats: Why You Should Join & How to Make the Most of It

Please connect with TweetMeet organizer Marjolein Hoekstra @OneNoteC on Twitter if you have any questions about TweetMeets or helping out as a host.


5 ways tech is changing how people with disabilities experience the world

For the past eight years, Microsoft has brought together people from different parts of the company at the Ability Summit. This year’s gathering is taking place May 30 to 31 at Microsoft’s headquarters in Redmond, Wash.

The summit, designed to empower all people, including the more than one billion people with disabilities worldwide, has been a place for accessibility innovation. In 2015, it helped give rise to the Xbox Adaptive Controller.

Here’s a closer look at the controller and other recent technologies with inclusive design. 

Seeing AI 

Artificial intelligence is bringing descriptive detail to people in the form of an app: Microsoft’s Seeing AI.

Seeing AI is designed for people who are blind or with low vision. It augments the world around the user with audio descriptions, and it reads short bursts of text and scans product barcodes. Documents can be photographed and their content read back. Seeing AI also scans and reads handwritten notes.

[Like this article? Subscribe to Microsoft on The Issues for more on the topics that matter most.]


[embedded content]

Soundscape

Microsoft Soundscape goes beyond immediate proximity to build a 3D sound map of the user’s world.

It uses data and sound to add layers of information and context. In short, it helps users feel more comfortable when making their way around. Landmarks, road intersections and the places regularly visited can all be allocated a sound beacon so they can be clearly detected upon approach. 

Soundscape’s synthesized binaural audio adds realism to directions, taking the map on a user’s phone and, effectively, creating an audio version.    

Translation and captioning 

Douglas Adams’s “The Hitchhiker’s Guide to the Galaxy” described the Babel Fish. These clever creatures, once inserted into someone’s ear, would translate any language.

Being able to participate in multilingual conversations is no longer the preserve of fiction. 

Microsoft Translator acts as a real-time translation hub, sitting between people speaking different languages and translating on the fly. It can do this even when multiple languages are being spoken at the same time. It can also be set from English to English to provide real-time captioning for people who are deaf or hard of hearing.

Gaming gets serious 

The global gaming market is a multibillion-dollar industry. To help make gaming more accessible, Microsoft introduced the Xbox Adaptive Controller, a product of the company’s Hackathon in 2015. This controller has large, programmable buttons and can be connected to a range of external devices. The combination of large, customizable switches, buttons, mounts and joysticks empowers all gamers.

Code you can hold in your hands 


Microsoft Code Jumper is the physical manifestation of a programming language that’s helping children who are blind or have low vision learn to code. Code Jumper is made of a series of programmable, tactile plastic switches, or pods. Each pod is an instruction, and pods can be joined together to create a line of code.

It means all children studying coding as part of their school curriculum can benefit. Children can learn about sequence, iteration, selection and variables. And they learn how to solve a problem by thinking algorithmically: breaking a process down into its constituent parts and looking for different routes to the best solution.
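To make the sequence-and-iteration idea concrete, here is a tiny hypothetical interpreter for pod programs, in Python. The tuple format and the "play"/"loop" pod types are invented for illustration; they are not Code Jumper’s actual representation:

```python
# Hypothetical pod program: each pod holds one instruction; a "loop" pod
# repeats the pods nested under it, mirroring how children snap pods
# together to explore sequence and iteration.
def run(pods):
    notes = []
    for pod in pods:
        if pod[0] == "play":        # sequence: one sound per pod
            notes.append(pod[1])
        elif pod[0] == "loop":      # iteration: repeat the nested pods
            for _ in range(pod[1]):
                notes.extend(run(pod[2]))
    return notes

song = [("play", "C"), ("loop", 2, [("play", "E"), ("play", "G")])]
print(run(song))  # ['C', 'E', 'G', 'E', 'G']
```

Swapping, adding, or nesting pods changes the output in a way a child can hear immediately, which is the pedagogical point.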

For more on these innovations and accessibility initiatives at Microsoft, visit and follow @MSFTIssues on Twitter.  

VA and Microsoft partner to enhance care, rehabilitation and recreation for Veterans with limited mobility

Xbox Adaptive Controllers will be distributed across facilities within the nation’s largest integrated health-care system

WASHINGTON — April 30, 2019 — Today, the U.S. Department of Veterans Affairs (VA) and Microsoft Corp. announced a new collaboration to enhance opportunities for education, recreation and therapy for Veterans with mobility limitations by introducing the Xbox Adaptive Controller, a video game controller designed for people with limited mobility, in select VA rehabilitation centers around the country.

The partnership, which was formalized April 18, will provide controllers and services to Veterans as part of therapeutic and rehabilitative activities aimed at challenging muscle activation and hand-eye coordination, and at encouraging greater participation in social and recreational activities.

“This partnership is another step toward achieving VA’s strategic goals of providing excellent customer experiences and business transformation,” said VA Secretary Robert Wilkie. “VA remains committed to offering solutions for Veterans’ daily life challenges.”

Together, VA and Microsoft identified an opportunity to introduce or reintroduce gaming to Veterans with spinal cord injuries, amputations, and neurological or other injuries at 22 VA medical centers across the United States. Microsoft is donating its Xbox Adaptive Controller, game consoles, games and other adaptive gaming equipment as part of the collaboration.

Designated VA staff will engage with Veterans using the equipment and share feedback with Microsoft on therapeutic utility and the Veteran experience.

“We owe so much to the service and sacrifice of our Veterans, and as a company, we are committed to supporting them,” said Satya Nadella, CEO of Microsoft. “Our Xbox Adaptive Controller was designed to make gaming more accessible to millions of people worldwide, and we’re partnering with the U.S. Department of Veterans Affairs to bring the device to Veterans with limited mobility, connecting them to the games they love and the people they want to play with.”

Microsoft and VA have a long-standing strategic partnership, working together for more than 20 years to provide the best possible care and service to Veterans. Gaming is a popular pastime of military personnel, and access to the Xbox Adaptive Controller in VA rehabilitation centers provides the opportunity for Veterans to experience gaming’s various benefits, including staying connected with friends and family across the world, building esprit de corps through competitive or cooperative gameplay, and providing stress relief.

Microsoft’s initial contributions will be allocated across 22 VA facilities. In addition, the controllers and other equipment will be available for Veterans to use at events hosted by VA’s Office of National Veterans Sports Programs and Special Events, such as the National Veterans Wheelchair Games.

The following 16 centers have confirmed participation to date, with at least six additional centers to come: Augusta VA Medical Center (VAMC), Central Alabama VA Health Care System (HCS), Central Texas Veterans HCS, Chillicothe VAMC, Dayton VAMC, Memphis VAMC, Minneapolis VA HCS, Richmond VAMC, VA St. Louis HCS, South Texas Veterans HCS (Audie L. Murphy VA Hospital), South Texas Veterans HCS (Kerrville Division), James A Haley Veterans Hospital – Tampa, VA Eastern Colorado HCS, VA New York Harbor HCS, VA Palo Alto HCS, and VA Puget Sound HCS.

More information on partnering with VA can be found at

Media assets can be found here.

About the U.S. Department of Veterans Affairs Veterans Health Administration

The Veterans Health Administration (VHA) is the largest integrated health care system in the United States, providing care at 1,250 healthcare facilities, including 172 VA Medical Centers and 1,069 outpatient sites of care of varying complexity (VHA outpatient clinics) to over 9 million Veterans enrolled in the VA health care program.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:

Assembly for Xbox, (206) 223-1606,

U.S. Department of Veterans Affairs, Office of Public Affairs Media Relations, (202) 461-7600,

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at Web links, telephone numbers and titles were correct at time of publication, but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at


New Garage project bakes accessibility into game development

At Game Developers Conference 2019, we shared an early peek at Responsive Spatial Audio for Immersive Gaming, a Microsoft Garage project. The Unity plug-in helps developers infuse accessibility into games by making it easy to annotate game objects with descriptive text and present it to players through interactive audio cues. The project is now available worldwide in the Unity Store.

Baking accessibility into game development

A number of hackers have joined the cause to make games more accessible. For example, Ear Hockey, a Microsoft Garage project, is a game designed around the blind and low vision community, and the Xbox Adaptive Controller, a Hackathon project turned Garage Wall of Famer, is a game controller designed for gamers of all abilities. The Garage project team members who developed Responsive Spatial Audio are taking a different approach, focusing on the game developer by baking accessibility right into an easy, drag-and-drop interaction toolkit.
With Responsive Spatial Audio, game developers can tag 3D objects with descriptive text, and the experience captures these tags and spatial coordinates to help players navigate. As players traverse through the game world and encounter tagged objects and designated points of interest, they are guided by audio cues via a built-in, text-to-speech API. An accessible FPS controller presents relevant descriptions at the right time by monitoring player movement, scanning their surroundings for metadata, and cuing spatial audio guidance for objects in the frame of view.
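A minimal sketch of that frustum check, written in Python rather than the plug-in’s Unity environment, with illustrative names and a 2D simplification (the real plug-in works on tagged 3D game objects):

```python
import math

def objects_in_view(player_pos, facing_deg, tagged, max_dist=10.0, arc_deg=60.0):
    """Return descriptions of tagged objects inside the player's viewing frustum.

    `tagged` maps a description string to an (x, y) world position. Objects
    qualify when they are within `max_dist` of the player and within
    `arc_deg` of the facing direction.
    """
    visible = []
    for description, (x, y) in tagged.items():
        dx, dy = x - player_pos[0], y - player_pos[1]
        if math.hypot(dx, dy) > max_dist:       # outside the frustum length
            continue
        angle = math.degrees(math.atan2(dx, dy))          # bearing to object
        offset = (angle - facing_deg + 180.0) % 360.0 - 180.0
        if abs(offset) <= arc_deg / 2:          # inside the view arc
            visible.append(description)
    return visible
```

In the plug-in, each description returned this way would be handed to the text-to-speech API and rendered as a spatial audio cue at the object’s position.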

Key features to provide a more accessible experience

Responsive Spatial Audio offers a number of features that make prioritizing accessibility easy.

Accessible FPS Controller – Convey object descriptions within the player’s frame of view via audio cues, and adjust the viewing frustum length and arc

Annotate Game Objects – Tag and manage objects with descriptive text—tag once and descriptions appear everywhere the object does

Vantage Point Objects – Add and manage vantage points, or invisible doorframe-like points of interest that convey a whole view (as opposed to objects within the viewing frustum). Present different descriptions based on the direction the player is facing

Accessible Navigation – Aid player navigation with a suite of interaction tools, including:

Guide players to a selected object via a navigation agent with an orientation and spatial beacon
Add a script to guide players to nearby vantage points with auditory beeps
Enable bump noises with custom sounds that play spatial audio upon collision, intelligently based on the orientation of the player
Change background audio based on the location of the player
Indicate the global north and south of the game with spatial sound
Inventory UI – Leverage an optional in-box inventory UI to easily manage a library of game objects
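Vantage points that present different descriptions depending on which way the player faces can be sketched as a simple direction lookup. The compass-sector names and 90-degree quantization below are invented for illustration and are not the plug-in’s actual API:

```python
def vantage_description(facing_deg, views):
    """Pick a vantage-point description based on the player's facing direction.

    `views` maps compass sector names ("north", "east", "south", "west")
    to description strings; an empty string is returned if no description
    is registered for the sector the player faces.
    """
    sectors = ["north", "east", "south", "west"]
    # Quantize the heading into one of four 90-degree sectors centered
    # on the compass directions.
    index = int(((facing_deg % 360.0) + 45.0) // 90.0) % 4
    return views.get(sectors[index], "")
```

A player standing in a vantage point and turning in place would hear, for example, “a long hallway” when facing one way and “the armory door” when facing another.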
To see how you can incorporate Responsive Spatial Audio into your games, see the project in action in a demo accompanying the plugin in the Unity Store.

One step closer to seamless, accessible development

We sat down with Brannon Zahand and Evelyn Thomas, both Senior Program Managers in Accessibility R&D who champion accessibility in the gaming space, to hear their reflections on the project. “The idea that I can drag and drop this into a game, with very little work to implement it, is a game changer for the industry,” shared Brannon. Evelyn attended GDC 2019 to talk to developers about best practices in accessibility, highlighting the project at a conference talk and at Microsoft’s accessibility booth.

“The idea that I can drag and drop this into a game, with very little work to implement it, is a game changer for the industry.”

Responsive Spatial Audio was developed by Manohar Swaminathan, a Senior Researcher in Microsoft Research, based in Bangalore, India. Manohar has been working in graphics for years, but found a passion for accessibility while working on CodeTalk, a solution that empowers developers in the blind and low vision community to do more with Visual Studio. He was searching for ways to do more impactful work in India when he met and teamed up with former Research Fellow Venkatesh Potluri, a blind developer who was interested in enhancing his productivity. After releasing CodeTalk, Manohar was inspired to combine his background with games and VR to make the gaming space more accessible through audio. “We thought, ‘Can we use rich, spatial audio content to replace the visual information that is missing?’ and decided to give it a shot,” he shared. It’s Manohar’s hope that plug-and-play tools will inspire developers to create fun and inclusive game experiences accessible to all.

Try It Out

Responsive Spatial Audio and a demo are now available worldwide in the Unity Store. The team looks forward to hearing feedback via UserVoice.

Advancing accessibility on the web, in virtual reality and in the classroom


At the ACM CHI Conference on Human Factors in Computing Systems in Glasgow, Scotland, this May, researchers from Microsoft’s Redmond and UK labs, together with our university collaborators, will be presenting several papers and demos that explore how to design technologies more inclusively, to support accessibility for users with cognitive and/or sensory disabilities.

Microsoft researchers Adam Fourney, Kevin Larson, and I teamed up with University of Washington researchers Qisheng Li and Katharina Reinecke to explore the accessibility of the Web to people with dyslexia. Dyslexia is a cognitive disability estimated to affect about 15% of English-speaking adults; people with dyslexia can experience varying degrees of difficulty with reading-related tasks. Because access to information on the Web is a key modern literacy skill, ensuring that online information is cognitively accessible is an important concern; beyond people with dyslexia, improving cognitive access to the Web may benefit other groups who experience reading challenges, such as English language learners or children.

At CHI 2019, lead author and University of Washington graduate student Qisheng Li will present the Microsoft Research-UW team’s findings, summarized in their paper, “The Impact of Web Browser Reader Views on Reading Speed and User Experience.” The team explored whether the “reading mode” common in most modern browsers significantly impacted users’ reading speed and comprehension, and whether users with dyslexia specifically benefitted from this intervention. Using the “Lab in the Wild” infrastructure developed by Professor Reinecke, the team conducted an online study with 391 English-speaking adults (42 with dyslexia), in which participants read several popular webpages and answered associated reading-comprehension questions, some in the typical browser view and some in the reading mode.

A webpage in typical browser view (left), and in the reading mode (right).


As expected, people with dyslexia had substantially slower reading speeds than people without dyslexia; however, people with dyslexia did not seem to receive any differential benefit from the reading mode. Instead, the team found that reader view enhanced the reading speed of all users by about 5%, as compared to the default website view. However, the study also found that reader mode buttons are disabled by default, and that the rules governing the availability of reader mode are opaque to web developers. Only 41% of the 1,100 popular webpages sampled successfully enabled reader view. Our findings suggest that web page designers should develop their pages in a way that enables the reader mode button in major browsers, so that users have the option to reap this reading-speed benefit. Making it easier for web developers to intentionally enable the reading mode option, as well as exploring which particular aspects of the reader view transformations provide the most benefit, are key areas for future work.

Accessibility beyond the traditional desktop computing experience is also a focus of Microsoft Research’s contributions to CHI this year. Intern Yuhang Zhao, a graduate student at Cornell Tech, will present a paper summarizing joint research with Microsoft researchers Ed Cutrell, Christian Holz, Eyal Ofek, myself, and Andrew Wilson that explores how to enhance the accessibility of emerging virtual reality (VR) technologies: “SeeingVR: A Set of Tools to Make Virtual Reality More Accessible to People with Low Vision.” The team will also present a live demo during the conference’s demonstration session and at the Microsoft booth; blog readers can experience a demo by viewing the project’s online video figure.

Low vision (that is, visual disabilities that cannot be fully corrected by glasses) impacts 217 million people worldwide according to the World Health Organization. While desktop software offers some accommodation features for people with low vision (for example, screen magnifiers), VR systems have not yet grappled with the issue of accessibility for this audience. Indeed, when interviewing VR developers, the team found that none had received training or guidance on how to develop accessible VR experiences.

Because low vision encompasses a range of visual abilities (for example, tunnel vision, blind spots, brightness sensitivity, low visual acuity, and so on), the team took a toolkit approach—they developed SeeingVR, a set of 14 tools for Unity developers (Unity is one of the most widely-used VR development platforms). End-users can activate different combinations of these tools depending on their abilities and the context of the current application and task. Example tools include a magnifier and bifocal views, brightness and contrast adjustment for the scene, edge-enhancement to make virtual objects more salient from their backgrounds, depth measurement tools, and the ability to point at text or objects in a virtual scene to have them read or described aloud. The majority of these tools can be applied to existing Unity applications post-hoc, to support easy adoption.
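The brightness and contrast tool is a per-pixel transform over the rendered scene. The sketch below shows the classic form of that adjustment on grayscale values in [0, 1]; it is a minimal illustration, not SeeingVR’s implementation, which runs as a shader over the Unity scene:

```python
def adjust_pixels(pixels, brightness=0.0, contrast=1.0):
    """Apply a brightness/contrast transform to grayscale pixels in [0, 1].

    Contrast scales values around mid-gray (0.5), brightness shifts them,
    and results are clamped back into the valid [0, 1] range:
        out = (p - 0.5) * contrast + 0.5 + brightness
    """
    out = []
    for p in pixels:
        v = (p - 0.5) * contrast + 0.5 + brightness
        out.append(min(1.0, max(0.0, v)))  # clamp to displayable range
    return out
```

Doubling contrast, for instance, pushes a dark pixel at 0.25 down to 0.0 while leaving mid-gray untouched, which is what makes foreground objects pop against their backgrounds for users with low visual acuity.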

The 14 SeeingVR tools, overlaid individually upon a scene from the open source Unity game EscapeVR-HarryPotter; end-users can combine the individual tools as needed, based on their visual abilities.


Evaluation with 11 people with low vision completing a variety of tasks in VR (for example, menu selection, grasping objects, shooting moving targets) found that all participants could complete tasks more quickly and accurately when using SeeingVR tools as compared to the default VR experience. All participants chose different combinations of the available tools, reinforcing the value of allowing flexibility and customization of low vision accessibility options.

Microsoft Research researchers are also exploring non-visual representations of VR for people who are completely blind. Microsoft Soundscape is a smartphone application that uses spatial audio to deliver a rich, non-visual navigation experience. At the CHI 2019 workshop on “Hacking Blind Navigation” (co-organized by Principal Researcher Ed Cutrell), Microsoft Research intern and University of Washington student Anne Spencer Ross will present research on how to craft an audio-only VR experience that can allow people to rehearse a walking route virtually before experiencing the route in the physical world via Soundscape. Her paper, “Use Cases and Impact of Audio-Based Virtual Exploration” is a collaboration between engineers from the Soundscape team (Melanie Kneisel and Alex Fiannaca) and researchers in the Microsoft Research Redmond Lab (Ed Cutrell and myself). Melanie Kneisel will also be a featured speaker at the workshop.

In addition to presenting research on accessible Web browsing and accessible VR, researchers from Microsoft’s Cambridge, UK lab will be sharing a tangible toolkit to enhance the accessibility of computer science education for children who are blind. Led by researcher Cecily Morrison, the CodeJumper project is a physical programming language for teaching children ages 7 – 11 basic programming concepts and computational thinking regardless of level of vision. It was inspired by the need to provide a way for young blind and low vision students to access the computing curriculum inclusively alongside their sighted peers. Children plug together pods that represent lines of code in a program to create programs that when run make music, stories, or poetry. Children can start with very simple concepts—such as, a program is a sequence of commands—and progress to complicated program flows that utilise variables, covering the whole of the curriculum for this age band. It was successfully tested with 75 children and 30 teachers across the United Kingdom and found to support age-appropriate learning of coding as well as encouraging whole-child learning, such as creating friendships with sighted peers. The tangible CodeJumper kit will be available for CHI participants to experience during the conference’s demo session.

We look forward to seeing you at CHI 2019 in Glasgow and sharing ideas and advancing the accessibility conversation together.


Sharing our learnings on World Autism Awareness Day

By Neil Barnett, Director of Inclusive Hiring and Accessibility at Microsoft

Autism @ Work Playbook, a guide for HR professionals and organizations to create employment opportunities for individuals on the autism spectrum.


Being inclusive is not something we simply do, but rather it stands for who we are. The value proposition for diversity and inclusion within Microsoft is increasingly clear — a diverse and inclusive workforce will yield better products and solutions for our customers, and better experiences for our employees. We all need to do our part to encourage new and different perspectives, solutions, and innovative ideas to surface from all our employees.

Four years ago, on World Autism Awareness Day, we announced Microsoft’s Autism Hiring Program. Given that 80% of individuals on the autism spectrum are unemployed or underemployed, we knew there was an untapped pool of talented people who have the skills aligned to the work we are doing every day at Microsoft. Over the years, we have learned a lot about inclusive hiring and actively share our learnings with other employers interested in evolving their hiring programs.

As an outcome of working with organizations across industries, we are proud to be a founding member of the Autism @ Work Employer Roundtable. About 18 months ago, Disability:IN brought together a group of employers to create the Roundtable, whose main goal is to expand employment opportunities for individuals on the autism spectrum. Today, the Roundtable consists of 16 organizations that all have robust autism hiring programs. As part of the Roundtable, we regularly receive questions about how to get started. In response, we came together and collaborated with the University of Washington Information School to publish the “Autism @ Work Playbook,” a resource for employers who are interested in beginning or expanding their inclusive hiring journey. The playbook covers topics ranging from building your business case, recruiting and sourcing talent, and the interview process to training, on-boarding, and career development. This is a great resource for every organization, so please take a moment to read and share the playbook.

Throughout the year, Roundtable members, along with 250 other employers, academics, service providers, and government agencies, get together at events across the U.S. to share best practices and talk about how we can work together to make an impact on changing the unemployment rate for individuals on the autism spectrum. This year we are happy to announce there will be two upcoming summits sponsored by the Autism @ Work Employer Roundtable to bring this group together. The Autism at Work Summit in May will be hosted by Microsoft at our Redmond campus, followed by another summit hosted by SAP in the fall.

Over the years, we have continued to learn, evolve, and grow our own Autism Hiring Program. We changed our approach from a traditional interview to a cohort model where candidates come to our campus for a week-long experience to demonstrate their skills, gain feedback, and meet with hiring managers. Based on interest in the program from across the company, we have also seen an expansion of the types of roles where candidates are being hired. For example, we have grown from hiring mainly technical roles, like software engineers or data scientists, to more non-technical roles like customer support. In addition, we have recently expanded the program outside of Redmond, WA, with new pilot programs at our campuses in Fargo, ND and Vancouver, BC. These are important regions for Microsoft – we have a large campus in Fargo and it’s one of our TechSpark regions, and an ongoing vision for greater cross-border collaboration throughout the Cascadia Innovation Corridor in Vancouver. We look forward to seeing the program grow in these regions and continue to expand.

Our approach to inclusive hiring is not limited to just one event, program, or initiative. We host various career fairs throughout the year, and on April 23, we will host our next Virtual Career Fair in partnership with 14 other leading companies, including Fidelity, Deloitte, Travelers, Ford and more. If you know someone who is looking for a new opportunity, please make sure to share the event, which provides opportunities to connect with recruiters, learn about open positions and submit resumes. We also fund an annual scholarship for high school students interested in technology and offer broad inclusive hiring efforts and events. If we bring organizations together to focus on inclusive hiring, we can accelerate progress towards breaking down barriers and changing the unemployment rate for individuals on the autism spectrum.

We look forward to continuing to partner with others in the near future to truly make a difference in the employment landscape and in educational and working opportunities for all individuals on the spectrum. If you or someone you know is interested in a career at Microsoft, please send resumes to and learn more about our inclusive hiring efforts at We look forward to hearing from you!

Autism at Work Virtual Career Fair featuring the companies Cintas, Deloitte, EY, Fidelity, Ford, IBM, JP Morgan Chase & Co., Merck, Microsoft, PwC, SAP, Travelers, Ultra Testing, and Willis Towers Watson.
