
Noise-cancelling headphones, smart glasses: how technology is making museums more accessible

Museums are places for people to immerse themselves in culture, as well as learn, create, share and interact.

Being accessible — designed for everyone — is one way museums can maximize that role, and a growing number are working hard to do just that to serve the more than one billion people worldwide who experience some form of disability.

Here is how technology is helping museums get closer to the communities they serve.

Noise-cancelling headphones

We don’t all experience the world in the same way — everyone is different. People with autism, for example, may find certain situations cause a sensory overload.

New York’s Intrepid Sea, Air & Space Museum offers noise-cancelling headphones for people who might have auditory over-stimulation. This museum also helps parents of children with sensory processing disabilities plan their visits by emailing them images and illustrations in advance.

Museums in Chicago (including the Shedd Aquarium, the Field Museum and the Chicago Children’s Museum) also help visitors plan their trips through an app that highlights exhibitions that are sensory friendly.

[Subscribe to Microsoft On The Issues for more on the topics that matter most.]

Audio descriptions

Statue and El Prado Museum

Tactile displays and audio descriptions can help bring museum experiences to life.

The Smithsonian museums in Washington, D.C., are giving visitors who are blind or with low vision a rich and rewarding experience through their smartphones or smart glasses. Using a video-streaming service, users are connected to an “agent” who provides a bespoke, detailed description of their surroundings.

The use of Braille descriptions has become increasingly common in museums around the world, and one Spanish institution has improved upon that. Madrid’s Prado Museum has made parts of its collection tactile, allowing visitors to be hands-on with the exhibitions.

The Louvre in Paris, and the Guggenheim Museum and the Metropolitan Museum of Art in New York, have all established tactile tours, where visitors can touch the art on display or touch casts of well-known works.

Hearing loops

Field Museum of Natural History

Tools such as hearing loops — also known as audio induction loops — use wireless signals to transmit audio directly to someone’s hearing aid and can be used in a variety of settings, including museum exhibitions. The Met in New York is just one example of this.

Another New York museum, the Whitney Museum of American Art, has been trying something different. It has developed a series of vlogs, or video blogs, with messages, explanations and exhibition information in sign language.

As well as opening up the museum’s content to visitors with hearing loss and deafness, the museum, on its website, says it hopes to “create a communications laboratory to expand the ASL vocabulary of contemporary art terms,” referring to American Sign Language.

The Dutch Rijksmuseum believes everyone should be able to access information about the art in their own language. It recently launched a video tour in Dutch Sign Language, integrated into its app. The tour was developed by, and in close collaboration with, deaf entrepreneurs.

Immersive experiences

Rocket Garden at Kennedy Space Center

A few years ago, the Pokémon Go craze took off, introducing many people to the possibilities of augmented reality. By creating immersive experiences, AR and other technologies are being used to reimagine the way visitors relate to museums and historic sites.

You can take an AR tour of Pompeii, where a headset will put you right in the heart of the vibrant Roman city that was destroyed by the catastrophic eruption of Mount Vesuvius in A.D. 79.

Visitors to Bone Hall, in the Smithsonian National Museum of Natural History in Washington, D.C., meanwhile, can use AR to view the exhibits in a new light, seeing the skeletons appear as living creatures.

The Petersen Automotive Museum in Los Angeles is using technology to bring cars from Hollywood alive with a mixed reality exhibition using Microsoft’s HoloLens technology. The “Worlds Reimagined” experience explores classic and futuristic cars from films and video games, including “Back to the Future” and the video game franchise “Halo.”

Other museums are using this technology to bring new experiences to their patrons, including the Intrepid Sea, Air & Space Museum in New York with “Defying Gravity” and the Museum of Flight’s mobile VR experiences in Washington state. The Musée des Plans-Reliefs in Paris used AI to create a digital twin of the historic Mont-Saint-Michel, which had to be captured from every angle.

In 1962, President John F. Kennedy captured the Space Race zeitgeist, when he said: “We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard.” The Kennedy Space Center in Florida uses immersive technologies to recapture that energy, excitement and enthusiasm. At its “Heroes & Legends” exhibition, visitors can experience spacewalks, look inside space capsules and feel close to the action.

By bringing the past to life in a way that adds richness and depth, and, of course, accessibility, technology is helping museums reach a wider audience.

For more on these innovations and on accessibility initiatives at Microsoft, visit and follow @MSFTIssues


Inclusive classrooms and accessibility—join the global #MSFTEduChat TweetMeet on May 21

Announcing the May 21 TweetMeet on ‘Inclusive classrooms and accessibility.’

Change starts with awareness. Every third Thursday of May is Global Accessibility Awareness Day (GAAD). The purpose of GAAD is to get everyone talking, thinking and learning about people with different abilities and their accessibility and inclusion in a digital world.

Our mission is to empower every student on the planet to achieve more, which stems from the belief that every student deserves the opportunity to fulfill their potential.

In this edition of our monthly global and multilingual Twitter conversations, we’ll discuss ways in which educators around the world make inclusion and accessibility an integrated part of their classrooms.

Keep reading for detailed information about this TweetMeet.

Language tracks and SuperSway

We offer seven simultaneous language tracks this month: English, French, Spanish, Italian, Polish, Swedish and Vietnamese. The new SuperSway offers a TweetMeet Invitation in each of these languages.

For each language track, we have one or more hosts to post the translated questions and respond to educators. As always, we’re super grateful to all current and former hosts who are collaborating closely to provide this service.

The #TweetMeetXX hashtags for non-English languages are to be used together with #MSFTEduChat so that everyone can find the conversations in their own language. For example, Spanish-speaking educators should use both #TweetMeetES and #MSFTEduChat. English-speaking educators may use #MSFTEduChat on its own.

TweetMeet Fan? Show it off on your Twitter profile!

Every month more and more people discover the unique flow and characteristics of the TweetMeet events and become excited to participate.

Show your passion for the TweetMeets right from your own Twitter page by uploading this month’s #MSFTEduChat Twitter Header Photo to the top of your own Twitter profile.

In the same file folder, the Twitter Header Photo is available in many other languages and time zones.

Looking back on the April TweetMeet on ‘Teaching Happiness’

The April #MSFTEduChat TweetMeet inspired educators around the world to share ideas, insights and resources. We captured highlights from this Twitter conversation in this @MicrosoftEDU Twitter Moment.

Why join the #MSFTEduChat TweetMeets?

TweetMeets are monthly recurring Twitter conversations about themes relevant to educators, facilitated by Microsoft Education. The purpose of these events is to help professionals in education to learn from each other and inspire their students while they are preparing for their future. The TweetMeets also nurture personal learning networks among educators from across the globe.

We’re grateful to have a support group made up exclusively of former TweetMeet hosts, who volunteer to translate communication and check the quality of our questions and promotional materials. They also help identify the best candidates for future events, provide relevant resources, promote the events among their networks and, in general, cheer everybody on.

When and how can I join?

Join us Tuesday, May 21 from 10 a.m. to 11 a.m. PDT on Twitter using the hashtags #MSFTEduChat, #inclusion, #accessibility and #MicrosoftEDU (which you can always use to stay in touch with us). Be sure to double-check your own local event time. You can find the event time for 215 countries with this time zone announcer.
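If you want to double-check the start time yourself, the conversion is simple arithmetic with UTC offsets. Here is a minimal Python sketch (the offsets are written out by hand for illustration; the CEST offset is just an example):

```python
from datetime import datetime, timedelta, timezone

# Event start: May 21, 2019, 10:00 a.m. PDT (UTC-7).
pdt = timezone(timedelta(hours=-7))
event_start = datetime(2019, 5, 21, 10, 0, tzinfo=pdt)

# Convert to UTC and to an example local zone (CEST, UTC+2).
utc_start = event_start.astimezone(timezone.utc)
cest_start = event_start.astimezone(timezone(timedelta(hours=2)))

print(utc_start.strftime("%H:%M %Z"))  # 17:00 UTC
print(cest_start.strftime("%H:%M"))    # 19:00
```

The time zone announcer linked above does the same conversion for you, with daylight-saving rules handled automatically.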

Our next recommendation is to set up the Twitter dashboard TweetDeck and add a column for the hashtag #MSFTEduChat. If you are new to TweetDeck, check out this brief TweetDeck video tutorial by Marjolein Hoekstra.

When a tweet appears that you want to respond to, press the retweet button and type your comments. Great news is that Twitter now supports adding images, animated GIFs and videos to your comment retweets.

Additional tips are offered in this animated GIF that you’re most welcome to share with newcomers:

Too busy to join at event time? No problem!

From our monthly surveys we know that you may be in class at event time, busy doing other things or even asleep. No problem! All educators are welcome to join any time after the event. Simply look at the questions below and respond at a day and time that suit you best.

You can also schedule your tweets in advance. In that case, be sure to include the entire question in your tweet and mention the hashtag #MSFTEduChat so that everyone knows to which question in which conversation you are responding.

The exact question timings are in this helpful graphic:

Resources to help prepare for the TweetMeet

Microsoft Education offers a wide range of tools, professional-development courses and learning paths about inclusion and accessibility. These resources are tailored for educators and they are all free. Good places to start are:

Microsoft Inclusive Ultimate portal

Microsoft Accessibility overview in Sway format, live-embedded:

Wakelet is a useful web service to bookmark, curate and annotate resources, images, tweets and other content. Mike Tholfsen just created this Wakelet Collection as a handy reference. It currently has 40+ pointers:

Inclusive Classrooms Wakelet Collection, live-embedded:

Discussion Questions

A great way to prepare for the TweetMeet is by taking a close look at the discussion questions. Watch the animated GIF with all the questions:


Meet the 14 hosts for this month’s TweetMeet! They are all passionate about #inclusion and #accessibility and very eager to engage with you.

Check out all the hosts, see what they are tweeting about and consider following them:

List of host names and their profiles

  • Catherine Dourmousi @CatDourmousi (EFL teacher, Hellenic American Union examiner for Michigan University English-language exams, Microsoft Certified Educator, author, supporter of empathy, mindfulness and growth mindset in teaching – Athens, Greece)
  • Elisabetta Nanni @Bettananni (Music teacher and teacher trainer in ICT, MIE Expert, eTwinning Ambassador with expertise in Microsoft Learning Tools – Trento, Italy)
  • Elsbeth Seymour @TeachinEls (MIE Expert, Secondary Special Ed Teacher, using a passion for tech & gaming to connect, support and facilitate learning for neurodivergent students – California, USA)
  • Fabrice Marrou @FabMarrou (French and History teacher in a vocational school, former Microsoft Learning Consultant, ICT trainer – Perpignan, France)
  • Huong Quynh @Quynhth9 (EFL teacher, teacher trainer, passion for exploring ICT in language education – Hanoi, Vietnam)
  • Iwona Cugier @icugier (Teacher and trainer with a focus on ICT and programming, passionate about digitalization in education – Leszno, Poland)
  • Joe Brazier @ManvDadHood (Former Special Educator and EdTech Integration Trainer, Business Strategy Lead at Microsoft focused on the K12 Modern Classroom Experience and Inclusive Education – Kirkland, WA, USA)
  • José Carlos Sancho @72Joseca (History teacher, MIE Expert, teacher trainer, ICT coordinator at FEC (Fundación Educación Católica) and passionate about Microsoft Learning Tools – Zaragoza, Spain)
  • Kelli Suding @ksuding (Indiana statewide PATINS specialist of autism, Specific Learning Disabilities (SLD), Chrome accessibility, Accessible Educational Materials (AEM) & assistive technology integration – Indianapolis, IN, USA)
  • Martin Howe @Martin_Howe (Teacher with passion for helping students with special needs to develop and reach their goals, preferably using digital tools – Borlänge, Sweden)
  • Mike Marotta @mmatp (Inclusive-tech evangelist, 2017 ISTE Inclusive Learning Network Outstanding Educator, Raspberry Pi Certified Educator, co-moderator of the #ATchat weekly Twitter chat – New Jersey, USA)
  • Rachel Berger @rachelmberger (Decoding Dyslexia Minnesota President, educational advocate for students with LD, accessibility & AT evangelist, Microsoft Learning Tools specialist, company founder of I Am Dyslexia – Minneapolis, MN, USA)
  • Shelley Ardis @Shelleypa (Technology Director at Florida School for the Deaf and the Blind; previously, a statewide consultant supporting schools serving Deaf students – St. Augustine, FL, USA)
  • Tiffany Thompson @digischolars (Senior Instructional Technology Specialist, Microsoft Master Trainer, Accessibility, Diversity & Inclusion Consultant, Surface Expert, Online Workshop Facilitator – Brooklyn, MD, USA)


Our hosts are thrilled for the upcoming TweetMeet. Each of them wants to invite you to the event in their own way.

Next month’s event: Microsoft Teams

The theme of next month’s TweetMeet on June 18 will be Microsoft Teams. We’re looking forward to this event and hope you’ll spread the word!

What are #MSFTEduChat TweetMeets?

Every month Microsoft Education organizes social events on Twitter targeted at educators globally. The hashtag we use is #MSFTEduChat. A team of topic specialists and international MIE Expert teachers prepare and host these TweetMeets together. Our team of educator hosts first crafts several questions around a certain topic. Then, before the event, they share these questions on social media. Combined with a range of resources, a blog post and background information about the events, this allows all participants to prepare themselves to the full. Afterwards we make an archive available of the most notable tweets and resources shared during the event.

TweetChat expert Madalyn Sklar recently published this helpful introductory guide:
Your Complete Guide to Twitter Chats: Why You Should Join & How to Make the Most of It

Please connect with TweetMeet organizer Marjolein Hoekstra @OneNoteC on Twitter if you have any questions about TweetMeets or helping out as a host.


5 ways tech is changing how people with disabilities experience the world

For the past eight years, Microsoft has brought together people from different parts of the company at the Ability Summit. This year’s gathering is taking place May 30 to 31 at Microsoft’s headquarters in Redmond, Wash.

The summit, designed to empower all people – including the more than one billion people with disabilities – has been a place for accessibility innovation. In 2015, it helped give rise to the Xbox Adaptive Controller.

Here’s a closer look at the controller and other recent technologies with inclusive design. 

Seeing AI 

Artificial intelligence is bringing descriptive detail to people in the form of an app: Microsoft’s Seeing AI.

Seeing AI is designed for people who are blind or with low vision. It augments the world around the user with audio descriptions, reads short bursts of text and scans product barcodes. Documents can be photographed and their content read back. Seeing AI also scans and reads handwritten notes.

[Like this article? Subscribe to Microsoft On The Issues for more on the topics that matter most.]


[embedded content]

Soundscape

Microsoft Soundscape goes beyond immediate proximity to build a 3D sound map of the user’s world.

It uses data and sound to add layers of information and context. In short, it helps users feel more comfortable when making their way around. Landmarks, road intersections and the places regularly visited can all be allocated a sound beacon so they can be clearly detected upon approach. 

Soundscape’s synthesized binaural audio adds realism to directions, taking the map on a user’s phone and, effectively, creating an audio version.    
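As a rough illustration of the geometry behind an audio beacon (a sketch only, not Soundscape’s actual implementation), the app needs the compass bearing from the user’s location to a landmark so the sound can be rendered from the right direction:

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle compass bearing, in degrees, from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

# A landmark due east of the user sits at roughly 90 degrees, so its beacon
# would be rendered to the user's right when they are facing north.
beacon_bearing = bearing_to(47.6062, -122.3321, 47.6062, -122.3221)
```

The binaural renderer then only has to place the beacon’s sound at that bearing relative to the direction the user is facing.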

Translation and captioning 

Douglas Adams’s “The Hitchhiker’s Guide to the Galaxy” described the Babel Fish. These clever creatures, once inserted into someone’s ear, would translate any language.

Being able to participate in multilingual conversations is no longer the preserve of fiction. 

Microsoft Translator acts as a real-time translation hub, sitting between people speaking different languages and translating on the fly. It can do this even when multiple languages are being spoken at the same time. It can also be set from English to English, providing real-time captioning for people who are deaf or hard of hearing.
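Conceptually, such a hub routes each utterance to every listener in that listener’s own language. The Python sketch below shows the pattern only; the class and the `translate` stub are hypothetical and are not Microsoft Translator’s API:

```python
# Hypothetical hub: every participant registers a language, and each message
# is delivered to the other participants in their own languages.

def translate(text, source, target):
    if source == target:
        return text  # also models the English-to-English captioning case
    return f"[{target}] {text}"  # stub: a real hub would call a service here

class TranslationHub:
    def __init__(self):
        self.participants = {}  # name -> language code

    def join(self, name, language):
        self.participants[name] = language

    def broadcast(self, sender, text):
        source = self.participants[sender]
        return {name: translate(text, source, lang)
                for name, lang in self.participants.items() if name != sender}

hub = TranslationHub()
hub.join("Ana", "es")
hub.join("Ben", "en")
hub.join("Chiyo", "ja")
# Ana receives a Spanish rendering and Chiyo a Japanese one; the sender is skipped.
delivered = hub.broadcast("Ben", "Hello everyone")
```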

Gaming gets serious 

The global gaming market is a multibillion-dollar industry. To help make gaming more accessible, Microsoft introduced the Xbox Adaptive Controller, a product of the company’s 2015 Hackathon. This controller has large, programmable buttons and can be connected to a range of external devices. The combination of large, customizable switches, buttons, mounts and joysticks empowers all gamers.

Code you can hold in your hands 

[embedded content]

Microsoft Code Jumper is the physical manifestation of a programming language that’s helping children who are blind or with low vision learn to code. Code Jumper is made of a series of programmable, tactile plastic switches, or pods. Each pod is an instruction, and can be joined together to create a line of code. 

It means all children studying coding as part of their school curriculum can benefit. Children can learn about sequence, iteration, selection and variables, and they learn how to solve a problem by thinking algorithmically: breaking a process down into its constituent parts and looking for different routes to the best solution.
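The pod model is itself a neat illustration of sequence and iteration. Here is a toy interpreter in Python (purely illustrative; this is not Code Jumper’s software):

```python
# Toy interpreter for a chain of "pods": each pod is one instruction, and a
# "repeat" pod wraps a sub-chain to demonstrate iteration.

def run(pods):
    output = []
    for pod in pods:
        if pod[0] == "play":        # a sound pod: emit its note
            output.append(pod[1])
        elif pod[0] == "repeat":    # a loop pod: run its sub-chain N times
            for _ in range(pod[1]):
                output.extend(run(pod[2]))
    return output

# A sequence of two notes followed by a loop that plays a drum twice.
song = [("play", "C"), ("play", "E"), ("repeat", 2, [("play", "drum")])]
notes = run(song)  # ["C", "E", "drum", "drum"]
```

Physically joining pods corresponds to building the `song` list: the order of the chain is the order of execution.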

For more on these innovations and accessibility initiatives at Microsoft, visit and follow @MSFTIssues on Twitter.  


VA and Microsoft partner to enhance care, rehabilitation and recreation for Veterans with limited mobility

Xbox Adaptive Controllers will be distributed across facilities within nation’s largest integrated health-care system

WASHINGTON — April 30, 2019 — Today, the U.S. Department of Veterans Affairs (VA) and Microsoft Corp. announced a new collaboration to enhance opportunities for education, recreation and therapy for Veterans with mobility limitations by introducing the Xbox Adaptive Controller, a video game controller designed for people with limited mobility, in select VA rehabilitation centers around the country.

The partnership, which was formalized April 18, will provide controllers and services to Veterans as part of therapeutic and rehabilitative activities aimed at challenging muscle activation and hand-eye coordination, and at encouraging greater participation in social and recreational activities.

“This partnership is another step toward achieving VA’s strategic goals of providing excellent customer experiences and business transformation,” said VA Secretary Robert Wilkie. “VA remains committed to offering solutions for Veterans’ daily life challenges.”

Together, VA and Microsoft identified an opportunity to introduce or reintroduce gaming to Veterans with spinal cord injuries, amputations, and neurological or other injuries at 22 VA medical centers across the United States. Microsoft is donating its Xbox Adaptive Controller, game consoles, games and other adaptive gaming equipment as part of the collaboration.

Designated VA staff will engage with Veterans using the equipment and share feedback with Microsoft on therapeutic utility and the Veteran experience.

“We owe so much to the service and sacrifice of our Veterans, and as a company, we are committed to supporting them,” said Satya Nadella, CEO of Microsoft. “Our Xbox Adaptive Controller was designed to make gaming more accessible to millions of people worldwide, and we’re partnering with the U.S. Department of Veterans Affairs to bring the device to Veterans with limited mobility, connecting them to the games they love and the people they want to play with.”

Microsoft and VA have a long-standing strategic partnership, working together for more than 20 years to provide the best possible care and service to Veterans. Gaming is a popular pastime of military personnel, and access to the Xbox Adaptive Controller in VA rehabilitation centers provides the opportunity for Veterans to experience gaming’s various benefits, including staying connected with friends and family across the world, building esprit de corps through competitive or cooperative gameplay, and providing stress relief.

Microsoft’s initial contributions will be allocated across 22 VA facilities. In addition, the controllers and other equipment will be available for Veterans to use at events hosted by VA’s Office of National Veterans Sports Programs and Special Events, such as the National Veterans Wheelchair Games.

The following 16 centers have confirmed participation to date, with at least six additional centers to come: Augusta VA Medical Center (VAMC), Central Alabama VA Health Care System (HCS), Central Texas Veterans HCS, Chillicothe VAMC, Dayton VAMC, Memphis VAMC, Minneapolis VA HCS, Richmond VAMC, VA St. Louis HCS, South Texas Veterans HCS (Audie L. Murphy VA Hospital), South Texas Veterans HCS (Kerrville Division), James A. Haley Veterans Hospital – Tampa, VA Eastern Colorado HCS, VA New York Harbor HCS, VA Palo Alto HCS, and VA Puget Sound HCS.

More information on partnering with VA can be found at

Media assets can be found here.

About the U.S. Department of Veterans Affairs Veterans Health Administration

The Veterans Health Administration (VHA) is the largest integrated health care system in the United States, providing care at 1,250 healthcare facilities, including 172 VA Medical Centers and 1,069 outpatient sites of care of varying complexity (VHA outpatient clinics) to over 9 million Veterans enrolled in the VA health care program.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:

Assembly for Xbox, (206) 223-1606,

U.S. Department of Veterans Affairs, Office of Public Affairs Media Relations, (202) 461-7600,

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at Web links, telephone numbers and titles were correct at time of publication, but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at


New Garage project bakes accessibility into game development

At Game Developers Conference 2019, we shared an early peek at Responsive Spatial Audio for Immersive Gaming, a Microsoft Garage project. The Unity plug-in helps developers infuse accessibility into games by making it easy to annotate game objects with descriptive text and present it to players through interactive audio cues. The project is now available worldwide in the Unity Store.

Baking accessibility into game development

A number of hackers have joined the cause to make games more accessible. For example, Ear Hockey, a Microsoft Garage project, is a game designed around the blind and low vision community, and the Xbox Adaptive Controller, a Hackathon project turned Garage Wall of Famer, is a game controller designed for gamers of all abilities. The Garage project team members who developed Responsive Spatial Audio are taking a different approach, focusing on the game developer by baking accessibility right into an easy, drag-and-drop interaction toolkit.
With Responsive Spatial Audio, game developers can tag 3D objects with descriptive text, and the experience captures these tags and spatial coordinates to help players navigate. As players traverse through the game world and encounter tagged objects and designated points of interest, they are guided by audio cues via a built-in, text-to-speech API. An accessible FPS controller presents relevant descriptions at the right time by monitoring player movement, scanning their surroundings for metadata, and cuing spatial audio guidance for objects in the frame of view.
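The plugin itself is a Unity toolkit, but the core check it describes — is a tagged object inside the player’s viewing frustum, and if so, which description should be cued — can be sketched in a few lines of Python (all names here are illustrative, not the plugin’s API):

```python
import math

def in_view(player_pos, facing_deg, obj_pos, arc_deg=60, length=10.0):
    """True if a tagged object falls inside the player's viewing frustum,
    approximated here as a flat 2D cone with an adjustable arc and length."""
    dx, dy = obj_pos[0] - player_pos[0], obj_pos[1] - player_pos[1]
    if math.hypot(dx, dy) > length:
        return False
    bearing = math.degrees(math.atan2(dy, dx))
    delta = (bearing - facing_deg + 180) % 360 - 180  # signed angle to object
    return abs(delta) <= arc_deg / 2

# Tagged objects: position -> descriptive text (hypothetical annotations).
tags = {(3, 0): "wooden door, slightly ajar", (0, 8): "stone fountain"}

# A player at the origin facing along the x-axis is cued only about the door;
# the fountain is outside the 60-degree arc.
cues = [text for pos, text in tags.items() if in_view((0, 0), 0, pos)]
```

In the real plugin the descriptions that pass this kind of test are handed to the text-to-speech API and rendered as spatial audio.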

Key features to provide a more accessible experience

Responsive Spatial Audio offers a number of features that make prioritizing accessibility easy.

Accessible FPS Controller: Convey object descriptions within the player’s frame of view via audio cues, and adjust the viewing frustum length and arc.

Annotate Game Objects: Tag and manage objects with descriptive text. Tag once, and descriptions appear everywhere the object does.

Vantage Point Objects: Add and manage vantage points, invisible doorframe-like points of interest that convey a whole view (as opposed to objects within the viewing frustum). Present different descriptions based on the direction the player is facing.

Accessible Navigation: Aid player navigation with a suite of interaction tools, including:

  • Guide players to a selected object via a navigation agent with an orientation and spatial beacon
  • Add a script to guide players to nearby vantage points with auditory beeps
  • Enable bump noises with custom sounds that intelligently play spatial audio upon collision, based on the orientation of the player
  • Change background audio based on the location of the player
  • Indicate the global north and south of the game with spatial sound

Inventory UI: Leverage an optional in-box inventory UI to easily manage a library of game objects.
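One simple way to render directional cues like these beacons (a sketch only, not the plugin’s actual audio pipeline) is to map a beacon’s bearing relative to the player onto a stereo pan:

```python
import math

def pan_for(relative_bearing_deg):
    """Map a beacon's bearing relative to the player (0 = dead ahead,
    positive = to the right) onto a stereo pan in [-1, 1]."""
    return math.sin(math.radians(relative_bearing_deg))

ahead = pan_for(0)     # centered between both ears
right = pan_for(90)    # fully in the right channel
left = pan_for(-45)    # partly in the left channel
```

A full spatial-audio engine adds distance attenuation and head-related filtering on top, but the bearing-to-pan mapping is the intuition behind hearing where a beacon is.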
To see how you can incorporate Responsive Spatial Audio into your games, see the project in action in a demo accompanying the plugin in the Unity Store.

One step closer to seamless, accessible development

We sat down with Brannon Zahand and Evelyn Thomas, both Senior Program Managers in Accessibility R&D who champion accessibility in the gaming space, to hear their reflections on the project. “The idea that I can drag and drop this into a game, with very little work to implement it, is a game changer for the industry,” shared Brannon. Evelyn attended GDC 2019 to talk to developers about best practices in accessibility, highlighting the project at a conference talk and at Microsoft’s accessibility booth.


Responsive Spatial Audio was developed by Manohar Swaminathan, a Senior Researcher at Microsoft Research, based in Bangalore, India. Manohar has been working in graphics for years, but found a passion for accessibility while working on CodeTalk, a solution that empowers developers in the blind and low vision community to do more with Visual Studio. He was searching for ways to do more impactful work in India when he met and teamed up with former Research Fellow Venkatesh Potluri, a blind developer who was interested in enhancing his productivity. After releasing CodeTalk, Manohar was inspired to combine his background with games and VR to make the gaming space more accessible through audio. “We thought, ‘Can we use rich, spatial audio content to replace the visual information that is missing?’ and decided to give it a shot,” he shared. It’s Manohar’s hope that plug-and-play tools will inspire developers to create fun and inclusive game experiences accessible to all.

Try It Out

Responsive Spatial Audio and a demo are now available worldwide in the Unity Store. The team looks forward to hearing feedback via UserVoice.

Advancing accessibility on the web, in virtual reality and in the classroom

The 14 SeeingVR tools, overlaid individually upon a scene from the open source Unity game EscapeVR-HarryPotter; end-users can combine the individual tools as needed, based on their visual abilities.

At the ACM CHI Conference on Human Factors in Computing Systems in Glasgow, Scotland, this May, researchers from Microsoft’s Redmond and UK labs, together with our university collaborators, will present several papers and demos that explore how to design technologies more inclusively, supporting accessibility for users with cognitive and/or sensory disabilities.

Microsoft researchers Adam Fourney, Kevin Larson and I teamed up with University of Washington researchers Qisheng Li and Katharina Reinecke to explore the accessibility of the Web to people with dyslexia. Dyslexia is a cognitive disability estimated to affect about 15% of English-speaking adults; people with dyslexia can experience varying degrees of difficulty with reading-related tasks. Because access to information on the Web is a key modern literacy skill, ensuring that online information is cognitively accessible is an important concern; beyond people with dyslexia, improving cognitive access to the Web may benefit other groups who experience reading challenges, such as English language learners or children.

At CHI 2019, lead author and University of Washington graduate student Qisheng Li will present the Microsoft Research-UW team’s findings, summarized in their paper, “The Impact of Web Browser Reader Views on Reading Speed and User Experience.” The team explored whether the “reading mode” common in most modern browsers significantly impacted users’ reading speed and comprehension, and whether users with dyslexia specifically benefitted from this intervention. Using the “Lab in the Wild” infrastructure developed by Professor Reinecke, the team conducted an online study with 391 English-speaking adults (42 with dyslexia), in which participants read several popular webpages and answered associated reading-comprehension questions, some in the typical browser view and some in the reading mode.

A webpage in typical browser view (left), and in the reading mode (right).


As expected, people with dyslexia had substantially slower reading speeds than people without dyslexia; however, they did not receive any differential benefit from the reading mode. Instead, the team found that reader view enhanced the reading speed of all users by about 5%, compared with the default website view. The study also found that the reader-mode button is unavailable on many pages, and that the rules governing when browsers offer reader mode are opaque to web developers: only 41% of 1,100 popular webpages sampled successfully enabled reader view. These findings suggest that web page designers should build their pages in a way that enables the reader-mode button in major browsers, so that users have the option to reap this reading-speed benefit. Making it easier for web developers to intentionally enable the reading-mode option, and exploring which aspects of the reader-view transformations provide the most benefit, are key areas for future work.
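Browser readability heuristics are implementation-specific and, as the study notes, opaque. Purely as an illustration of the general idea, a toy scorer might weigh substantial paragraphs against link density (this is not any browser’s actual rule):

```python
def readerable_score(paragraph_lengths, link_text_chars, total_text_chars):
    """Toy heuristic: substantial paragraphs raise the score, heavy link
    density lowers it. Real browser heuristics differ in the details."""
    body = sum(n for n in paragraph_lengths if n >= 140)
    link_density = link_text_chars / max(total_text_chars, 1)
    return body * (1.0 - link_density)

# An article-like page with long paragraphs scores well...
article = readerable_score([300, 250, 40], link_text_chars=50, total_text_chars=640)
# ...while a link-heavy navigation page with only short paragraphs scores zero.
nav_page = readerable_score([30, 25, 20], link_text_chars=60, total_text_chars=75)
```

Whatever the exact rule, the practical takeaway for page authors is the same: pages dominated by article-like body text are far more likely to trigger the reader-view option.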

Accessibility beyond the traditional desktop computing experience is also a focus of Microsoft Research’s contributions to CHI this year. Intern Yuhang Zhao, a graduate student at Cornell Tech, will present a paper summarizing joint research with Microsoft researchers Ed Cutrell, Christian Holz, Eyal Ofek, Andrew Wilson, and me that explores how to enhance the accessibility of emerging virtual reality (VR) technologies: “SeeingVR: A Set of Tools to Make Virtual Reality More Accessible to People with Low Vision.” The team will also present a live demo during the conference’s demonstration session and at the Microsoft booth; blog readers can experience a demo by viewing the project’s online video figure.

Low vision (that is, visual disabilities that cannot be fully corrected by glasses) impacts 217 million people worldwide according to the World Health Organization. While desktop software offers some accommodation features for people with low vision (for example, screen magnifiers), VR systems have not yet grappled with the issue of accessibility for this audience. Indeed, when interviewing VR developers, the team found that none had received training or guidance on how to develop accessible VR experiences.

Because low vision encompasses a range of visual abilities (for example, tunnel vision, blind spots, brightness sensitivity, low visual acuity, and so on), the team took a toolkit approach—they developed SeeingVR, a set of 14 tools for Unity developers (Unity is one of the most widely used VR development platforms). End-users can activate different combinations of these tools depending on their abilities and the context of the current application and task. Example tools include a magnifier and bifocal views, brightness and contrast adjustment for the scene, edge enhancement to make virtual objects more salient from their backgrounds, depth measurement tools, and the ability to point at text or objects in a virtual scene to have them read or described aloud. The majority of these tools can be applied to existing Unity applications post hoc, to support easy adoption.
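To give a flavor of what a tool like edge enhancement does under the hood, here is a minimal Python sketch of Laplacian sharpening on a tiny grayscale raster. This illustrates only the underlying idea; SeeingVR itself applies such effects to rendered Unity scenes in real time, not to Python lists:

```python
def edge_enhance(img):
    """Sharpen a grayscale image (list of rows, values 0-255) with a
    Laplacian kernel. Border pixels are left unchanged for simplicity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    # 3x3 sharpening kernel: weighted center minus its four neighbors
    kernel = [[ 0, -1,  0],
              [-1,  5, -1],
              [ 0, -1,  0]]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            acc = 0
            for ky in range(3):
                for kx in range(3):
                    acc += kernel[ky][kx] * img[y + ky - 1][x + kx - 1]
            out[y][x] = max(0, min(255, acc))  # clamp to valid range
    return out

# A flat dark region next to a brighter one: the boundary gets exaggerated,
# which is what makes object silhouettes pop out for low vision users.
flat = [[50, 50, 50, 200, 200],
        [50, 50, 50, 200, 200],
        [50, 50, 50, 200, 200]]
print(edge_enhance(flat)[1])  # → [50, 50, 0, 255, 200]
```

Note how the dark side of the boundary is pushed darker (0) and the bright side brighter (255), increasing local contrast exactly where two surfaces meet.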

The 14 SeeingVR tools, overlaid individually upon a scene from the open source Unity game EscapeVR-HarryPotter; end-users can combine the individual tools as needed, based on their visual abilities.

Evaluation with 11 people with low vision completing a variety of tasks in VR (for example, menu selection, grasping objects, shooting moving targets) found that all participants could complete tasks more quickly and accurately when using SeeingVR tools as compared to the default VR experience. All participants chose different combinations of the available tools, reinforcing the value of allowing flexibility and customization of low vision accessibility options.

Microsoft researchers are also exploring non-visual representations of VR for people who are completely blind. Microsoft Soundscape is a smartphone application that uses spatial audio to deliver a rich, non-visual navigation experience. At the CHI 2019 workshop on “Hacking Blind Navigation” (co-organized by Principal Researcher Ed Cutrell), Microsoft Research intern and University of Washington student Anne Spencer Ross will present research on how to craft an audio-only VR experience that allows people to rehearse a walking route virtually before experiencing the route in the physical world via Soundscape. Her paper, “Use Cases and Impact of Audio-Based Virtual Exploration,” is a collaboration between engineers from the Soundscape team (Melanie Kneisel and Alex Fiannaca) and researchers in the Microsoft Research Redmond Lab (Ed Cutrell and me). Melanie Kneisel will also be a featured speaker at the workshop.

In addition to presenting research on accessible Web browsing and accessible VR, researchers from Microsoft’s Cambridge, UK lab will be sharing a tangible toolkit to enhance the accessibility of computer science education for children who are blind. Led by researcher Cecily Morrison, the Code Jumper project is a physical programming language for teaching children ages 7–11 basic programming concepts and computational thinking, regardless of level of vision. It was inspired by the need to give young blind and low vision students inclusive access to the computing curriculum alongside their sighted peers. Children plug together pods that represent lines of code, creating programs that, when run, make music, stories, or poetry. Children can start with very simple concepts, such as a program being a sequence of commands, and progress to complicated program flows that use variables, covering the whole of the curriculum for this age band. The toolkit was successfully tested with 75 children and 30 teachers across the United Kingdom and found to support age-appropriate learning of coding as well as whole-child learning, such as forming friendships with sighted peers. The tangible Code Jumper kit will be available for CHI participants to experience during the conference’s demo session.

We look forward to seeing you at CHI 2019 in Glasgow and sharing ideas and advancing the accessibility conversation together.

Sharing our learnings on World Autism Awareness Day

By Neil Barnett, Director of Inclusive Hiring and Accessibility at Microsoft

Autism @ Work Playbook, a guide for HR professionals and organizations to create employment opportunities for individuals on the autism spectrum.

Being inclusive is not something we simply do; it is who we are. The value proposition for diversity and inclusion within Microsoft is increasingly clear: a diverse and inclusive workforce yields better products and solutions for our customers, and better experiences for our employees. We all need to do our part to encourage new and different perspectives, solutions, and innovative ideas from all our employees.

Four years ago, on World Autism Awareness Day, we announced Microsoft’s Autism Hiring Program. Given that 80% of individuals on the autism spectrum are unemployed or underemployed, we knew there was an untapped pool of talented people who have the skills aligned to the work we are doing every day at Microsoft. Over the years, we have learned a lot about inclusive hiring and actively share our learnings with other employers interested in evolving their hiring programs.

As an outcome of working with organizations across industries, we are proud to be a founding member of the Autism @ Work Employer Roundtable. About 18 months ago, Disability:IN brought together a group of employers to create the Roundtable, whose main goal is to expand employment opportunities for individuals on the autism spectrum. Today, the Roundtable consists of 16 organizations that all have robust autism hiring programs. As part of the Roundtable, we regularly receive questions about how to get started. In response, we came together and collaborated with the University of Washington Information School to publish the “Autism @ Work Playbook,” a resource for employers who are interested in beginning or expanding their inclusive hiring journey. The playbook covers topics ranging from building your business case, recruiting and sourcing talent, and the interview process to training, onboarding, and career development. This is a great resource for every organization, so please take a moment to read and share the playbook.

Throughout the year, Roundtable members, along with 250 other employers, academics, service providers, and government agencies, get together at events across the U.S. to share best practices and discuss how we can work together to change the unemployment rate for individuals on the autism spectrum. This year, we are happy to announce two upcoming summits sponsored by the Autism @ Work Employer Roundtable to bring this group together: the Autism at Work Summit in May, hosted by Microsoft at our Redmond campus, followed by another summit hosted by SAP in the fall.

Over the years, we have continued to learn, evolve, and grow our own Autism Hiring Program. We changed our approach from a traditional interview to a cohort model, in which candidates come to our campus for a week-long experience to demonstrate their skills, gain feedback, and meet with hiring managers. Based on interest in the program from across the company, we have also seen an expansion of the types of roles for which candidates are being hired. For example, we have grown from hiring mainly for technical roles, like software engineer or data scientist, to non-technical roles like customer support. In addition, we have recently expanded the program outside of Redmond, WA, with new pilot programs at our campuses in Fargo, ND and Vancouver, BC. These are important regions for Microsoft: Fargo is home to a large campus and is one of our TechSpark regions, and Vancouver is part of an ongoing vision for greater cross-border collaboration throughout the Cascadia Innovation Corridor. We look forward to seeing the program grow in these regions and continue to expand.

Our approach to inclusive hiring is not limited to just one event, program, or initiative. We host various career fairs throughout the year, and on April 23, we will host our next Virtual Career Fair in partnership with 14 other leading companies, including Fidelity, Deloitte, Travelers, Ford and more. If you know someone who is looking for a new opportunity, please share the event, which provides opportunities to connect with recruiters, learn about open positions, and submit resumes. We also fund an annual scholarship for high school students interested in technology and offer broad inclusive hiring efforts and events. If we bring organizations together to focus on inclusive hiring, we can accelerate progress toward breaking down barriers and changing the unemployment rate for individuals on the autism spectrum.

We look forward to continuing to partner with others to truly make a difference in the employment landscape and in educational and work opportunities for all individuals on the spectrum. If you or someone you know is interested in a career at Microsoft, please send us a resume and learn more about our inclusive hiring efforts. We look forward to hearing from you!

Autism at Work Virtual Career Fair featuring companies Cintas, Deloitte, EY, Fidelity, Ford, IBM, JP Morgan Chase & Co., Merck, Microsoft, PwC, SAP, Travelers, Ultra Testing, and Willis Towers Watson.

Jenny Lay-Flurrie: 4 key pillars of Microsoft’s journey with accessibility

By Jenny Lay-Flurrie, Chief Accessibility Officer

Kings College student Theo demonstrates a program he created with the technology behind Code Jumper as his mother looks on. Photo by Jonathan Banks.

Hi folks, writing to you from Seattle and not from the CSUN Assistive Technology Conference in sunny Anaheim as planned. Unfortunately, I won’t be able to be there in person this year, as a last-minute health condition is preventing me from flying down. Gutted! I’ve been looking forward to it for months. But never fear: there are 94 Microsofties from across the company in attendance, eager to share what we’ve been up to and, more importantly, to listen, learn, and foster relationships.

Since I can’t be there, I thought I would take the opportunity to share what I was going to cover during my presentation, which will go ahead tomorrow with some of our superstars at the helm!

Each year, this event brings together folks from across the accessibility community to share knowledge and best practices in the field of assistive technology. Accessibility isn’t a space where we can move the dial alone, so effective partnerships are a critical element of creating products that work better for everyone. There are some great examples of this that I want to share. As a company, we’re building a partnership mindset into our DNA and into each of the four key pillars that ground us on our journey with accessibility. Enough talk; here are some examples:

Pillar One: People

Building accessibility into your company starts with building a culture that embraces accessibility and disability. Hiring diverse, talented people with disabilities and accessibility experts brings lived experience that empowers and accelerates efforts to build products that work for everyone. One of the programs we launched a few years ago is our Autism Hiring Program. As this program matures, we are equally focused on sharing our learnings and best practices in finding talent with other companies that have similar ambitions. Today, 16 companies are learning and sharing together via the Autism @ Work Employer Roundtable. We wanted to make it easier for folks to know how to get started, so in partnership with Disability:IN, SAP, EY, JP Morgan Chase, and the University of Washington, we’ve developed a new playbook that walks through all the top questions you may have. Do check it out.

Pillar Two: Systems

To really drive the business of accessibility, you need a systematic framework: an ecosystem for how you empower folks to deliver. That ecosystem covers all aspects of the company, and supporting our customers is a key part. We have many other partnerships, such as Be My Eyes, which last year began providing customers a direct connection to technical assistance from the Disability Answer Desk (DAD), a free consumer service for Microsoft’s customers with disabilities. Since launching the Be My Eyes service in 2018, we’ve taken over 4,000 calls, and customer satisfaction is at 90%. This partnership was a great step forward in providing people with disabilities effective support in the way they prefer, and it is just the start of what’s possible down the line. We’re also excited to announce that tomorrow DAD will be launching Twitter support in five markets. Customers will now be able to send us a direct message via Twitter for technical assistance through the DAD support page.

Pillar Three: Product

When you combine an inclusive culture with an ecosystem that empowers people, and you do it in partnership, you’re set up to create game-changing products that work for everyone.

Windows 10: We are looking forward to meeting users to get feedback about the latest accessibility improvements and to share what’s coming later this year. And yes, there are announcements for Narrator, including support for Chrome. Narrator is also getting a new Home page that makes it easier to find the User Guide and Quick Start tutorial, new settings to personalize your experience, and links to provide feedback so that we can continue to focus on the features that matter most to users. Narrator is also getting improved reading functionality: it is more efficient (less verbose), more natural, and more responsive with apps like Outlook. It will also see additional support for the latest translation tables and braille displays.

We’re also building on the success of larger text options by adding larger pointers. Users can make their pointer easier to see by making it larger and adding a custom color (I recommend red). We’ve also polished “center mouse mode” in Magnifier so that your pointer stays at the center of your display. The team is very proud of its “buttery smooth” performance!

Office 365: Our enthusiastic partners are embedded in every part of our design process, from concept testing to post-launch feedback from users via the Disability Answer Desk. A number of experts have helped to build an accessible-by-design product that we hope makes it easier for digital inclusion to be part of your organization’s digital transformation.

The ATHEN group played a particularly crucial role in the development of the new simplified ribbon found at the top of every Word document and OneNote notebook. The simplified ribbon was designed to allow screen reader users to find the commands they are looking for more easily. It turns out it is an easier experience for everyone!

Another important advancement is the Accessibility Checker. We have updated its rules to reduce false positives, making accessibility even easier to achieve. In addition, you can now “check as you go” by letting the Accessibility Checker keep an eye on your document while you’re building it. This provides an at-a-glance reminder in the status bar when issues exist in the document, and a one-click action to investigate recommendations.
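A checker of this kind is essentially a set of predicates walked over the document. Office’s actual rules aren’t public, so as a hypothetical illustration, here is a minimal Python sketch of one classic rule (images without alternative text), applied to HTML for simplicity:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Flag <img> tags that lack a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_startendtag(self, tag, attrs):
        # Treat self-closing <img /> the same as <img>.
        self.handle_starttag(tag, attrs)

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            alt = attr_map.get("alt")
            if not alt or not alt.strip():
                src = attr_map.get("src", "?")
                self.issues.append(f"img missing alt text: {src}")

def check(html: str):
    """Return a list of accessibility issues found in the markup."""
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.issues

print(check('<img src="chart.png"><img src="logo.png" alt="Contoso logo">'))
# → ['img missing alt text: chart.png']
```

A real checker layers many such predicates (contrast, heading order, table headers, and so on) and, as described above, re-runs them incrementally as the document changes.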

AT Partnerships: Our assistive technology partnerships are critical to ensuring we can truly empower people with disabilities. We are passionate about our first-party tools, but also about empowering the ecosystem at large. We’re successful not only when we build ourselves, but when we empower others to build as well. A few examples include Eye Tech Digital Systems, which integrates with the built-in eye tracker on Surface; Dolphin Computer Access’ GuideConnect, a talking digital assistant; and InsideVision’s InsideOne, a tactile tablet with an integrated braille keyboard. These are just a few examples, and we’d like to say a huge thank you to our partners for their ongoing collaboration to push the boundaries of assistive tech.

Accessibility Insights: We also have great news for developers with the open sourcing of Accessibility Insights, a tool that helps developers find and fix accessibility issues in their code. Accessibility Insights lets developers run a FastPass to identify common accessibility issues early in the dev cycle and provides tips on how to fix them. In addition, we partnered with Deque Systems to add Windows platform support to the axe accessibility rules engine. Now developers can test their code in development using a common, unified approach. I really encourage you to learn more about Accessibility Insights by checking out Keith Ballinger’s latest blog.

Pillar Four: Innovation/Future

Lastly, what we crave: innovation. It motivates, inspires, and drives us. We build solutions with and for people with disabilities, harnessing knowledge from across the industry and working in partnership with the community to ensure that what we build has impact and purpose. Two projects epitomize this:

Seeing AI: Designed for the blind and low vision community, this research project and free mobile app harnesses the power of AI to describe people, text and objects. The team announced new features and functionality inspired by feedback and recommendations provided by the Seeing AI user community. These updates are all live today – check them out!

  • Explore photos by touch: Tap an image on a touchscreen to hear a description of the objects within it and the spatial relationships between them.
  • Native iPad support: For the first time, we’re releasing iPad support, providing a better Seeing AI experience that accounts for the larger display.
  • Channel improvements: Customize the order of your channels, access the face recognition function while on the Person channel, and get audio cues while analyzing photos in other apps!
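Describing spatial relationships means turning detector output (labeled bounding boxes) into spatial language. Seeing AI’s actual phrasing logic isn’t published; the following Python sketch is a hypothetical illustration of the basic geometry, comparing box centers to pick the dominant direction:

```python
def describe_relation(name_a, box_a, name_b, box_b):
    """Say where object A sits relative to object B.

    Boxes are (left, top, width, height) in image coordinates,
    with y growing downward as in most image formats.
    """
    ax, ay = box_a[0] + box_a[2] / 2, box_a[1] + box_a[3] / 2  # center of A
    bx, by = box_b[0] + box_b[2] / 2, box_b[1] + box_b[3] / 2  # center of B
    dx, dy = ax - bx, ay - by
    if abs(dx) >= abs(dy):                      # horizontal offset dominates
        side = "to the right of" if dx > 0 else "to the left of"
    else:                                       # vertical offset dominates
        side = "below" if dy > 0 else "above"
    return f"the {name_a} is {side} the {name_b}"

# A detector might return boxes like these for a photo:
print(describe_relation("dog", (10, 60, 40, 30), "person", (70, 20, 30, 70)))
# → the dog is to the left of the person
```

A production system would add nuance (overlap, containment, distance, perspective), but the core of any such feature is this mapping from geometry to phrases.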

Code Jumper: Lastly, Code Jumper is a physical programming language for kids between 7 and 11 with all ranges of vision. It started as a Microsoft research project in the UK, but as it evolved, the team worked to create a path to manufacturing at scale. The research and technology behind Code Jumper is now in the capable hands of the American Printing House for the Blind (APH), a nonprofit based in Louisville, Kentucky, that creates and distributes products and services for people who are blind or have low vision. Over the next five years, APH plans to offer Code Jumper and related curriculum to students throughout the world. Learn more about the project here.

Check out this post for all the Microsoft sessions this week. Our accessibility engineers, program managers, and world-class researchers are excited to talk to you. Again, I wish I could be there with you!

For more information and to stay up to speed about Microsoft accessibility, visit 

Empathy and innovation: How Microsoft’s cultural shift is leading to new product development

The young Microsoft software engineer had just moved to the U.S. and was trying her best to stay in close touch with her parents back home, calling them on Skype every week.

But their internet connection in India was poor, and Swetha Machanavajhala, deaf since birth, struggled to read their lips over the glitchy video. She always had to ask her parents to turn off the lights in the background to help her focus better on their faces.

“I kept thinking, ‘Why can’t we build technology that can do this for us instead?’” Machanavajhala recalled. “So I did.”

It turned out her background-blurring feature was good for privacy reasons as well, helping to hide messy offices during video conference calls or curious café customers during job interviews. So Machanavajhala’s innovation was integrated into Microsoft Teams and Skype, and she soon found herself catapulted into the spotlight at Microsoft – as well as into the company’s work on inclusion, a joy to experience after having been excluded at a previous job where her deafness made it hard to fully participate.

Software engineer Swetha Machanavajhala poses with her parents in front of the Taj Mahal in India.
Microsoft software engineer Swetha Machanavajhala and her parents. Photo by Swetha Machanavajhala.

Microsoft employees say those twists and turns of innovation – aiming for A and ending up with a much broader B – have become more common at Microsoft in the five years since Satya Nadella was appointed chief executive officer.

Nadella’s immediate push to embolden employees to be more creative has been exemplified by the company’s annual hackathon. Machanavajhala and others say the event has helped spark a revival where employees feel energized to innovate year-round and to seek support from their managers for their ideas – even if those have nothing to do with their day jobs.

“The company has changed culturally,” Michael A. Cusumano, a professor at the Massachusetts Institute of Technology’s Sloan School of Management who wrote a book about Microsoft 20 years ago, recently told The New York Times. “Microsoft is an exciting place to work again.”

Chris Kauffman, a marketing manager in product licensing who has worked for Microsoft for 13 years, said Nadella’s focus on fostering collaboration was a turning point for her, as she noticed silos being torn down. Kauffman also realized the advent of artificial intelligence (AI) could help business people like her broach the realm of engineers and IT specialists. She and her team capitalized on both of those developments to create a chatbot and virtual colleague, answering thousands of licensing questions from around the world and helping to handle the accelerated pace of Azure cloud computing service updates.

“I went to my first hackathon three years ago and fell back in love with Microsoft,” Kauffman said. “I realized that I now have permission to talk to anyone I want to. I’m no longer limited by my job function or level. And my experience with the chatbot is a great example of how technology can be democratized and used by everybody.”

That new openness has led to an explosion in new products or fine-tuned improvements across Microsoft, for customers as well as for internal use. Employees say the resurgence is showing up both in product improvements and internal events such as TechFest, an annual showcase of Microsoft research that takes place in a few weeks.