Assistive Technology

The Future of AAC and XR

A boy wearing an Augmented Reality headset.

If you have no idea what the term AAC means, you aren’t alone. AAC is not in most people’s consciousness, even though it should be. AAC stands for Augmentative and Alternative Communication. This means people use ways other than speech to communicate.

AAC has been around for centuries in one form or another, like gestures and sign language, but modern AAC has developed over the past one hundred years or so. AAC as a physical device came about around one hundred years ago, when the first communication board was created. Over the subsequent decades, different methods were developed to allow people to communicate either with assistance (aided) or on their own (unaided). Electronic AAC devices began evolving in the 1960s and have really come into their own in the past ten years.

The Current State of AAC

An iPad in a red case running the AAC program Proloquo2Go
The AAC program Proloquo2Go on an iPad.

There are dozens if not hundreds of options for AAC these days. You can use the traditional AAC style boards or go with a myriad of electronic or computer-based programs. In fact, you can run AAC right on your iPhone or iPad. With the advent of the internet, communicating in different ways has become easier but still not ideal for everyone.

Challenges with AAC have been multi-faceted: supporting sufficient vocabulary, effective access with or without support, physical adaptations, learning curves, and social and cultural acceptance.

AAC of the Now and Future

One of the challenges of AAC for many is the fact that you need to carry a board, tablet, or other computing device and operate it by touch, electronic switches, or eye tracking from a distance. This works well but can be problematic for some people or in some situations. Say you are walking and carrying things but need to speak using your iPad. Or you have a motor disability, use a wheelchair, and access a tablet with a keyguard, but that access can be slow and frustrating.

There is a new wave of computing that is using BCI—brain computer interface—to give more people access to computer control to speed up and enhance communication. AAC is taking advantage of this in creative ways.

A boy and his mother trying on the Cognixion One AAC headset. A man is adjusting the headset on the boy's head.
My son and wife trying the Cognixion One AAC headset.

One up-and-coming company is Cognixion. They have created an augmented reality (AR) headset that includes BCI via EEG (electroencephalography) sensors. The Cognixion One uses custom EEG sensors along with an AR visor to display the AAC application to the user hands-free.

AR projects 2D or 3D imagery in front of the eyes as a heads-up display so the user can see both the information and their surroundings at the same time.

EEG uses electrodes on the outside of the head to sense electrical brain activity. These have traditionally been used for neurological evaluations in the medical field. Over the past few decades, researchers have studied the use of EEG to control computers—brain computer interface. This turns the brain into a switch of sorts, giving the ability to control a computing device without a keyboard or mouse. Once the brain is trained, it can work like a hands-free keyboard and mouse.
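The "brain as a switch" idea can be sketched in code. The following is a minimal, hypothetical illustration (not any vendor's actual API, and the band, threshold, and sample rate are made-up stand-ins): compute the power of an EEG channel in a frequency band and fire a switch event when sustained activity crosses a per-user calibrated threshold.

```python
# Hypothetical sketch of an EEG "brain switch": band power crossing a
# calibrated threshold is treated as a switch press. All names and
# numbers are illustrative assumptions, not a real device's interface.
import numpy as np

SAMPLE_RATE = 256      # Hz, a typical consumer EEG headset rate (assumed)
WINDOW_SECONDS = 1.0
THRESHOLD = 2.0        # multiplier over baseline, calibrated during training

def band_power(samples, low_hz, high_hz, rate=SAMPLE_RATE):
    """Average spectral power of one EEG channel in [low_hz, high_hz]."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    mask = (freqs >= low_hz) & (freqs <= high_hz)
    return spectrum[mask].mean()

def brain_switch(samples, baseline):
    """Treat sustained alpha-band (8-12 Hz) activity as a switch press."""
    power = band_power(samples, 8, 12)
    return power > THRESHOLD * baseline

# Simulated use: background noise versus a pronounced 10 Hz rhythm.
t = np.arange(int(SAMPLE_RATE * WINDOW_SECONDS)) / SAMPLE_RATE
rng = np.random.default_rng(0)
noise = rng.normal(0, 1, t.size)
baseline = band_power(noise, 8, 12)
active = noise + 5 * np.sin(2 * np.pi * 10 * t)

print(brain_switch(noise, baseline))    # quiet signal: no switch event
print(brain_switch(active, baseline))   # strong rhythm: switch fires
```

Real BCI systems use far more sophisticated signal processing and machine learning, but the core loop is the same: train a personal baseline, then map detected brain states to input events such as key presses or cursor moves.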

This addresses two of the challenges of traditional AAC: carrying a device and physical access. Cognixion is also rethinking the user experience of AAC in the headset.

My son and wife got to try the Cognixion One. My son, who has cerebral palsy, communicates with minimal speech and relies on sign language, gestures, and an iPad with the program Proloquo2Go. Using something like the Cognixion One could open up more options for him to connect with others.

Imagine the Possibilities

This raises the question I’ve been asking myself for a while: how do people with disabilities engage in XR (extended reality—augmented and virtual reality) and the newfangled metaverse? While many of the common AAC methods could work in AR, few, if any, have been adapted to the medium. This needs to happen and needs to happen now.


What if everyone was able to communicate with each other seamlessly? That is no longer a pipe dream. We have the technology. We have the willingness. We even have some of the social and cultural support. We just need it all to come together. I feel this will happen in the coming years and decades, if not sooner.

Also See

Let’s Make XR Accessible

XR Access

XR – eXtended Reality – has been around for decades. Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) allow for much more three-dimensional and realistic interaction with computing systems. As with most technologies, they have not been designed for accessibility from the start. But can they be used to enable interaction with people of all abilities?

Why should XR be accessible?

The better question is: why shouldn’t XR be accessible? Accessibility is a human right. It is also the law in many countries, including the United States, with laws such as the Americans with Disabilities Act and the Telecommunications Act.

XR, including AR and VR, came out of varying needs for 2D and 3D spatial computing over the years. Simulation, training, operations, and maintenance are a few of the common uses of XR. Gaming is also a common use of VR in the consumer space, dating back to systems like the Nintendo Virtual Boy in the 1990s and made more mainstream by the HTC Vive and Oculus Rift in the 2010s.

AR started with the Virtual Fixtures system for the US Air Force, and earlier “heads up displays” (HUDs) are also predecessors to the modern AR headset. AR has evolved into both 2D and 3D with HUDs such as Google Glass, Microsoft HoloLens, and Magic Leap One. Phones and tablets can also run AR using cameras to detect and overlay imagery over the world.

Evaluating this holistically, XR is a perfect fit for making the world more accessible to everyone. Uses such as communication, training, navigation, remote access, and content creation can provide enhanced and alternative ways for anyone to participate.

We also need to make sure XR experiences are accessible for everyone. VR games should be accessible. AR navigation apps should be accessible. MR remote 3D design platforms should be accessible. It should not be an afterthought.

The XR Access Symposium

In July 2019, I attended the XR Access Symposium in New York City. This group of about 100 people across academia, industry, and government got together at the Cornell Tech campus on Roosevelt Island for a day of presentations, demos, and breakout groups.

The room of about 100 people who attended the XR Access Symposium in July 2019

The plenary sessions got everyone thinking, and the ten breakout groups allowed us to brainstorm the initial set of goals that the XR Access initiative would consider. There were demonstrations of current XR tech, which was new to much of the audience. The mix of content, devices, and people enabled unique conversations that led to the next steps for XR Access.

How we make XR accessible

The goal for XR Access is to engage with the community, create guidelines, and influence policy for accessibility in XR. The focus is to support software, hardware, and content so that this technology is built from the ground up for all, not application by application.

To achieve this goal, we have organized the XR Access initiative into working groups:

  • Guidelines & Policies
  • Awareness
  • Education
  • Hardware Devices
  • Content & Authoring

We have about 150 participants across the groups and an executive team of about a dozen leaders. I am the Lead for the Hardware Devices working group and am excited to engage with our community partners on this effort. I am also looking forward to working across all the groups to meet the initiative’s goals.

There are some efforts in progress, including W3C’s XAUR and WebXR, along with OpenXR and the XR Association, that will feed into our work. We are not starting from scratch but acknowledge that a more coordinated and public effort needs to be made.

Our next symposium is slated for sometime this summer. Please get involved by joining a working group or attending a community gathering.

Accessible Gaming: The Xbox Adaptive Controller

Xbox Adaptive Controller and Logitech kit

In September 2018, Microsoft released its Xbox Adaptive Controller to the world. This new controller provides alternative ways for gamers to interact with both Xbox and Windows games. The controller has its own built-in controls for some of the main buttons, along with inputs for up to 19 switches using the standard 3.5mm jack and two USB ports.

Xbox Adaptive Controller and Logitech kit

Microsoft started development of the Xbox Adaptive Controller way back in 2015, and it took three years of iterations and collaboration to get to production. The company worked with gamers to develop and tweak the design, and it ended up being compatible with a wide array of switches, joysticks, and mounts and is massively customizable.

My son playing on the Xbox Adaptive Controller

Both of my boys love gaming. We got an Xbox One S last year, and while my older son was quick to adapt to the new system, my younger son had challenges. He has cerebral palsy and had trouble using the standard Xbox controller for more complex games like NHL ’18. He loves hockey and really wants to play with his brother. So for Christmas we got the Xbox Adaptive Controller along with the brand new Logitech Adaptive Gaming Kit.

The whole point of devices like this is to give people the choice of modifying input methods to fit their needs. You can use the base controller and a few 3.5mm switches, or a whole array of switches along with USB joysticks. There are switches for wheelchairs, hand mounts, foot controls, mouth controls, and more. You can remap the inputs any way you like. This is what true accessibility is about.
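To make the remapping idea concrete, here is a minimal, hypothetical sketch (plain Python, not Microsoft's actual Xbox Accessories software; the jack names and actions are invented for illustration) of binding physical switch jacks to game actions so a layout can be tailored to one player's reach and motor control.

```python
# Illustrative sketch of switch remapping, not a real controller API.
# Each 3.5mm jack (name assumed) maps to a game action the player chooses.
DEFAULT_LAYOUT = {
    "jack_x1": "A",             # large button under the stronger hand
    "jack_x2": "B",
    "jack_x3": "left_trigger",
}

def remap(layout, jack, action):
    """Return a new layout with one jack bound to a different action."""
    updated = dict(layout)
    updated[jack] = action
    return updated

def handle_switch(layout, jack):
    """Translate a physical switch press into the mapped game action."""
    return layout.get(jack, "unmapped")

# Example: move the shoot action to a foot switch for easier access.
custom = remap(DEFAULT_LAYOUT, "jack_x3", "shoot")
print(handle_switch(custom, "jack_x3"))   # the foot switch now shoots
```

The design point is that the mapping lives in configuration, not in the hardware: any switch can drive any input, which is what lets one controller serve wheelchair switches, hand mounts, foot controls, and mouth controls alike.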

The Xbox Adaptive Controller gives my son the ability to interact with the games in his own way. The larger buttons of the Adaptive Controller are easier for him to push. The customizable switches allow us to space out additional button controls in a more accessible layout. The last step is to add some USB joysticks—this will allow complete control in NHL ’18, for example.

Microsoft is the only major game system maker that has done this in a comprehensive way, though adaptive controllers have been around in various forms for decades. My hope is that the success of Microsoft’s effort will trickle over to the other companies. I am looking at you, Sony, Nintendo, and Apple.

One great effect of these new adaptive devices is their eventual adaptation (*ahem*) for other uses, including PC control and an area I am currently working in, XR Access: accessibility for virtual reality, augmented reality, and mixed reality systems, aka eXtended Reality (XR).