Emmett’s mom tells him every day about the impact he’s going to have on the world.
“I tell Emmett…you have accomplished more as a 4-year-old than I ever will in my entire adult life,” Breanne Connors said. “However much longer I get to live, there’s no way because he’s set the bar so high.”
Several years ago, Amazon’s accessibility team began visiting a Birmingham preschool that serves children with disabilities. There, they started thinking of ways Alexa, the company’s virtual assistant, could be more accessible for people like Emmett, who has speech and mobility disabilities.
“He’s got a very dark sense of humor, he loves chaos. Everyone thinks their kid is the greatest kid in the world, but it’s so humbling to have total strangers be like, yes, this kid’s got a story to tell, and identify that in the first five minutes of meeting him,” Breanne said. “He just locked eyes with Juliana and made this all happen on his own.”
Emmett cannot use Alexa by speaking or tapping on a tablet. After multiple visits to the school, the Amazon team, led by Juliana Tarpey, developed Eye Gaze on Alexa, which is referred to inside Amazon as “Project Emmett.”
“We understand the needs of people with speech or mobility disabilities are diverse, which is why we worked with speech-language pathologists as well as individuals like Emmett, who sparked the idea for us to build this functionality,” Tarpey said.
Eye Gaze on Alexa allows people to perform actions entirely hands- and voice-free, such as playing music and shows and controlling their home environment. The technology tracks where a user’s eyes are looking to determine which command to perform.
“Emmett sparked a lot of the initial ideas for the technology’s potential by showing our team how he could use his eyes to communicate with people. We thought, if people with mobility or speech disabilities could use their eyes to communicate directly with Alexa, it may give them more independence in doing everyday tasks like playing music or making calls, without having to rely on a caregiver,” Tarpey said.
Emmett’s school, the Hand in Hand early learning program, is run by United Ability, a local nonprofit that serves children between the ages of 6 weeks and 4 years old. About 40% of Hand in Hand’s students have a disability. Each of the school’s 10 classrooms is integrated so students of all abilities can learn and play together.
The Amazon team visited the school multiple times to work with students and the school’s team of physical, occupational and speech therapists and to troubleshoot the technology and adapt it to the different ways people manifest speech and mobility disabilities.
“I can’t stress enough to the Amazon team — you guys have no idea what it’s like to be on the other end of this waiting game. Insurance doesn’t wanna pay for anything. And that’s if you’re fortunate enough to have insurance coverage, and this Amazon technology is accessible to most people.”
Emmett’s parents got to sit in with the development team as they worked with their son. The parents asked questions, such as how long Emmett would need to hold his gaze for Alexa to perform an action, and the team asked for feedback about the design and how easy it was for the family to use.
“They were asking us how we can use this, where would we use this, what the benefit is. They really wanted to get as much feedback from us and kind of see it from our side of things as they could,” said Michael Connors, Emmett’s dad.
The Connors say the new program will change the way they’re able to communicate with their son and what he’s able to do at home.
“It’s just little things, like movie night: we rotate through who gets to pick the movie, and now Emmett can use Eye Gaze on Alexa to pick his movie. Or on days when he’s home from school, he can pick the music he wants to listen to. It just gives him a chance to have some independence and some choice in a lot of the day-to-day stuff that we have,” Michael said.
Talking about their son’s role in developing this technology and the impact it will have on people around the world with disabilities brings tears to Michael and Breanne’s eyes.
“It’ll never stop choking me up. I couldn’t have asked for a better thing to have happened from the situation we’re in. Just knowing the impact he could have — once this is out there, people will be able to use it forever,” Michael said.
The Eye Gaze technology was announced this fall and is available on the Fire Max 11 tablet. According to the company, the Amazon team will continue to adjust the product based on feedback from customers and partners, including United Ability.

“Our continued partnership with United Ability has been a wonderful experience, and we look forward to what we can do together in the future,” Tarpey said.