On the 5th of December, we received the invitation below to participate in the AI for accessibility challenge here in Australia:
“You may have already heard that we have launched the Microsoft Australia AI for Accessibility Challenge and I wanted to reach out to you personally to invite you to take part. I know you and your team can bring some inspiring ideas and innovative thinking to this challenge and I hope you’ll get involved.
There are no limits to what we can achieve when technology reflects the diversity of all Australians. With close to four million Australians living with some form of disability, we all have an opportunity – and a responsibility – to build products and services that are designed to help people of all abilities to thrive.
The Microsoft Australia AI for Accessibility Challenge is focused on creating a more inclusive culture through the use of AI developed technology. The aim is to accelerate the development of accessible and intelligent AI solutions.”
As the Azure lead for Data#3, I was asked for input on our decision to participate. For me this was a simple “Yes”. Although we are not a software development company, we know software, the human application of technology, and Azure-hosted AI solutions.
One day later, we received the green light from senior management to step away from billable work for a period of time to develop an idea. Our goal was to find the most impactful and suitable solution, one that would dramatically improve quality of life for people living with a specific impairment.
On the 12th of January, we sent out a Data#3 internal email asking for team members and ideas.
“The challenge is open to both technical and non-technical individuals, for this initial phase we are looking for an idea. This is where you can add the most value, the idea has to have a significant human impact and greatly enable someone with a particular impairment. You do not need to know how to design or build the solution to be part of the idea, you just need to think about a statement like the below example.
Example: We could build an application that can read hand gesture-based sign language and convert that to text or audio. This would allow a person who can only communicate using sign language to converse with a person who does not understand sign language. The second person could reply with voice, and the app would convert that to text for the first person to read.
Once we have selected an idea based on the degree of impact and benefit then we will move into stage 2 and develop a prototype which will be presented to Microsoft.”
We created a Microsoft Teams site to start collating ideas and participants and got to work. Our initial challenge was that our team came up with 7 compelling and applicable ideas, but we could only select one! We kicked off a decision tree poll and ranked each idea in order of applicability, complexity, inclusion, impact, benefit and risk.
On the 11th of February, we had our idea – the sign language translator. Arguably the most difficult idea and solution, but nothing worth doing is easy, right?
Through Teams we gathered each week to ideate and formulate our stage 2 submission for judging. With full support and understanding from Data#3, and without impacting the delivery of our current customer engagements, we proudly developed a finalised presentation that we submitted just two minutes before the 5pm closing deadline on the 15th of March!
On the 22nd of March, we were notified that we had successfully moved to stage 3! This was a huge moment for us as that meant presenting our idea alongside 9 other submissions to a judging panel at Microsoft in Sydney. We had just 7 days to refresh our content, organise travel from Brisbane and shuffle meetings.
For the initial submission, I used a video of a person signing “Translate” to demonstrate how a sign-only speaker is excluded when trying to communicate with someone who does not know sign language. Without an understanding of sign, the message is impossible to follow, and that was my main point: for those without a voice, communication is not bi-directional. The key problem we aimed to solve was opening up effective two-way communication between people with and without the impairment. The next slide showed the same video with a text and vocal overlay, demonstrating how AI can be applied to bridge the gap.
When I reviewed the content, I felt that we could do better than a mockup. So, our dev team started working on an actual translation prototype. Using 10-year-old spare technology, our team built a working five-word demo with a Windows 7 laptop (no SHA driver with USB 2.0 support exists for anything later), an Xbox 360 Kinect, and the generation 1 Kinect developer kit.
The time from start to a working demo was around six hours, with the bulk of it spent sourcing a Kinect and adapter cable, finding an old device that could run Windows 7, getting the drivers and SDK configured, and writing the code to translate the body gestures.
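To give a flavour of how a small-vocabulary gesture demo like this can work, here is a minimal sketch that classifies a sign by comparing skeleton joint positions against stored templates. The joint names, the five-word vocabulary, and the nearest-neighbour approach are my assumptions for illustration – this is not the actual Data#3 prototype code, which was built against the Kinect v1 SDK.

```python
# Hypothetical sketch: nearest-template matching of skeleton joint
# positions against a tiny sign vocabulary. Coordinates are illustrative
# 2D positions relative to the spine; a real Kinect feed supplies 3D joints.
import math

# One captured "template" pose per sign in the vocabulary (assumed data).
TEMPLATES = {
    "hello":     {"right_hand": (0.30, 0.50), "left_hand": (-0.20, -0.40)},
    "thankyou":  {"right_hand": (0.10, 0.30), "left_hand": (-0.10, 0.30)},
    "yes":       {"right_hand": (0.20, 0.10), "left_hand": (-0.20, -0.40)},
    "no":        {"right_hand": (0.25, 0.35), "left_hand": (-0.25, 0.35)},
    "translate": {"right_hand": (0.40, 0.20), "left_hand": (-0.40, 0.20)},
}

def pose_distance(frame, template):
    """Sum of Euclidean distances between corresponding joints."""
    return sum(math.dist(frame[joint], template[joint]) for joint in template)

def classify(frame, threshold=0.5):
    """Return the closest template's word, or None if nothing is near enough."""
    word, best = min(
        ((w, pose_distance(frame, t)) for w, t in TEMPLATES.items()),
        key=lambda pair: pair[1],
    )
    return word if best < threshold else None
```

In practice the prototype would classify every tracked frame, so a distance threshold matters: a frame that matches no template should yield nothing rather than the least-bad guess.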
I was blown away that legacy tech could perform rudimentary translation without using Azure AI at all. We had recently heard about the new Azure Kinect, a massive generational leap over the 360 with far better hand-tip tracking, and we grew even more confident and excited that we could build a fully working translator using today’s AI technology: computer vision for image processing and sign detection, and language translation to assemble the detected signs into English sentences.
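The envisioned pipeline has two halves: per-frame sign detection, and stitching the detections into readable English. The sketch below shows only the second half under simple assumptions – consecutive duplicate frame labels are collapsed into a gloss sequence, then naively rendered as a sentence. The function names and the crude join-and-capitalise rendering are illustrative; a real system would use a proper translation model, since sign language grammar does not map word-for-word onto English.

```python
# Hypothetical sketch: turning per-frame sign detections into rough English.
# `labels` is the stream of classifier outputs, one per video frame,
# where None means "no sign recognised in this frame".

def glosses_from_frames(labels):
    """Collapse consecutive duplicate frame labels, dropping None gaps."""
    glosses = []
    for label in labels:
        if label is not None and (not glosses or glosses[-1] != label):
            glosses.append(label)
    return glosses

def to_english(glosses):
    """Naive rendering: join the glosses and punctuate as a sentence."""
    if not glosses:
        return ""
    text = " ".join(glosses)
    return text[0].upper() + text[1:] + "."
```

Even this toy version shows why frame-level smoothing matters: a sign held for thirty frames must become one word, not thirty.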
We submitted our slide deck the day before the pitch day and flew down to Sydney to prepare for the event.
The day – 3rd of April
We were the second group to present, and the entire experience was by far the highlight of my IT career. Never before have I had the privilege of meeting so many like-minded technologists who have taken a problem, applied empathy, and built out an idea that will increase the quality of life for many of the 1 in 6 Australians who live with an impairment. The level of passion, energy and empathy was incredible, and I applaud each of the other participants immensely.
For me personally, I learned a lot about the beautiful language of sign, and as a personal challenge I will continue teaching myself Auslan. I would also love to see a translator available to everyone.
This is not the end for us: we are planning our global submission for the idea, and this is also an opportunity for you, the reader. The event was not a one-off, and you can submit an idea to assist those with an impairment at any time. Follow the hashtag #AIForAll, or contact one of us at Data#3 or Microsoft directly. If your idea has merit, impact and a sprinkle of AI, then who knows? You could receive assistance building a solution that improves the lives of people living with an impairment.
Tags: Artificial Intelligence (AI), Data#3 Community, Microsoft, Microsoft Azure