Helping translate signs and symbols into spoken language
Makaton is a language programme that uses signs and symbols to help people who have no speech or whose speech is unclear. It supports people with cognitive impairments, such as those associated with autism or Down syndrome, and people with neurological conditions, such as stroke survivors.
But care homes for adults with learning difficulties have high staff turnover and high staffing costs. Training staff to use Makaton can be expensive and time-consuming, and communication can become frustrating instead of engaging.
Electrical and Electronic Engineering students at Imperial College London worked with IBM experts and technology, and a care home for people with learning disabilities, to create Waldo. This cute and endearing prototype provides a streamlined, cost-effective way for care homes to provide high-quality care for Makaton users.
It translates Makaton signs into spoken words. Carers and users can press buttons on Waldo to quickly vocalise pre-set phrases. And Waldo's ambient sensors help carers and medical professionals understand the user's environment and mood.
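A minimal sketch of how the preset-phrase buttons might work: each button ID maps to a phrase that is handed to a text-to-speech engine. The button IDs, phrases, and the `speak` stub here are illustrative assumptions, not Waldo's actual configuration.

```python
# Hypothetical mapping of hardware button IDs to preset phrases.
# These IDs and phrases are illustrative, not Waldo's real configuration.
PRESET_PHRASES = {
    1: "Hello, how are you?",
    2: "I would like a drink, please.",
    3: "I need some help.",
}

def phrase_for_button(button_id):
    """Return the phrase to vocalise for a button press, or None if unmapped."""
    return PRESET_PHRASES.get(button_id)

def speak(text):
    # Stand-in for a text-to-speech call (e.g. IBM Watson Text to Speech).
    if text:
        print(text)

if __name__ == "__main__":
    speak(phrase_for_button(2))
```

In the real device, `speak` would send the text to the speech-synthesis service rather than print it.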
The students used IBM Watson Text to Speech on a Raspberry Pi as part of their project, running machine learning on an Nvidia Jetson Nano. Interactions are logged in a database on IBM Cloud and later analysed for patterns.
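The logging-and-analysis step could be sketched as follows. As an assumption for illustration, an in-memory list stands in for the IBM Cloud database, and the event names are made up; the idea is simply that timestamped interaction records accumulate and can later be mined for frequent patterns.

```python
from collections import Counter
from datetime import datetime, timezone

# In-memory stand-in for the cloud database of logged interactions.
interaction_log = []

def log_interaction(kind, detail):
    """Record one interaction event with a UTC timestamp."""
    interaction_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "kind": kind,       # e.g. "sign", "button", "sensor"
        "detail": detail,   # e.g. the recognised sign or chosen phrase
    })

def most_common_interactions(n=3):
    """Later analysis: find the most frequent interaction details."""
    return Counter(event["detail"] for event in interaction_log).most_common(n)

if __name__ == "__main__":
    log_interaction("sign", "drink")
    log_interaction("sign", "drink")
    log_interaction("button", "hello")
    print(most_common_interactions())
```

On the real system, each record would be written to the IBM Cloud database instead of a local list, and the analysis would run as a separate job over the stored history.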
Come and meet the development team - and say hello to Waldo, of course.