Since we last spoke (see original interview), you finished your MSc and have started your PhD in Conversational AI. What are you focusing on?
That’s right, I really enjoyed working on conversational systems during my MSc and came back for more! There is a great team of experts at Heriot-Watt so I’m very lucky to work alongside them.
I am investigating how to make conversational agents such as Siri, Alexa, Cortana and Google Assistant more natural to talk with. I am funded by The Data Lab and Wallscope, and we plan to apply the results of this research in the healthcare sector.
What’s wrong with current conversational systems?
Human conversations contain many phenomena that current systems ignore entirely and that often actually confuse them. We correct ourselves mid-sentence, talk over each other, ask questions like “what?” assuming the other person knows what we are asking, and give feedback while listening.
While someone is talking, we say things like “yeah”, “mhmm” and “really” without really thinking about it, but this feedback guides our conversations. It tells the person speaking that we are listening, and whether we understand what they are saying or are confused, which is really useful information. I will give you a link to a really good example conversation on YouTube that contains all of these things.
We don’t just give feedback verbally; we also smile, nod and screw up our faces for exactly the same reasons. If I told my grandma that I went to see Ice Cube, for example, she might screw up her face and I would then explain who he is. The interaction is short, but I changed what I was saying without my grandma uttering a word.
All of this information is currently ignored, but it really needs to be taken into account if these systems are to hold conversations as natural as ours.
You mentioned Healthcare, who will this help?
Good point. It may seem pointless to make these systems converse the way we do, which is why current ones ignore all the signals I mentioned. It becomes extremely important and worthwhile, however, when the users are not the “mass market”.
Our population is ageing very quickly. The National Records of Scotland projects that over the next 25 years the number of people aged 75 and over will increase by 79%, whereas the working-age population will increase by only 1%. This is startling when the limited number of caregivers is already stretched thin. Not only will the number of people in this age group increase, but they are increasingly living alone, and diagnosis rates of age-related diseases such as dementia are accelerating.
Most of us can adapt to current conversational systems, but this growing group in our ageing population struggles to; these systems need to learn to adapt to them instead.
Ok, so how exactly will your project help these people?
Loads of devices exist that make our lives easier, and these can also help people live in their own homes for longer. Smart doorbells, for example, are brilliant for those who live alone with severe arthritis. But with smart ovens, lights, heating and so on, each gaining regular new features, accessibility becomes difficult. Give someone a touchscreen with all of these functions and they will struggle to use it; now imagine handing it to an elderly person with Alzheimer's disease.
Once conversational agents become more human-like, all of these systems can be controlled in the most natural way possible: voice. This will lower the barriers around devices that can truly assist people.
With an ageing population and a limited number of caregivers, our system could make a real impact on social care.
Sounds great! As you’re funded by a company, will this all be private?
No, thankfully it won't, as I'm a big advocate of knowledge sharing and collaboration. Heriot-Watt, Wallscope and The Data Lab have agreed that everything will be released open source, which I am very excited about! In addition, I will release not only academic papers but also posts on Medium as I go along. These will range from discussions of how humans interact to computer vision, linked data and natural language processing.
And the last word has to come from David Eccles, Angus' boss and mentor!
"We first met Angus through Company Connecting and how the wheels have turned. Then he was younger and could not grow a beard. He has matured but needs to learn to shave. We have sent him back into Education to learn that and to improve his speech when he is drunk."
To be featured or find out more: