The Rise And Rise Of Omnipresent Digital Assistants

TechRadar recently asked 'what happens when our assistants are smarter than us?'. The author made a good point - that these assistants (Alexa, OK Google, Siri, Bixby, et al) are all still first-generation and are likely to improve - but mixed up the physical device (like the Echo or HomePod) with the digital assistant itself. At the moment, digital assistants are associated with particular devices, but once Apple's HomePod arrives and other companies integrate Alexa or OK Google into their products, consumers will start to understand that these assistants can be in their pocket and in their home, no longer tied to a single form factor.

But if these digital assistants are going to reach into every part of our lives, they will be far more useful if they understand the world around them as well as the commands we give them.

The author of the TechRadar piece makes the point that these assistants need to understand more complex things than just location. Don't get me wrong, location awareness is really important for lots of reasons, but there are other, more useful sensing capabilities.

The broad consumer technology sector talks up artificial intelligence, but if we want to create products capable of responding in human-like ways then we have to teach them to understand context and the world around them in the same way we do - through our senses. Products respond to our touch, computer vision technology can recognise us visually, companies like Owlstone are developing technology that can 'smell' and 'taste', and we have products that can respond to our voices. But there is so much more information that we can extract from the things we can hear but not see.

Understanding specific and significant audio events, or having a better understanding of an audio environment, allows the digital assistant - whether it is running on a smart speaker, phone, car, headphones or wearable - to react based on what it knows and what it believes you want or need. This means we have to build those local sensing capabilities into each environment and product, which in turn gives the digital assistant the information it needs to deliver its holistic, multi-device services.
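
To make that idea concrete, here is a minimal sketch (in Python, and purely hypothetical - the SoundEvent and EventBus names and the label strings are my own illustration, not any vendor's real API) of how each locally sensing device might report what it hears, as structured events, to the one assistant that spans all of them:

```python
import time
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class SoundEvent:
    label: str          # e.g. "sneeze", "baby_cry", "cooking_noise" (illustrative labels)
    confidence: float   # local classifier score, 0.0 to 1.0
    device: str         # which product heard it: speaker, car, wearable...
    timestamp: float = field(default_factory=time.time)

class EventBus:
    """Collects events from every local sensor for the shared assistant."""
    def __init__(self) -> None:
        self._subscribers: List[Callable[[SoundEvent], None]] = []

    def subscribe(self, handler: Callable[[SoundEvent], None]) -> None:
        self._subscribers.append(handler)

    def publish(self, event: SoundEvent) -> None:
        # Fan each locally detected event out to whatever the assistant has registered.
        for handler in self._subscribers:
            handler(event)

# The kitchen speaker's local detector hears a sneeze and reports it.
bus = EventBus()
bus.subscribe(lambda e: print(f"{e.device} heard '{e.label}' ({e.confidence:.0%})"))
bus.publish(SoundEvent("sneeze", confidence=0.93, device="kitchen-speaker"))
```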

For example:

  • If my personal assistant knows that I've been sneezing at home, in the car and at the office, knows that I'm about to head home, and knows that the pollen count is forecast to be high, then it will be able to turn on the HEPA filter at home and in the car before I arrive. And it will do all of this automatically.
  • If my assistant hears my baby crying in the night, it can bring the hallway lights on at a low level so that I don't trip, play a soothing lullaby through the smart speaker to help my baby back to sleep, and maybe buzz my wearable so that I can respond quicker (there's a rough sketch of this rule after the list).
  • Or perhaps my personal assistant knows that I'm cooking in the kitchen and automatically changes the sound equaliser of my smart speaker so that I can hear my music above the crashing and banging of pots without me having to ask it.
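
Here is a rough sketch of how the second example above - the crying baby - might look as an automation rule the assistant runs. Again this is hypothetical: the three device functions are stand-ins for whatever smart-home integrations the assistant actually controls, and the thresholds are illustrative.

```python
from datetime import datetime

def set_hallway_lights(brightness: float) -> None:
    # Hypothetical stand-in for a smart-lighting integration.
    print(f"Hallway lights on at {brightness:.0%} brightness")

def play_lullaby(room: str) -> None:
    # Hypothetical stand-in for the smart speaker's playback control.
    print(f"Playing a soothing lullaby on the {room} speaker")

def buzz_wearable() -> None:
    # Hypothetical stand-in for a wearable notification.
    print("Buzzing the wearable")

def on_sound_event(label: str, confidence: float, when: datetime) -> None:
    """React to a locally detected sound event, using time of day as context."""
    at_night = when.hour >= 22 or when.hour < 6
    if label == "baby_cry" and confidence > 0.8 and at_night:
        set_hallway_lights(0.15)   # low light so I don't trip
        play_lullaby("nursery")    # help the baby back to sleep
        buzz_wearable()            # so I can respond quicker

# The nursery monitor hears crying at half past two in the morning.
on_sound_event("baby_cry", confidence=0.92, when=datetime(2018, 1, 15, 2, 30))
```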

I think that as more context-sensing capabilities like hearing are embedded within devices, these cloud-based personal assistants will become more intelligent and more helpful. As a result, mainstream consumers (who are starting to embrace smart homes and smart speakers) will see how this technology can genuinely help us without attempting to take over the world or surpass our own intelligence.
