6 Surprises From 6 Months With Alexa
At Untapped, we use a lot of different research techniques to help clients find out how people actually use and integrate new products and emerging technologies into their lives, and what attributes and features can optimise their experience. So, it’s interesting to reflect on my personal experience of this integration process with one of the “technologies of the moment” – the Amazon Echo.
All the predictions point to voice assistants being at a tipping point, about to become ubiquitous in our lives (https://disruptionhub.com/2018-voice-based-assistants/). Some research even estimates that over 50% of consumers in Germany, the UK and France already use voice assistants.
It’s now been 6 months since I got an Amazon Echo. While voice activation feels both instinctive and immediate, these are the surprises about how I’m actually getting on with Alexa…
- About 90% of how I use it is as a voice-activated speaker when I’m busy
Alexa is my cooking companion. It’s hard to scroll through Spotify when your hands are covered in chopped tomatoes, so this is where she really comes into her own. I like being able to ask for a playlist, a radio station or even music suggestions as I’m preparing food, without having to stop. But interestingly, I still route the request through Spotify (“Alexa, play me X from Spotify”) rather than just use the Amazon music portal. Partly because I want it to be a seamless extension of my phone (where I use Spotify), and partly because I trust their music algorithm more.
- I use it less than I thought to do simple searches on news, weather, information
If I need to find something out quickly, my first instinct is still to reach for my phone, not call Alexa. Why? I think I still miss the screen. Somehow, I don’t take in, retain or trust spoken information as much as visual information on my phone. If I ask, “When does my local Sainsbury’s store close?”, I find myself saying, “Hang on, what did she say? What time does the store close?” It’s probably because the answer you hear contains a whole list of opening and closing times, so the pertinent information is lost amongst things you didn’t ask for. This is the kind of filler information that your eyes filter out when you look at a screen – you very quickly sort and focus. It’s much harder to do that when you hear things.
- It’s really good at giving reminders
Although it’s not as instinctive as a phone for “looking up” information, it is far more instinctive when I need to remind myself to do something later. When I suddenly remember that I must call someone later or take the laundry out of the machine, it’s far easier and more intuitive to ask Alexa to remind me than it is to bring up the calendar app on my phone and type it in. It’s that instinctive, in-the-moment way you’d ask someone to remind you to do something – only now, that someone is Alexa. And she genuinely never forgets (unlike, say, a distracted partner or teenager!)
- I do use it as a shopping list compiler but still take the list to the actual store…
Just as Alexa is great for those “before I forget” reminders, it’s also really good for compiling shopping lists. So, if you’re cooking and use up the last of the herbs, or if you’re just walking round the kitchen trying to think of what you need, telling Alexa is easy, instinctive and convenient. But then I take Alexa’s list (via my phone) to the store, rather than ordering directly through an online grocery store. Partly this is because I’ve usually left it too late to wait for a delivery, but partly also because I still feel a lack of trust – my spoken list has just “disappeared” into the ether, and if I leave it to Alexa to sort it out with Ocado/Tesco/Waitrose, will they get it wrong?
- I get way more frustrated with it than I do with my phone
OK, so I’m quite impatient by nature and I hate negotiating with any form of technology or product! But when my phone freezes or buffers, I generally just shrug it off as an inevitable glitch of technology and wait for it to reset. But when Alexa freezes, or churns out some nonsense in response to my question, I get genuinely quite annoyed, to the point where I speak to it crossly! This illustrates to me again the subconscious difference between a “human” voice request and a typed “machine command” to a phone. Somehow, by verbally asking Alexa to do something, it’s far more frustrating when she lets you down. Ridiculous, I know, but I have been known to tell Alexa that she’s useless and that she’s annoying me (to which she apologises, by the way… which really doesn’t help!)
- I know it can learn from me to become more useful, but I still haven’t invested in teaching it (yet)
I get reminders and emails about new Alexa skills. I get encouragement to have Alexa tell me jokes, or give me fun facts (Why? Who has time for this?). I know that the more I use it and interact with it, the more “honed” to me it can become. But in reality, I’m not yet ready for Alexa to proactively give me information or recommendations. I just want it to react to my command or need in that moment. Perhaps this will change when I can see that teaching it makes a meaningful difference to getting a chore done or simplifying my life. But for now, making the reward worth the effort still feels some way off.
One final insight – clearly, subconsciously, I can’t make up my mind whether the Echo is an “it” (technology) or Alexa (a human)! It can depend on my mood or the task I’m addressing. At Untapped, as we work more and more with emerging technologies, we often hear people express this human–tech “split personality”. The most seamless tech products – the ones people bond with most strongly – will be those that offer the power of technology with just the right amount of intuitive, human elements.