

Wednesday, 04 April 2018

 This week on the Beattie Voice podcast:

  • Alexa is integrated into the new Toyota Corolla
  • Paid search marketers learning how to deal with voice searches
  • How voice technology affects people with speech impediments
  • Transferring money with voice commands
  • Voice bug making iPhones less secure
  • Alexa introduces follow-up mode

Toyota Corolla - Alexa Integration

First up, Toyota have unveiled at the New York International Auto Show that they are to incorporate Alexa and Apple CarPlay in the 2019 Corolla.
They have incorporated pretty much every other type of advanced safety tech too: radar-assisted cruise control, lane tracing, wireless iPhone charging and tons of other features. The one thing that isn't there is Android Auto, but between Alexa and Apple there should be enough for most drivers.
This is really great. Research came out last week confirming that people use voice commands more when driving than anywhere else.
If people are interacting with Alexa built into the car, rather than messing about getting a phone to answer them, then it's got to be a good thing.
Yes, keeping drivers' eyes on the road has got to reduce accidents. I hope this becomes a standard that other car makers adopt.
The Verge

Voice Search and PPC

The Drum has a very insightful article about what the rise of voice search means for PPC. Essentially, neither Google nor Bing lets us know which searches are text based and which are voice.
To this end, marketers are having to go through search query reports and try to identify manually which queries are keyboard based and which are voice; with that knowledge they can then see how those users behave.
There must still be a lot of guesswork involved. I mean how can you actually tell?
Well, currently you can't, so marketers are looking for "near me" type terms or much longer tail questions.
So why is it important to separate these out? Is there any difference between the types of users?
Well, that's what everyone is trying to find out. Does your conversion rate go up or down on a voice search in Google? How does click-through rate change, and does that affect Quality Score? And knowing that, do you include or exclude the voice search indicators as negative keywords?
When you think about it, it's really opaque that these big PPC platforms don't give marketers this information natively, without all the guesswork. Is there information they don't want us to see?
That's a big question. In the meantime we'll need to keep examining these things manually to get data, until AdWords and Bing actually open up the info.
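That manual sifting of search query reports can be sketched in a few lines. This is a minimal heuristic based only on the signals mentioned above ("near me" terms and much longer-tail questions); the function name, question-word list and length threshold are our own illustrative assumptions, not anything AdWords or Bing expose:

```python
# Heuristic sketch for flagging likely voice queries in a search query report.
# Thresholds and helper names are illustrative assumptions only.

QUESTION_WORDS = ("who", "what", "when", "where", "why", "how", "which")

def looks_like_voice(query: str, long_tail_words: int = 6) -> bool:
    """Return True if the query shows typical voice-search signals."""
    q = query.lower().strip()
    if "near me" in q:
        return True
    words = q.split()
    # Voice queries tend to be full questions rather than keyword fragments.
    if words and words[0] in QUESTION_WORDS and len(words) >= 4:
        return True
    # Much longer-tail queries are another common voice indicator.
    return len(words) >= long_tail_words

report = [
    "plumber near me",
    "cheap flights paris",
    "what time does the bank open on saturday",
    "best waterproof walking boots for wide feet uk",
]
likely_voice = [q for q in report if looks_like_voice(q)]
```

Once queries are bucketed like this, conversion rate and click-through rate can be compared between the two groups, which is exactly the guesswork the segment describes.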
The Drum

Voice Search - What About Those With Speech Impediments?

OK, and speaking of identifying voice searches, what if you have a speech impediment?
Yes, Gizmodo has been examining this. For all the talk about voice search and voice assistants, there is a small percentage of the population left frustrated by it all because of conditions or disabilities like a stammer.
It's not something you would think is that difficult for voice search platforms to deal with; there must be programming they could put in place to recognise the patterns and react appropriately. For the people who do have a stammer, though, the future is stretching away from them and kind of leaving them behind.
Yes, especially with car manufacturers like Toyota, as we mentioned earlier, building it into cars as standard now - it's not just phones and standalone voice control units that you choose to own. It really looks like it is going to be everywhere, and unless the technology is set up to allow maximum accessibility we are going to have real problems for a lot of people.
Let's hope that the boffins are working on making voice much more accessible. 

Money Transfers by Voice

Now, in a previous podcast we mentioned a bank that was pioneering checking your balance by voice. Well, what about actually sending payments quickly and easily by voice?
Well, that's exactly what Google Assistant is introducing at the moment. You can tell Google Assistant to pay someone using Google Pay, and after verifying your ID with your fingerprint or PIN, it will make the payment. Both users need to be Google Pay users; if the recipient is not, they will need to sign up to receive the payment.
I think this is a big step forward, and another step that takes banking services away from the banks and into the hands of the big online providers, kind of like PayPal did a few years ago.
So Adam, I think you should test this by sending £100 to Daniela Young.
Yeah, hang on a minute, it's only on Google Assistant so far. Still has to come to Google Home. Maybe we'll test it when that happens.
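The flow described above boils down to two checks before any money moves: identity verification, then a Google Pay account on the receiving end. Here's a minimal sketch of that logic; the function and parameter names are hypothetical illustrations, not Google's API:

```python
# Toy model of the voice payment flow: verify identity first, then confirm
# the recipient can actually receive the money. Names are our own assumptions.

def send_by_voice(amount_gbp: float,
                  id_verified: bool,
                  recipient_has_google_pay: bool) -> str:
    """Return the outcome of a 'pay someone' voice request."""
    if not id_verified:  # fingerprint or PIN check did not pass
        return "blocked: verify with fingerprint or PIN"
    if not recipient_has_google_pay:
        # Payment is held until the recipient signs up, per the segment above.
        return "pending: recipient must sign up to Google Pay"
    return "sent"
```

The ordering matters: verification happens before any money is committed, which is what makes a voice-triggered payment acceptable from a security standpoint.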
Business Insider

Apple Voice Bug

And talking about personal assistants, Apple have been through the wringer this week with Siri.
So apparently users can choose to hide messages that arrive so they can't be read from the lock screen. Sounds like standard security.
Well, it should be standard, but apparently by just saying "Siri, read hidden messages" to the locked phone, Siri would oblige and spill all the beans.
OK, this would annoy Apple users a bit then?
You bet. There has been a lot of chatter online about it and no-one is hugely happy about it. Apart from Google.
What are Apple doing about it?
Well, according to Apple, "We are aware of the issue and it will be addressed in an upcoming software update." It's a pretty vague answer that doesn't say whether it's the next OS update or a subsequent one. Until then, iPhone users are going to have to keep an eye on other people talking to their phones.
The Verge

Amazon Alexa Introduces Follow Up Mode

Finally, have you ever found yourself a bit frustrated when you ask Alexa to do something then have to wake her up immediately afterwards to do the next thing?
Yes! Usually I keep speaking to it, not realising that it is not listening to a word I am saying.
OK, well Amazon have released an update that they are calling Follow-up mode. This means that you can make multiple requests to Alexa without having to say "Alexa" for every request.
When the mode is enabled, you'll see Alexa's blue ring stay lit for five seconds after the last command completes. As long as you start the new request within that time, you won't need to say "Alexa" again.
OK, it is simple really, but surprising it has taken them this long to get round to it.
It really is simple. It works for standalone commands, but hasn't got the hang of nested commands yet. So you can say "Alexa, what's the weather in Edinburgh?" followed by "What's the weather in London?", but you can't say "What's the weather in Edinburgh... and in London?"
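The behaviour described above amounts to a timeout on the wake word. Here's a toy model assuming only what's in the segment - a five-second window while the blue ring stays lit - with names that are ours, not Amazon's:

```python
# Toy model of Follow-up mode: after a command completes, the device keeps
# listening for a fixed window, so a request started inside it needs no wake
# word. The 5-second figure comes from the discussion above; the rest is an
# illustrative assumption, not Amazon's implementation.

FOLLOW_UP_WINDOW = 5.0  # seconds the blue ring stays lit

def needs_wake_word(seconds_since_last_command: float,
                    follow_up_enabled: bool) -> bool:
    """Does the next request have to start with 'Alexa'?"""
    if not follow_up_enabled:
        return True  # classic behaviour: every request needs the wake word
    return seconds_since_last_command > FOLLOW_UP_WINDOW
```

Note this models only the timing; the standalone-versus-nested command limitation is about how requests are parsed, not about the window itself.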
The Verge

On that note, I've been Daniela Young.
And I'm still Adam Christie. Thanks for listening and remember to subscribe if you want to catch every episode.