
AI is cool and all, but maybe get it right?

It’s 2023. You can’t buy an ICEE without an AI in it. Every device is touting AI features, led on this side of the phone divide by Google/Alphabet.

Google Home still occasionally takes a routine exactly as it’s written and returns a different result. Google support still starts by asking you to factory reset every device and burn your home to the ground before ever admitting that all of this is handled on their side and nothing you do with a smart speaker has any effect.

I’ve got the Pixel 7 Pro, and it’s got a chip that was advertised as using on-board AI for personalized speech recognition. When I got the phone it had the dumbs, and within a week it pretty much had my speech down. Unfortunately, for my use cases there’s another step that’s just not there, and that’s how the apps I’m using handle speech.

So what appears to be happening is that the AI on Google’s Tensor G2 chip translates my request – yay. It does that extremely well now, significantly better than a much faster-feeling, older phone of mine did. The AI has learned my speech, but the back-end integration is missing.

I listen to a lot of news podcasts. This means I’m skipping advertising quite often, skipping items I’ve heard before that are being offered up again, etc. Saying “OK Google, skip ahead 20 seconds” is a fairly common thing and… unfortunately, not handled on device. The speech becomes a text command, it goes out into the great unknown, gets processed by Google Assistant in the cloud (I’m assuming), comes back with what needs to happen on your device, and triggers an action. We’re processing off the phone, unless I’m missing something.

I say this because the text is recognized, yet the skip itself lands anywhere from five seconds after I ask to never. As far as I can tell it’s doing what I described above: going somewhere else, depending on cloud services that don’t need to be involved, and probably yet another AI. The result, when driving through areas with different bandwidth, is inconsistent execution time.
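For reference, the app-side piece is pretty thin. Here’s a minimal sketch of how a podcast app receives a media command through Android’s MediaSessionCompat callbacks once the speech has already been parsed. The `Player` interface is a made-up placeholder, and exactly how Assistant maps “skip ahead 20 seconds” onto these callbacks is my assumption; the point is that everything below runs locally, which is why the round trip before it ever fires is so frustrating.

```kotlin
import android.support.v4.media.session.MediaSessionCompat

// Hypothetical playback engine, just so the sketch stands on its own.
interface Player {
    val currentPositionMs: Long
    fun seekTo(positionMs: Long)
}

// Media commands from Assistant get routed to the app's active MediaSession.
// Everything in these callbacks executes on the phone; the slow part is
// whatever happens before they are ever invoked.
class PodcastSessionCallback(private val player: Player) : MediaSessionCompat.Callback() {

    // An absolute seek (the target position computed somewhere upstream).
    override fun onSeekTo(pos: Long) {
        player.seekTo(pos)
    }

    // A relative jump; 20 seconds mirrors the voice command in question.
    override fun onFastForward() {
        player.seekTo(player.currentPositionMs + 20_000)
    }
}
```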

Also, Google’s steadfast refusal to log anything that could be of use to a consumer attempting to debug a problem is driving me nuts.

It seems like if you’ve got an onboard AI… execute on board. Follow up with a log to a server somewhere if you want, but wasn’t the idea to put more of the AI on the phone than in the cloud? That seemed to be the entire push of the Tensor chips for a while… or maybe that was just marketing.
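If I were sketching that “execute on board, log afterwards” idea, it’s nothing more exotic than doing the action synchronously and treating the telemetry as fire-and-forget. A rough sketch, where `uploadLog` is a made-up placeholder and not any real Google endpoint:

```kotlin
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.launch

// Made-up telemetry call, only here so the sketch is self-contained.
suspend fun uploadLog(event: String, detailMs: Long) { /* POST to some logging endpoint */ }

// Do the user-visible thing immediately, then log in the background.
// A dead or slow connection delays the log, never the skip.
fun executeLocallyThenLog(
    skipMs: Long,
    seek: (Long) -> Unit,
    logScope: CoroutineScope = CoroutineScope(Dispatchers.IO)
) {
    seek(skipMs)              // on-device action, no round trip
    logScope.launch {         // fire-and-forget telemetry
        runCatching { uploadLog("skip_ahead", skipMs) }
    }
}
```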

On the other end of the future-AI-overlords spectrum, maybe ChatGPT could stop writing “in conclusion” in every article it produces. Just sayin’…

Pocketables does not accept targeted advertising, phony guest posts, paid reviews, etc. Help us keep it this way with support on Patreon!

Paul E King

Paul King started with GoodAndEVO in 2011, which merged with Pocketables, and as of 2018 he’s evidently the owner. He lives in Nashville, works at a film production company, and is married with two kids.
