
Robo-callers, robo-cops, robo-runners, robo-car crashes, and more

Find out more about Google Duplex, the Atlas robot, and what caused Uber's deadly accident

Roundup Here's a summary of this week's AI news, beyond what we've already covered.

Oh no, not another robo-caller The internet has been flooded with people raising questions about Google Duplex, an AI system that can supposedly make customer service calls on behalf of its human user.

CEO Sundar Pichai announced the new feature on stage at this year’s Google I/O developer conference. He even played a snippet of it trying to book a table at a restaurant. It’s weird. The voice isn’t bad, the dialogue isn’t bad either. But, there’s something about it that’s slightly off. There are weird pauses in the conversation, and its ‘umms’ and ‘ahhs’ make the whole thing uncanny.

Google Duplex is built on a recurrent neural network. Incoming speech is processed by an automatic speech recognition system, some magical natural language understanding is performed, and a text response is composed and converted back into speech.

The model apparently “takes features from the audio, the history of the conversation, the parameters of the conversation (e.g. the desired service for an appointment, or the current time of day) and more,” according to Google’s blog post.

“We trained our understanding model separately for each task, but leveraged the shared corpus across tasks.” So, it sounds like a different model is used for each task – booking a table at a restaurant or an appointment at a hairdresser’s, the only two examples Google has shown so far. It only works in very narrow cases.
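Google hasn’t released code or many details, but the flow it describes – speech recognition in, task-specific understanding over the conversation history and parameters, a text reply, then speech synthesis out – can be caricatured in a few lines. The sketch below is entirely our own invention, with stub functions standing in for the real models:

```python
# Toy sketch of a Duplex-style turn: audio in, text understanding, text
# response, audio out. Every component here is a stub of our own invention;
# a real system would slot in an ASR model, a per-task understanding model
# conditioned on the conversation history and task parameters, and a TTS engine.

def transcribe(audio_chunk):
    """Stand-in for automatic speech recognition (here the 'audio' is just bytes of text)."""
    return audio_chunk.decode("utf-8")

def understand(utterance, history, task_params):
    """Stand-in for the per-task understanding model: pick the next thing to say."""
    text = utterance.lower()
    if "what time" in text:
        return "How about {}?".format(task_params["time"])
    if "how many" in text:
        return "For {} people, please.".format(task_params["party_size"])
    return "Hi, I'd like to book a table, please."

def synthesise(text):
    """Stand-in for text-to-speech."""
    return text.encode("utf-8")

def handle_turn(audio_chunk, history, task_params):
    utterance = transcribe(audio_chunk)
    history.append(("them", utterance))
    reply = understand(utterance, history, task_params)
    history.append(("duplex", reply))
    return synthesise(reply)

params = {"time": "7pm on Friday", "party_size": 4}
history = []
for line in [b"Hello, restaurant here.", b"What time would you like?", b"And how many people?"]:
    print(handle_turn(line, history, params).decode("utf-8"))
```

The interesting – and unreleased – part is, of course, the understanding model in the middle; the per-task design Google describes is why it only copes with hair salons and restaurants for now.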

The technology probably isn’t yet advanced or reliable enough to be relied on. The clip played at Google I/O was a recording, and although CNET was given a sneak peek of the technology, Google refused to do a live demonstration.

Here are some good questions. Some have been answered and others have not.

  • Will the person know that the caller is a robot? Apparently, yes – sometimes.
  • What kind of data did Google use to train it? How did it anonymize the phone calls?
  • How does Google Duplex achieve perfect understanding to carry a phone call? It doesn’t – there will be a human handler to fall back on when the conversation fails, like Facebook’s dead M assistant.
  • How many humans will Google employ to pick up the pieces when Google Duplex goes awry?
  • How will it deal with accents and dialects?
  • Will it be rolled out in different languages?
  • Why would anyone use this?
  • How long until all calls are just two Google Duplexes speaking to one another?

One more Google AI thing All of Google’s various research groups, including Google Brain, have now been united under Google AI.

Face-recognition software falls flat for flatfoots Cops in Wales, UK, have admitted that their Automated Facial Recognition (AFR) technology – a tool that matches faces caught on camera against a database of known crooks and troublemakers – suffers from false positives.

While scanning Brits during the UEFA Champions League week in Cardiff in 2017, the AFR system chalked up 2,297 false positives versus 173 genuine matches – ie, it collared a load of people who weren't in the database. The police insist these are just alerts for officers, who, once they've made their way over to the person, can confirm whether the match is correct before taking any action.
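For a sense of scale, here's the arithmetic on those figures (ours, not the force's):

```python
# Rough arithmetic on the numbers above: 2,297 false positives against
# 173 genuine matches.
false_positives = 2297
true_matches = 173
total_alerts = false_positives + true_matches

print("Total alerts:", total_alerts)                                        # 2470
print("Share of alerts that were wrong: {:.1%}".format(false_positives / total_alerts))  # ~93.0%
```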

"Since we introduced the facial recognition technology no individual has been arrested where a false positive alert has led to an intervention and no members of the public have complained," the force said.

The high rate of false positives was blamed on:

  • Poor-quality images making up the watch list, which were supplied by UEFA and other partner agencies
  • An earlier algorithm from [tech supplier] NEC, which has since been updated and is more accurate, reducing false positives and collateral intrusion
  • It being the force's first major deployment

Robo-cop, it ain't.

DeepMind + Android = ? DeepMind is collaborating with Google Android to develop two new mobile phone features: Adaptive Battery and Adaptive Brightness.

Adaptive Battery uses machine learning to “anticipate which apps you’ll need next” in order to save battery life. Adaptive Brightness will use algorithms to adjust your phone’s screen brightness based on your surroundings, according to a blog post.
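DeepMind hasn’t said what the models look like, but “anticipate which apps you’ll need next” can be caricatured as a predictor trained on your app-launch history. The toy below – a simple count of which app tends to follow which – is purely illustrative and is not DeepMind’s method:

```python
from collections import Counter, defaultdict

# Toy next-app predictor: count which app tends to follow which in the
# launch history, then suggest the most common successor. Purely
# illustrative -- not what DeepMind or Android P actually ship.

def train(launch_history):
    transitions = defaultdict(Counter)
    for current_app, next_app in zip(launch_history, launch_history[1:]):
        transitions[current_app][next_app] += 1
    return transitions

def predict_next(transitions, current_app):
    followers = transitions.get(current_app)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

history = ["gmail", "calendar", "maps", "gmail", "calendar", "spotify",
           "gmail", "calendar", "maps"]
model = train(history)
print(predict_next(model, "calendar"))  # -> 'maps', its most frequent successor
```

Given a prediction like that, the OS can keep the likely next app ready and restrict background work for the rest, which is roughly where the battery saving would come from.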

It’ll be available to users running the Android P operating system. It’s all still pretty new, so DeepMind hasn’t revealed much about how it works, and didn’t give much away when we asked further questions.

Although both subsidiaries fall under the overarching Alphabet brand, they don’t work together often. Some of DeepMind’s work, such as WaveNet, has made it into Google’s text-to-speech systems, including Google Duplex. It’s nice to see DeepMind ship something practically useful rather than just hype.

Uber’s deadly software mistake Uber’s fatal self-driving car crash, in which an autonomous vehicle ran over a woman crossing the street at night, was caused by software issues.

The details aren’t entirely clear, but it seems the software used to process incoming sensor data and recognize objects had been tuned by Uber’s engineers. The sensitivity was dialed down to reduce “false positives,” aka objects that the car shouldn’t stop for, like plastic bags. Effectively, Uber's techies were fed up with the computer-controlled vehicles stopping for trash and other minor things in the road, so dialed down the AI to ignore them.

Unfortunately, that meant that the car failed to recognise a woman pushing her bike across the road, and slammed into her, according to The Information.
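Uber’s actual code isn’t public, but the trade-off being described is a familiar one: a detection confidence threshold. Raise it and the car ignores more phantom obstacles – set it too high and it starts ignoring real ones. A toy illustration with invented numbers:

```python
# Toy illustration of the sensitivity trade-off described above. Each
# 'detection' is (label, confidence score). Raising the threshold filters
# out low-confidence clutter like plastic bags -- and, if set too high,
# real obstacles too. The labels and scores are invented for illustration.

detections = [
    ("plastic bag", 0.22),
    ("plastic bag", 0.31),
    ("shadow", 0.18),
    ("pedestrian with bicycle", 0.58),
    ("parked car", 0.93),
]

def braking_candidates(dets, threshold):
    return [label for label, score in dets if score >= threshold]

print(braking_candidates(detections, threshold=0.15))  # brakes for everything, including the bags
print(braking_candidates(detections, threshold=0.40))  # ignores the bags, still sees the pedestrian
print(braking_candidates(detections, threshold=0.70))  # now the pedestrian is ignored too
```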

Software glitches are pretty common – check out the California DMV’s recent report on some of the issues reported by autonomous car companies.

Run, the robots are coming Boston Dynamics released its latest 30-second teaser, showing off its Atlas robot going for a run outdoors.

Atlas looks pretty impressive. It’s a humanoid droid complete with all the main human body parts: a head, two arms, a torso, and two legs. Previous videos showed it walking across a room and opening a door.

YouTube video

Now, it can run. Loud mechanical bleeps can be heard as Atlas goes for a jog on grass. Everyone likes to entertain the idea of a robot uprising with each new Boston Dynamics clip, but it’s not really clear how the robot works, or how well it performs. ®
