Mind-blowing demonstration of full-duplex conversation by Google Assistant
At Google I/O today, Google demonstrated how Google Assistant can not only place a phone call for you (that's old news) but also carry on a natural language conversation on your behalf. So now you can ask it to book a restaurant table and it will call the restaurant, wait for someone to answer, and hold a natural conversation with them to book the table.
Mind blown!
Watch some of the demos for yourself here:
So now I'm thinking: why don't they offer Google Assistant to businesses to answer the phone? Who needs to waste staff time answering the phone just to tell people your business hours or take reservations? Which raises the question: what happens when Google Assistant calls on behalf of a customer and Google Assistant answers on behalf of a business?
Well, then we have computers just talking to each other, and at that point is there any point in using speech at all? Google Assistant could just figure out it is talking to a computer and say "Oh hi computer, let's switch to talking data, okay?" and then they could squawk some JSON requests back and forth (or whatever).
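To make that concrete, here's a purely hypothetical sketch (plain Python; every message type and field name is made up for illustration, nothing here is a real Assistant or Duplex API) of what a bot-to-bot handoff from speech to structured data could look like:

```python
import json

# Purely hypothetical payloads; none of these field names come from any real
# Google Assistant/Duplex API. They just illustrate the "skip the speech,
# swap structured data" idea.

def build_handshake():
    # The calling bot announces itself and proposes switching to data.
    return json.dumps({
        "type": "bot_handshake",
        "agent": "assistant",
        "proposal": "switch_to_structured_data"
    })

def build_booking_request():
    # Once both sides agree they're bots, the reservation is one message.
    return json.dumps({
        "type": "booking_request",
        "party_size": 4,
        "date": "2018-05-12",
        "time": "19:30",
        "name": "Jane Doe",
        "phone": "+1-555-0123"
    })

def handle_message(raw):
    # The business-side bot replies in kind instead of synthesizing speech.
    msg = json.loads(raw)
    if msg["type"] == "bot_handshake":
        return json.dumps({"type": "bot_handshake_ack", "agent": "business"})
    if msg["type"] == "booking_request":
        return json.dumps({"type": "booking_confirmed", "time": msg["time"]})
    return json.dumps({"type": "error", "reason": "unknown message type"})

if __name__ == "__main__":
    print(handle_message(build_handshake()))
    print(handle_message(build_booking_request()))
```

Once both sides know they're bots, the whole reservation collapses into a couple of messages instead of a minute of synthesized small talk.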
The question is: can Google Assistant figure out when it is not talking to a human? After all, isn't the whole point of Google Assistant making calls for you that the other party doesn't actually know it is talking to a computer? That is, that it can pass the Turing Test? So if one Google Assistant can tell that the other Google Assistant is a computer, then the other one has failed the Turing Test. Also, I wonder how many humans would inadvertently fail the Turing Test?
In reality, I'm sure the computers would agree on some standard "tell", such as squawking some sub-frequency tone at each other, to voluntarily give that information away, but it's a fun thought experiment IMO.
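For what it's worth, that kind of "tell" is easy to imagine in code too. Here's another purely hypothetical sketch (Python with NumPy; the 150 Hz marker frequency, amplitudes, and thresholds are invented for illustration, not anything real phone systems or Duplex actually do) that mixes a quiet low-frequency tone into the outgoing audio and checks for it on the incoming side:

```python
import numpy as np

SAMPLE_RATE = 8000          # telephone-quality audio
MARKER_FREQ = 150.0         # hypothetical "I'm a bot" marker tone (Hz)

def add_marker_tone(audio, amplitude=0.02):
    # Mix a quiet marker tone into the outgoing speech signal.
    t = np.arange(len(audio)) / SAMPLE_RATE
    return audio + amplitude * np.sin(2 * np.pi * MARKER_FREQ * t)

def has_marker_tone(audio, threshold=5.0):
    # Check whether the marker frequency stands out from the rest of the spectrum.
    spectrum = np.abs(np.fft.rfft(audio))
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / SAMPLE_RATE)
    bin_idx = np.argmin(np.abs(freqs - MARKER_FREQ))
    return spectrum[bin_idx] > threshold * np.median(spectrum)

if __name__ == "__main__":
    # Fake one second of "speech" as noise, with and without the tell mixed in.
    speech = 0.1 * np.random.randn(SAMPLE_RATE)
    print(has_marker_tone(speech))                   # expected: False
    print(has_marker_tone(add_marker_tone(speech)))  # expected: True
```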