The Turing Test: Microsoft’s Artificial Intelligence Meltdown

Computer scientist Alan Turing devised a conversational test for machine intelligence that has remained a benchmark for over sixty years. But as humanoid robots grow more sophisticated every day, can any yet fool us into thinking they are human?

From robots like Atlas at Boston Dynamics, built to withstand physical bullying, to the uncanny silicon-and-hair humanoids developed at Hanson Robotics, we take a look at some of the most advanced androids ever created. Yet imitating simple conversation can still be a test too far, as Microsoft discovered after the widely publicised failure of its Twitter chatbot, Tay. Alistair Charlton, senior tech reporter for the International Business Times, argues that the industry has more successful options to pursue. “Tay has much more to offer than a Twitter account”, he says.

ITN Productions – Ref. 6736

Journeyman Pictures is your independent source for the world’s most powerful films, exploring the burning issues of today. We represent stories from the world’s top producers, with brand new content coming in all the time. On our channel you’ll find outstanding and controversial journalism covering almost any global subject you can imagine.

27 Comments

  1. Meh. By the time they convincingly look like us and act like us, we'll be so much like robots that we won't care anymore. Just as AI is changing to become more like us, we're changing to become more like it. I don't think this is intentional; the same thing happens with most of our technology, over long periods of time. It might be more apparent to older or disadvantaged people.

  2. An AI is basically an ultimate child, without the physical limitations imposed by biological bodies to hamper its ability to self-learn. The problem is, it's no different from a human child in the sense that it's only as good as the data it's fed. Tay was simply the result of insane Nazis being the overrepresented, loudmouthed minority flooding its input data stream.

Leave a Reply

...