"Talking to Machines"
What can machines tell us about being human? This hour of Radiolab, Jad and Robert meet humans and robots who are trying to connect, and blur the line.
Episode One of Season Ten

"Talking to Machines"

What can machines tell us about being human? This hour of Radiolab, Jad and Robert meet humans and robots who are trying to connect, and blur the line.

Episode One of Season Ten

What can take scientists years to figure out can take a computer just one day — researchers at Cornell University have developed an evolutionary computing system called Eureqa that extracts laws of nature from data at unheard-of rates. This is a significant departure from the normal scientific method, in which a hypothesis is proposed to explain an existing observation. With Eureqa, scientists feed the system raw data, and it infers laws and relationships from there; in one notable instance, the computer independently discovered the law of conservation of energy.
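Eureqa's own code isn't shown here, and its real search evolves the form of the equations themselves, but the core idea — keep a population of candidate formulas, score each against the data, and mutate the best scorers into the next generation — can be sketched in a few lines of Python. Everything below is a toy illustration, not Eureqa's actual algorithm; for brevity it evolves only the coefficients of a fixed linear form.

```python
import random

# Toy illustration of Eureqa-style evolutionary search (not Eureqa's actual
# algorithm): maintain a population of candidate formulas, score each against
# the data, and mutate the fittest into the next generation.

# "Observed" data, secretly generated by the law y = 3x + 2.
data = [(x, 3 * x + 2) for x in range(-5, 6)]

def error(a, b):
    """Sum of squared errors of the candidate y = a*x + b over the data."""
    return sum((a * x + b - y) ** 2 for x, y in data)

def evolve(generations=300, pop_size=30, seed=0):
    rng = random.Random(seed)
    # Start from random coefficient guesses.
    pop = [(rng.uniform(-10, 10), rng.uniform(-10, 10)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: error(*c))   # best candidates first
        survivors = pop[: pop_size // 2]
        # Refill the population with mutated copies of the survivors.
        pop = survivors + [(a + rng.gauss(0, 0.5), b + rng.gauss(0, 0.5))
                           for a, b in survivors]
    return min(pop, key=lambda c: error(*c))

a, b = evolve()
print(a, b)  # typically lands near a = 3, b = 2
```

The computer never sees the formula; it "rediscovers" something close to y = 3x + 2 purely by trial, scoring, and mutation — which, scaled up enormously, is the spirit of what Eureqa does with physical data.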

“Simon” is a robot at the Georgia Institute of Technology that’s involved in a series of projects designed to look at the interaction between robots and humans. Recently, researchers have found that they can program Simon to understand when it has a human’s attention, and when it doesn’t, with nearly 80 percent accuracy.

Basically, Simon can tell if a person is ignoring him (no indication, incidentally, of whether or not this bothers him). This form of social intelligence will be crucial as we continue on our path to robot-human cohabitation. Imagine a world full of oblivious robots, performing duties that humans don’t notice! It’s both sad and terrifying.

Aaron Bobick, professor and chair of the School of Interactive Computing at Georgia Tech: “Other human beings understand turn-taking. They understand that if I make some indication, they’ll turn and face someone when they want to engage with them and they won’t when they don’t want to engage with them. In order for these robots to work with us effectively, they have to obey these same kinds of social conventions, which means they have to perceive the same thing humans perceive in determining how to abide by those conventions.”

(via caseypugh)

Like the astronaut phoning in from outer space, we too can follow along online!

Q: What is RoboEarth?
A: At its core, RoboEarth is a World Wide Web for robots.
“M2M. What is it? Simply put, it’s machines rather than people connecting to the Internet; in fact, there are already more non-human users than human users on both the AT&T and Verizon networks.”

Renee Oricchio, Inc.com

NELL (Never Ending Language Learner) is a computer system at Carnegie Mellon University that uses the Internet to teach itself English. It reads the Web 24 hours a day, seven days a week, learning language like a human would — cumulatively, over a long period of time. It parses text on the Internet for ontological categories, like “plants,” “music” and “sports teams,” then uses contextual clues to sort out what things belong in which categories, like “Nirvana is a grunge band” and “Peyton Manning plays for the Indianapolis Colts.”

Follow NELL’s self-taught discoveries via Twitter.
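NELL's actual pipeline is far more sophisticated, but the basic move — matching textual patterns against sentences to sort names into categories — can be sketched with a couple of regular expressions. The sentences and patterns below are invented for illustration and are not NELL's real data or code.

```python
import re

# Toy illustration of NELL-style fact extraction (not NELL's actual system):
# match simple sentence patterns to sort names into ontological categories,
# the way "Nirvana is a grunge band" places Nirvana in the "band" category.

# Hypothetical mini-corpus standing in for text read off the Web.
sentences = [
    "Nirvana is a grunge band",
    "Peyton Manning plays for the Indianapolis Colts",
    "The oak is a deciduous tree",
]

# Each entry pairs a sentence shape with a category (or captures it by name).
patterns = [
    (re.compile(r"^(?P<name>.+?) is an? .*\b(?P<cat>band|tree)$"), None),
    (re.compile(r"^(?P<name>.+?) plays for the .+$"), "athlete"),
]

def extract(sentences):
    """Return a {category: set of names} mapping found by the patterns."""
    found = {}
    for s in sentences:
        for regex, fixed_cat in patterns:
            m = regex.match(s)
            if m:
                cat = fixed_cat or m.group("cat")
                found.setdefault(cat, set()).add(m.group("name"))
    return found

facts = extract(sentences)
print(facts)  # e.g. {'band': {'Nirvana'}, 'athlete': {'Peyton Manning'}, ...}
```

The real NELL goes a crucial step further: it uses the facts it has already learned to propose new extraction patterns, which is what lets it keep improving "cumulatively, over a long period of time."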

Anybots are $15,000 telepresence robots — personal avatars, if you will — that can glide around the place, remotely chatting with people and attending conferences for their navigators. Here, a scene from our not-too-distant future: a robot ordering a scone in a coffee shop in Mountain View, CA, being cute with a barista, a bow-tie and tasseled Tibetan tote bag strung to its mechanical neck. Men in line look on with obvious bemusement. It toddles off down the street. The world keeps on turning.