For the second year in a row, the National Science Foundation (NSF) has chosen to feature research and innovations from the Polytechnic Institute of New York University (NYU-Poly) Mechanical and Aerospace Engineering Department at the USA Science and Engineering Festival. This year it’s an app for iOS and Android that lets users control robots through voice activation and other simple interfaces.
Here’s how the Science and Engineering Festival describes the display:
“Learn about advances in human-robot interaction technology through engaging and educational projects, such as mobile robots, a robotic manipulator, a smart home, and laboratory automation testbeds. Experience innovative uses of pervasive communication and computing devices, such as iPhone and iPad, to intuitively interact with physical objects by exploiting on-board sensors, rich graphics, touch, gesture, and sound recognition capabilities of iPhone and iPad. Human-robot interaction will feature a robotic manipulator following movements of a human-operator wearing a smart jacket.”
The NSF-funded research effort is geared toward providing technology solutions for people with disabilities.
Vikram Kapila, the NYU-Poly professor who leads the app project, explains:
“If robots are to be pervasive, they will have to be easy to use. The iPhone and Apple’s other mobile devices are so intuitive that children and adults alike don’t need instructions to use them. Couple that with their built-in sensors, and you have the perfect platform upon which to build interfaces to interact with robots.”
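To make that idea concrete, here’s a minimal sketch (not NYU-Poly’s actual code) of how a phone’s built-in motion sensors could serve as a robot-control interface: device tilt is read via CoreMotion and mapped to drive commands. The robot’s address and the JSON command format are assumptions for illustration only.

```swift
import CoreMotion
import Foundation

// Minimal sketch: use the phone's built-in motion sensors as a robot
// control interface. Pitch drives forward speed; roll drives turn rate.
let motionManager = CMMotionManager()

func startTiltControl() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 0.1  // sample at 10 Hz
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // Tilt the phone away from you to drive forward, tilt sideways to turn.
        let forward = -attitude.pitch  // radians; negated so tilting away is positive
        let turn = attitude.roll
        sendToRobot(forward: forward, turn: turn)
    }
}

// Hypothetical transport: POST the command as JSON to the robot's address.
func sendToRobot(forward: Double, turn: Double) {
    guard let url = URL(string: "http://192.168.1.50/drive") else { return }
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = "{\"forward\": \(forward), \"turn\": \(turn)}".data(using: .utf8)
    URLSession.shared.dataTask(with: request).resume()
}
```

The appeal of this approach is exactly what Kapila describes: the sensors and the touch interface come built in, so the robot interface requires no extra hardware on the user’s side.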
Check out this video on the first USA Science and Engineering Festival:
Here are a couple of images, via Flickr, featuring Vikram’s iLabArm and iLabBot apps: