When you think of robots, what comes to mind? Do you think of small, mechanical robots, like the kind you can build with various hobby kits? Perhaps you think of more sophisticated robots like the NASA Mars rovers or those used by the U.S. military for bomb disposal. Science fiction envisions highly sophisticated robots that still look like machines, such as Robby the Robot from Forbidden Planet (1956); Gort from The Day the Earth Stood Still (1951); and R2-D2 and C-3PO from Star Wars (1977). More interestingly, science fiction also offers us robots that look and act like humans, such as Data from Star Trek: The Next Generation (1987), Bishop from Aliens (1986) and the replicants of Blade Runner (1982). While many humanoid robots populate the world of science fiction, are they anywhere close to a reality yet?
As the science of robotics advances, researchers are trying to build robots with more noticeably human traits, from their behavior to their appearance. This build-a-synthetic-human approach stems in part from the desire for humans and robots to interact easily. For example, it would be easier to give a robot a voice command than to type a command on a keyboard. Similarly, a robot could interpret our intentions simply by reading our body language.
Which human traits might we put into a robot, and what are the current realities? That's what we will examine in this article. Let's begin with touch.
10: I Need a Robot With a Soft Touch
Robotic hands use motors to supply the force required to grip an object or tool. But can a robot grip a delicate object like a potato chip or a butterfly's wing without damaging it? Robotic hands are not yet sensitive enough. However, Dr. Lei Zhai, of the University of Central Florida, is using an aerogel as a sensor (see sidebar). Zhai and his colleagues developed an aerogel made of carbon nanotubes. To make it, the nanotubes are teased apart until the material is mostly air, yet it retains its strength, flexibility and electrical conductivity. The result is a strong, wide, flexible contact surface that doubles as an electrical sensor, giving a robotic hand a large sensing area that can detect small changes in pressure [source: Danigellis].
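To picture how such a sensor might be used, here's a minimal, hypothetical control loop for a pressure-sensing gripper. The resistance-to-pressure mapping and every constant in it are invented for illustration -- this is not Zhai's actual device or software, just the general idea of easing off the motors when the sensor reports rising pressure.

```python
# Hypothetical gripper loop: the conductive aerogel's resistance is assumed
# to drop as it is compressed, so falling resistance signals rising pressure.
# All constants and the mapping itself are illustrative, not measured values.

def pressure_from_resistance(ohms: float, baseline: float = 1000.0) -> float:
    """Map sensor resistance to a 0..1 pressure estimate (0 = no touch)."""
    return max(0.0, (baseline - ohms) / baseline)

def adjust_grip(force: float, ohms: float, fragile_limit: float = 0.2) -> float:
    """Back off the motor force before a delicate object is crushed."""
    if pressure_from_resistance(ohms) > fragile_limit:
        force *= 0.5  # ease up rather than squeeze harder
    return force

print(adjust_grip(force=1.0, ohms=700.0))  # pressure 0.3 > 0.2, so force drops to 0.5
```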
Next, let's see how robots move.
9: Robots Should Move Like People
If you reach for a pencil, your hand and arm make a smooth movement. You don't even think about it. To coordinate that movement, sensory neurons in your shoulder, arm and hand send information to your cerebellum, which processes it and sends commands through motor neurons to the muscles of those limbs. This continuous interplay produces the smooth movement you experience. Without it, you would have jerky movements, like patients with chorea, Parkinson's disease or cerebellar disorders. Many robots move with these same jerky motions because the complexity of smooth movement is difficult to program.
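What counts as "smooth" can actually be written down. One classic model of human reaching, the minimum-jerk profile of Flash and Hogan, is widely used in robotics to generate human-like motion. The sketch below illustrates that generic profile; it is not the Georgia Tech motion-capture method described next.

```python
import numpy as np

# Minimum-jerk position profile (Flash & Hogan, 1985):
#   x(t) = x0 + (xf - x0) * (10 s^3 - 15 s^4 + 6 s^5), with s = t / T.
# Velocity and acceleration are zero at both ends, so the motion starts
# and stops gently instead of jerking.

def minimum_jerk(x0: float, xf: float, duration: float, steps: int = 100):
    t = np.linspace(0.0, duration, steps)
    s = t / duration
    x = x0 + (xf - x0) * (10 * s**3 - 15 * s**4 + 6 * s**5)
    return t, x

# Example: sweep a joint from 0 to 90 degrees over 2 seconds.
times, angles = minimum_jerk(0.0, 90.0, 2.0)
print(angles[0], angles[-1])  # starts at 0.0, ends exactly at 90.0
```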
Recently, at Georgia Tech, Dr. Andrea Thomaz and her student Michael Gielniak filmed human movements using motion-capture technology. They analyzed the captured images and figured out how to program robots to have smoother, more human-like movements [source: RobAid].
Along similar lines, NASA and General Motors have developed "robonauts" with human-like hands and arms for using tools. Controlled through virtual reality technology, they can work alongside astronauts in outer space and on planetary missions, taking on jobs in places that are too dangerous for humans.
Next, let's take a look at female robots.
8: Designing Women
Psychologically, many people respond more warmly to women than to men, so why not make female robots? In the movie The Stepford Wives (1975), husbands in an elite community replace their wives with duplicate robotic counterparts. Can we build a robot with female qualities? WowWee, the creator of the toy robot Robosapien, has developed a female robot called Femisapien. She has a definite female shape and can sing, dance and blow kisses.
Robotic scientists in Japan, meanwhile, have created two sophisticated female robots of their own. Professor Hiroshi Ishiguro, of Osaka University, built Geminoid F, a robot modeled after a woman in her 20s. Geminoid F is remotely operated, and she can mimic human facial expressions as well as talk. She has participated in news broadcasts and even a play [source: Dillow].
Similarly, scientists from Japan's National Institute of Advanced Industrial Science and Technology, along with Yamaha, created a dancing female robot called HRP-4C. She can walk, move her arms, make facial expressions, sing and dance [source: Katz].
We've examined touch, movement and robot femininity. Now let's take a closer look at robot faces.
7: A Look Can Say It All
Much of human communication occurs without speaking. Humans use body language and facial expressions to communicate emotions and information. So, for robots and humans to interact effectively, robots should be able to read human facial expressions as well as produce them.
Many laboratories around the world are working on developing robots that can create facial expressions. We have already mentioned that the female robots Geminoid F and HRP-4C can make facial expressions. Other groups work with robotic heads or upper bodies. Researchers at Waseda University in Tokyo have built what they term an Emotion Expression Humanoid Robot, designated WE-4RII. The robot consists of a head with a face, eyes and mouth that can be manipulated with wires and motors. It also has shoulders, arms and a torso, which let it use gestures as part of an emotional display. The range of emotions it can express includes happiness, fear, surprise, anger, sadness and even disgust [source: Waseda University].
Now, let's move ahead to see if robots can respond to language.
6: Talk to Me
Many robots interact with humans via computer terminals or hand-held devices. But people use language to communicate, not keystrokes. So, for better human-robot interactions, robots should be able to understand language. In Star Wars, C-3PO was "fluent in over 6 million forms of communication," from binary to Ssi-ruuvi, the alien language of the Ssi-ruuk. While the fictional C-3PO pulled it off with the help of scriptwriters, what does it really take to get robots to understand human language?
According to Nicholas Roy's laboratory at MIT, a robot must have the following (a toy sketch of the full pipeline appears after the list):
- A speech recognition system that receives and sequences the words spoken to it
- A dialog management system that infers the intent of a user's command
- A robotic control system that can carry out the user's commands
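Here's a toy version of how those three pieces might fit together. The stages mirror the list above, but the tokenizing "recognizer," the keyword-matching intent stage and the command names are stand-ins invented for this sketch -- not the actual MIT software.

```python
from dataclasses import dataclass

@dataclass
class Command:
    action: str
    target: str

def recognize(audio: str) -> list:
    """Stage 1, speech recognition: turn speech into a word sequence.
    Here a plain string stands in for audio."""
    return audio.lower().split()

def infer_intent(words: list) -> Command:
    """Stage 2, dialog management: infer what the user wants done."""
    actions = {"pick", "goto", "stop"}
    action = next((w for w in words if w in actions), "stop")
    target = words[-1] if words else ""
    return Command(action, target)

def execute(cmd: Command) -> None:
    """Stage 3, robotic control: dispatch the command to the actuators."""
    print(f"executing {cmd.action} -> {cmd.target}")

execute(infer_intent(recognize("Pick up the red block")))
# prints: executing pick -> block
```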
Roy's lab has developed a robotic micro-helicopter, a robotic forklift and a robotic wheelchair that can follow spoken human directions.
Researchers at Japan's Waseda University, meanwhile, are going beyond responding to commands: they are developing a robot that can mimic human speech. The robot, called WT-4, can talk, hear and imitate sounds. It even has vocal cords, a tongue, a soft palate, a nose and lips.
Next up -- can robots express emotions?
5: Giving Robots Emotions
As we have seen, most robotics research is aimed at improving the interactions between humans and robots. By nature, humans are emotional creatures, but can robots also experience emotions? One emotion being researched for robotic application is empathy. Humans can empathize with one another, but to empathize, you need to know or feel the state of another person's mind -- to put yourself in his or her place. Can a robot empathize with a human?
Dr. Hideki Kozima, of Kyoto University in Japan, is working on a project called Carebots that has developed two robots, Infanoid and Keepon, that can interact with their human caregivers. Kozima uses the robots to model and understand how social communication occurs in humans and primates. Like infants, the robots convey emotions by eye contact and by joint attention -- where both subjects look at some event and then at each other. Infanoid is an upper-body robot whose head has a face, eyes and lips. The robot analyzes human sounds, but cannot understand language. Kozima studies the interaction of the robots with researchers and with infant children, and he is developing a model of robot empathy [sources: The Carebots Project, Kozima].
Next, because everyone likes to laugh, let's find out if a robot can tell a joke.
4: A Robot, a Pastry Chef and a Veterinarian Walk into a Bar ...
Humor is another hallmark of being human. We tell jokes and laugh often. But can a robot tell a joke or understand one? In Star Trek: The Next Generation, Commander Data had a hard time understanding humor; he even programmed a holodeck stand-up comic to help him. Well, to study human-robot interactions, a Carnegie Mellon University (CMU) graduate student has programmed a robot to do stand-up comedy.
Heather Knight runs Marilyn Monrobot Labs in New York and is a graduate student at CMU in Pittsburgh, Pa. She has programmed a Nao robot (created by Aldebaran Robotics) to perform stand-up comedy. The robot has a stored bank of jokes. It selects and tells a joke and then receives feedback from the audience through a camera and WiFi setup. Audience members show differently colored panels depending on whether they like the joke, and the robot also gauges the noise level of the room (such as the amount of laughter). If the audience likes the joke, the robot selects another one along the same lines; if not, it tries a different kind. To watch a video of the robot's stand-up, visit the source link [source: RobAid].
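Knight's feedback loop boils down to a simple select-observe-adapt pattern. In the sketch below, the joke bank, the topics and the scoring rule are all invented for illustration -- this shows the pattern, not Knight's actual code.

```python
import random

# Tiny joke bank, grouped by topic. Positive feedback keeps the act
# on the current topic; negative feedback makes the robot switch.
jokes = {
    "robots": ["Why did the robot go back to school? Its skills were rusty."],
    "science": ["I would tell you a physics joke, but it has no potential."],
}

def audience_score(green_panels: int, red_panels: int, laughter: float) -> float:
    """Combine panel votes with the room's laughter level into one score."""
    return (green_panels - red_panels) + laughter

topic = random.choice(list(jokes))
for _ in range(3):
    print(random.choice(jokes[topic]))  # tell a joke on the current topic
    score = audience_score(green_panels=2, red_panels=5, laughter=0.5)
    if score <= 0:  # the joke bombed...
        topic = random.choice([t for t in jokes if t != topic])  # ...switch topics
```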
Now that we know a robot can work a crowd, let's see if one can think.
3: A Thinking Robot
Right now, robots do pretty much what they're programmed to do. However, if robots are to actively help humans, and become more like them, they must be able to think on their own. Enter the science of cognitive robotics, a field that strives to give robots the ability to learn and reason in complex, changing environments, adapting their behavior as conditions change in order to complete a task. Such robots would be useful for assisting people with disabilities, astronauts and others.
To this end, robotic scientist Hod Lipson, of Cornell University, has developed robots that are able to teach themselves. One robot that resembles a starfish has taught itself to walk: it continuously builds an internal model of its own body and uses that model to plan its gait. Even when "injured" (by having one of its legs removed), the starfish robot updates its self-model and learns to walk again on its own. In computer simulations and hardware robots, Lipson has demonstrated how robots can evolve and replicate. Check out the source link to see a video of his robots [source: TED].
Next, let's find out if robots are moral.
2: Can a Robot Tell Right From Wrong?
In the movie I, Robot (2004), the lead character, Detective Spooner, is trapped in a flooding car in a river. Another car next to his is also sinking, with a child inside. A robot comes to the rescue and must make a choice: Which person should it save, the older detective or the little girl? The robot, calculating that the grown man has the better odds of survival, saves Spooner instead of the child. The incident leaves Spooner resenting robots.
This example demonstrates the kind of complex moral situations that might occur as we develop increasingly autonomous robots. Can they make moral or ethical decisions? If not, what would prevent a robot from committing murder? Science fiction author Isaac Asimov envisioned this type of scenario; in his writings, he devised several Laws of Robotics to deal with such issues (see sidebar). However, as autonomous robots are used in war, other ethical questions arise. Does a robot know to kill an enemy combatant but not an innocent civilian or a surrendering soldier? Does it follow the Geneva Conventions? With such thorny issues in mind, it is important that robots learn right from wrong.
Little has been done to address the ethics of robots, mainly because robots are not truly autonomous yet. However, bioethics philosopher Wendell Wallach, of Yale University, has raised several of these questions and suggested approaches by which machines could be given morals [source: Wallach].
Finally, in our study of human traits we'd like to see in future robots, let's examine whether or not they can be self-aware.
1: I Think, Therefore I Am
Robots are becoming increasingly complex. A major goal is to make a fully autonomous robot, and a key feature of such a robot is the cognitive capacity known as self-awareness. A fully autonomous robot should be able to recognize itself as distinct from other robots, objects or people. But how do you define self-awareness? Opinions vary.
One criterion for self-awareness is the ability to recognize one's own image in a mirror or photo. Drs. Kevin Gold and Brian Scassellati, of Yale University, developed a self-awareness algorithm for a robot named Nico: the robot compares its own movements with the motion it sees in a mirror, and a close match tells it that it is looking at itself. Similarly, Dr. Junichi Takeno, at Meiji University in Japan, built a robot called Egobot that can also recognize itself in a mirror.
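The core of that mirror logic can be sketched in a few lines. The robot moves at random moments and checks how closely the motion it sees tracks its own commands; the 90 percent threshold and the simulated "vision" below are invented for this toy version, which simplifies the idea rather than reproducing Gold and Scassellati's actual algorithm.

```python
import random

def observed_motion(moving_now: bool, is_self: bool) -> bool:
    """Stand-in for vision: a mirror image moves exactly when the robot does,
    while another agent moves independently of the robot's commands."""
    return moving_now if is_self else random.random() < 0.5

def looks_like_self(is_self: bool, trials: int = 100) -> bool:
    matches = 0
    for _ in range(trials):
        moving = random.random() < 0.5  # randomly wiggle or hold still
        if observed_motion(moving, is_self) == moving:
            matches += 1
    return matches / trials > 0.9       # near-perfect tracking means "self"

print(looks_like_self(is_self=True))   # True: the mirror always tracks us
print(looks_like_self(is_self=False))  # almost surely False (~50% matches)
```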
Those seem like great developments. However, Drs. Rony Novianto and Mary-Anne Williams, of the University of Technology, Sydney, argue that the mirror test is not a reliable indicator of self-awareness. They point out that mammals are generally accepted as self-aware animals, yet only a few species (humans, other primates, dolphins and elephants) can recognize themselves in a mirror. The two have instead proposed a framework for developing self-awareness in a robot based on its ability to attend to its own internal state. So robot scientists continue to work on defining self-awareness and on building software that can achieve it.
Lots More Information
Related Articles
- Are robots intelligent?
- Do robots have rights?
- Are robots alive?
- Is artificial intelligence dangerous?
- How does the Nao robot help autistic children?
Sources
- Aid, R. "Robot in the wild project lead to robotic stand-up comedian named Data." Jan 23, 2011. http://www.robaid.com/robotics/robot-in-the-wild-project-lead-to-robotic-stand-up-comedian-named-data.htm
- Aid, R. "Teaching robots to move more human-like." Mar 8, 2011. http://www.robaid.com/robotics/teaching-robots-to-move-more-human-like.htm
- Berns, K. and J. Hirth. "Control of facial expressions in the humanoid robot head." ROMAN, IEEE/RSJ International Conference on Intelligent Robots and Systems. 2006. pp. 3119-3124. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.159.2832&rep=rep1&type=pdf
- Carnegie Science Center. Robot Hall of Fame. http://www.robothalloffame.org/index.html
- Choi, C.Q. "Automaton, Know Thyself: Robots Become Self-Aware." Scientific American.com. Feb. 24, 2011. https://www.scientificamerican.com/article.cfm?id=automaton-robots-become-self-aware (May 2, 2011)
- Choi, C.Q. "Not Tonight, Dear, I Have to Reboot!" Scientific American. March 2008. http://www.scientificamerican.com/article.cfm?id=not-tonight-dear-i-have-to-reboot (May 3, 2011)
- Dillow, C. "Japanese Geminoid F Bot Realistically Mimics Human Facial Expressions, Speech." Popular Science. April 5, 2010. http://www.popsci.com/technology/article/2010-04/remotely-operated-geminoid-f-bot-realistically-mimics-facial-expressions-speech
- Doshi, F. and N. Roy. "Spoken Language Interaction with Model Uncertainty: an Adaptive Human-Robot Interaction System." Connection Science, 20. pp. 299-318. 2008. http://people.csail.mit.edu/finale/papers/connection_science_08.pdf
- Fong, T. et al. "A Survey of Socially Interactive Robots, Robotics and Autonomous Systems 42 (2003)." pp. 143–166. http://www.societyofrobots.com/robottheory/Survey_of_Socially_Interactive_Robots.pdf
- Fellous, J.M., M.A. Arbib (eds.). "Who Needs Emotions? The Brain Meets the Robot." Oxford University Press. New York. 2005. p. 399. http://newplans.net/RDB/Who%20Needs%20Emotions%20The%20Brain%20Meets%20the%20Robot%20-%20Fellous%20&%20Arbib.pdf (May 2, 2011)
- Fukui, K, et al. "A Robot that Mimics Human Speech." ASA/CAA '05 Meeting. Vancouver, BC. 2005. http://www.acoustics.org/press/149th/kotaro.htm
- Fussell, S.R. et al. "How People Anthropomorphize Robots." HRI'08. March 12–15, 2008. Amsterdam, Netherlands. http://sfussell.hci.cornell.edu/pubs/Manuscripts/Fussell-HRI08.pdf
- Gaudin, S. "Scientists Build a Robot that Can Learn Emotions." Computerworld. Aug. 10, 2010. http://www.computerworld.com/s/article/9180781/Scientists_build_a_robot_that_can_learn_emotions (May 2, 2011)
- Gielniak, M.J., et al. "Secondary Action in Robot Motion." Proceedings of the 19th IEEE International Symposium on Robot and Human Interactive Communication. 2010. http://www.cc.gatech.edu/social-machines/papers/gielniak10_roman_secondary.pdf
- Guizzo, E. "Geminoid F: Hiroshi Ishiguro Unveils New Smiling Female Android." Automaton. April 3, 2010. http://spectrum.ieee.org/automaton/robotics/humanoids/040310-geminoid-f-hiroshi-ishiguro-unveils-new-smiling-female-android
- Guterl, F. "Do It Yourself, Robot." Newsweek. November 27, 2006. http://www.mae.cornell.edu/lipson/NewsWeek06_Lipson.pdf
- "Hod Lipson Builds 'Self-Aware' Robots." TED video. March 2007. http://www.ted.com/talks/hod_lipson_builds_self_aware_robots.html
- Knight, H., et al. "A Savvy Robot Standup Comic: Online Learning through Audience Tracking." TEI, 2011. Portugal. http://www.marilynmonrobot.com/wp-content/uploads/2009/05/tei2010-standuprobot.pdf
- Kollar, T., et al. "Toward Understanding Natural Language Directions." Human-Robot Interaction. 2010. http://tkollar.csail.mit.edu/TK/Publications/kollar10.pdf
- Kozima, H., et al. "Can a Robot Empathize With People?" Artif Life Robotics Chapter 8. pp. 83–88. 2004. http://www2.deec.uc.pt/~jorge/in_out/UMA_Course/2009/8.article.PDF
- Kozima, H. and H. Yano. "A Robot That Learns to Communicate With Human Caregivers." Proceedings of the First International Workshop on Epigenetic Robotics. 2001. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.18.2393&rep=rep1&type=pdf
- Miwa, H., et al. "Effective Emotional Expressions with Emotion Expression Humanoid Robot WE-4RII." IEEE/RSJ International Conference on Intelligent Robots and Systems. 2004. pp. 2203-2208. http://www.robocasa.net/people/zecca/2004/iros2004_miwa.pdf
- Moshkina, L. and R.C. Arkin. "Beyond Humanoid Emotions: Incorporating Traits, Attitudes and Moods." Georgia Institute of Technology. http://www.cc.gatech.edu/ai/robot-lab/online-publications/icra09MoshkinaArkin.pdf
- NASA. "NASA Developing Robots with Human Traits." http://www.nasa.gov/vision/universe/roboticexplorers/robots_human_coop.html (April 26, 2011)
- Naval Research Lab. "Human-Robot Interaction (HRI) and Cognitive Robotics." http://www.nrl.navy.mil/aic/iss/aas/CognitiveRobots.php
- Novianto, R. and M.A. Williams. "The Role of Attention in Robot Self-Awareness." The 18th IEEE International Symposium on Robot and Human Interactive Communication. Toyama, Japan. September 27-October 2, 2009. http://www.ronynovianto.com/publications/the_role_of_attention_in_robot_self-awareness.pdf
- Physorg.com. "Japan Unveils Humanoid Robot That Laughs and Smiles." http://www.physorg.com/news189528493.html (May 3, 2011)
- Reilly, M. "Sharing a Joke Could Help Man and Robot Interact." New Scientist Tech. August 1, 2007. http://secs.ceas.uc.edu/~mazlack/academic.UC/InTheNews.Julia/NewScientistTech.Aug1.pdf (May 2, 2011)
- Sullins, J.P. "When is a Robot a Moral Agent?" International Review of Information Ethics. December 2006. http://www.i-r-i-e.net/inhalt/006/006_Sullins.pdf.
- Tapus, A., et al. "The Grand Challenges in Socially Assistive Robots." Robots and Automation Magazine. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.127.5625&rep=rep1&type=pdf
- University of Hertfordshire. "Robots That Develop Emotions in Interactions With Humans." http://www.herts.ac.uk/news-and-events/latest-news/Robots-That-Develop-Emotions-in-Interaction-with-Humans.cfm (May 2, 2011)
- University of Plymouth. "IBL: Instruction-based Learning for Mobile Robots." http://www.tech.plym.ac.uk/soc/staff/guidbugm/ibl/index.html
- Wallach, W. "Robot Minds and Human Ethics: The Need for a Comprehensive Model of Moral Decision Making." Ethics Inf Technol. 2010. Chapter 12. pp. 243–250. http://commonsenseatheism.com/wp-content/uploads/2011/02/Wallach-Robot-minds-and-human-ethics.pdf (May 2, 2011)
- Wallach, W. "The Challenge of Moral Machines." Philosophy Now. 2009. http://www.philosophynow.org/issue72/The_Challenge_of_Moral_Machines
- Zyga, L. "Robot Demonstrates Self-Awareness." Inventors Spot. http://inventorspot.com/robot_demonstrates_self_awareness