ABSTRACT
Recent advances in computing and robot technology create new opportunities for building robots with increasingly sophisticated interactivity. One such application is visual interaction between humans and humanoids in tasks such as mimicking and following. Achieving realistic head-eye motion in a humanoid requires an understanding of the human kinesiology that dictates how humans coordinate head and eye motion, together with the ability to control the humanoid so that it moves in the same manner that humans do. In this paper we propose an efficient head-eye motion coordination scheme based on an optimization approach: an objective function is formed from human kinesiology and then optimized to obtain a realistic head-eye trajectory. Tracking robustness during conversational interaction with a human is further enhanced through a visual feedback scheme that reduces modelling errors of the humanoid hardware. Experimental results demonstrate the tracking efficiency and the realism of the motion generated by the proposed scheme on Lilly, a humanoid under development in our lab.
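The coordination idea in the abstract can be illustrated with a minimal sketch. This is not the paper's actual objective function; the quadratic cost terms, weights, and closed-form solution below are illustrative assumptions that capture the general kinesiological intuition (eyes prefer to stay near center, the head is slower and costlier to move):

```python
# Hedged sketch: a toy version of optimization-based head-eye
# coordination. The cost form and weights are illustrative
# assumptions, not the objective used in the paper.

def coordinate_head_eye(gaze_target, head_prev, w_eye=1.0, w_head=0.2):
    """Split a desired gaze angle (degrees) between head and eye.

    Assumed cost:
        w_eye  * eye_angle**2           # eyes prefer their centered position
      + w_head * (head - head_prev)**2  # large head displacements are costly

    With the constraint eye = gaze_target - head, the cost is quadratic
    in `head`, so setting its derivative to zero gives a closed form.
    """
    head = (w_eye * gaze_target + w_head * head_prev) / (w_eye + w_head)
    eye = gaze_target - head
    return head, eye

head, eye = coordinate_head_eye(gaze_target=30.0, head_prev=0.0)
print(head, eye)  # head absorbs most of the gaze shift; eye stays near center
```

In practice such a cost would be minimized over a whole trajectory (and with joint limits), but even this one-step version reproduces the qualitative human behavior the paper targets: a large gaze shift is mostly taken up by the head while the eyes remain close to their rest position.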
Index Terms
- Realistic and robust head-eye coordination of conversational robot actors in human tracking applications