
Investigating the Effect of a Humanoid Robot’s Head Position on Imitating Human Emotions

Published in: International Journal of Social Robotics

Abstract

Humans show their emotions with facial expressions. In this paper, we investigate the effect of a humanoid robot’s head position on imitating human emotions. In an Internet survey using animations, we asked participants to adjust the head position of a robot to express six basic emotions: anger, disgust, fear, happiness, sadness, and surprise. We found that humans expect a robot to look straight down when it is angry or sad, to look straight up when it is surprised or happy, and to look down and to its right when it is afraid. We also found that when a robot is disgusted, some humans expect it to look straight to its right and others expect it to look down and to its left. Thus, humans expect the robot to use an averted head position for all six emotions. In contrast, other studies have shown approach-oriented emotions (anger and joy) being attributed to direct gaze and avoidance-oriented emotions (fear and sadness) being attributed to averted gaze.
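To make the reported mapping concrete, the sketch below shows one way these head positions might be commanded on a NAO-style humanoid through the NAOqi ALMotion API. It is an illustration, not the authors' implementation: the joint-angle values, the robot address, and the EMOTION_HEAD_POSE table are placeholders chosen for readability, and only the direction of each pose (up, down, left, right) follows the survey results summarized above.

```python
# A minimal sketch (not from the paper) mapping the reported head positions
# to NAO head-joint commands via the NAOqi ALMotion API. The numeric angles
# are illustrative placeholders, and the sign conventions assumed here
# (HeadPitch > 0 = down, HeadYaw > 0 = robot's left) should be checked
# against the robot's documentation before use.
import math

from naoqi import ALProxy

# (yaw, pitch) in degrees for each of the six basic emotions.
EMOTION_HEAD_POSE = {
    "anger":     (0.0,   20.0),   # look straight down
    "sadness":   (0.0,   20.0),   # look straight down
    "surprise":  (0.0,  -20.0),   # look straight up
    "happiness": (0.0,  -20.0),   # look straight up
    "fear":      (-15.0,  20.0),  # look down and to the robot's right
    "disgust":   (15.0,   20.0),  # down and to the robot's left (one of the two reported variants)
}

def show_emotion(motion, emotion, speed=0.2):
    """Move the robot's head to the pose associated with an emotion."""
    yaw_deg, pitch_deg = EMOTION_HEAD_POSE[emotion]
    motion.setAngles(
        ["HeadYaw", "HeadPitch"],
        [math.radians(yaw_deg), math.radians(pitch_deg)],
        speed,  # fraction of maximum joint speed
    )

if __name__ == "__main__":
    # Placeholder address and default NAOqi port.
    motion = ALProxy("ALMotion", "nao.local", 9559)
    show_emotion(motion, "sadness")
```

In practice the placeholder angles would need to be tuned to the robot's joint limits and could be combined with other expressive channels, such as LED eye patterns or body posture, rather than head position alone.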



Acknowledgements

We would like to thank Jaap Stelma, Colin Lambrechts, Maurice Houben, Krystian Trninic, and Marijn de Graaf for their contributions to this work.

Funding

This study was funded by the 7th Framework Programme (FP7) for Research and Technological Development (Grant Number 2010-248085).

Author information


Corresponding author

Correspondence to David O. Johnson.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Johnson, D.O., Cuijpers, R.H. Investigating the Effect of a Humanoid Robot’s Head Position on Imitating Human Emotions. Int J of Soc Robotics 11, 65–74 (2019). https://doi.org/10.1007/s12369-018-0477-4
