It has been suggested that human-robot interaction is likely to be facilitated by human-like facial expressions - but will people feel that the robots are expressing real emotions? If they do not, their emotional reactions are likely to hinder appropriate interaction. Thus, for human beings observing avatar smiles that do not “feel” real, this will most likely impact achievement of the objectives for which the avatar was employed. My initial analysis shows that the majority of avatars were faking it: only one of eight avatars was seen to be really smiling, and even then it was less plausible than most real human smiles. This project aims to extend that analysis by collecting physiological data while participants watch avatars’ facial expressions.
- Finding features of the physiological signals that are relevant to smiles.
- Developing a method for distinguishing between real and posed smiles.
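One way the second aim could proceed is to summarise each physiological trace as a small feature vector and classify it. The sketch below is purely illustrative (not the project's actual method): it uses hypothetical toy numbers standing in for recorded skin-conductance traces, simple statistical features, and a nearest-centroid rule.

```python
# Illustrative sketch only: statistical features from a 1-D physiological
# signal plus a nearest-centroid classifier for real vs. posed smiles.
# The "training" traces below are invented toy data, not project results.
from statistics import mean, stdev


def features(signal):
    """Summarise a signal as (mean, standard deviation, range)."""
    return (mean(signal), stdev(signal), max(signal) - min(signal))


def centroid(samples):
    """Average the feature vectors of a list of signals."""
    feats = [features(s) for s in samples]
    return tuple(mean(col) for col in zip(*feats))


def classify(feat, centroids):
    """Return the label of the Euclidean-nearest centroid."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(feat, centroids[label]))


# Hypothetical traces: "real" smiles show a larger arousal response.
real = [[0.2, 0.8, 1.5, 1.2, 0.9], [0.3, 0.9, 1.6, 1.1, 0.8]]
posed = [[0.2, 0.3, 0.4, 0.3, 0.2], [0.1, 0.2, 0.3, 0.3, 0.2]]

centroids = {"real": centroid(real), "posed": centroid(posed)}
label = classify(features([0.2, 0.7, 1.4, 1.0, 0.8]), centroids)
```

In practice the features would come from the recorded signals and a proper machine-learning classifier with cross-validation would replace the centroid rule; this only shows the overall feature-then-classify shape of the task.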
- Experience in advanced machine learning and prediction.
- Knowledge of basic machine learning and statistics.
- Basic knowledge of physiological signals.
- MZ Hossain and T Gedeon (2017). Discriminating real and posed smiles: human and avatar smiles. Proceedings of the 29th Australian Conference on Computer-Human Interaction, pp. 581-586.