Gaze-follow framework for non-human apes under natural interaction conditions
Gaze, i.e., looking at a specific object or person, plays a vital role in coordinating interaction and movement in collectives. Gaze-follow studies in computer vision are usually devoted to creating algorithms that determine where a person in a scene is looking. While there have been many such studies on humans, very few have focused on non-human animals, notably our closest relatives, the great apes. The main challenges in this field are the lack of high-quality datasets quantifying gaze and movement during naturalistic interactions of non-human apes, and the difficulty of handling their varying head and eye shapes. This interdisciplinary project aims to address these challenges by collecting and analyzing a new dataset and by developing a novel deep learning model that estimates the gaze position of non-human apes under natural interaction conditions. The proposed algorithm will assist researchers studying the perception of non-human apes during social or individual activities in enclosed or restricted areas. Given its flexibility, the algorithm can later be extended to other primate species (and possibly non-primate species with frontally placed eyes), providing an exciting new method for the non-invasive study of gaze in animal interactions.
This study also aims to propose a new model for estimating the gaze location of non-human apes. Unlike previous models, which focused on humans, this model will be trained as a foundation gaze-follow model for bonobos, chimpanzees, and orangutans.
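To make the gaze-follow formulation concrete, the sketch below shows a minimal two-branch architecture of the kind common in the human gaze-follow literature (e.g., Recasens et al., 2015): one pathway encodes the full scene, another encodes a crop of the target individual's head, and the fused features are decoded into a gaze-location heatmap. This is an illustrative assumption only; the class name `GazeFollowNet`, all layer sizes, and the output resolution are placeholders, not the project's actual model.

```python
# Hypothetical sketch of a two-branch gaze-follow network (PyTorch).
# All architectural choices here are assumptions for illustration.
import torch
import torch.nn as nn


class GazeFollowNet(nn.Module):
    def __init__(self, heatmap_size: int = 64):
        super().__init__()
        # Scene pathway: encodes the full image for contextual saliency.
        self.scene_encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((8, 8)),
        )
        # Head pathway: encodes the cropped head to infer gaze direction.
        self.head_encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((8, 8)),
        )
        # Decoder: fuses both pathways and upsamples to a gaze heatmap.
        self.decoder = nn.Sequential(
            nn.Conv2d(128, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Upsample(size=(heatmap_size, heatmap_size),
                        mode="bilinear", align_corners=False),
            nn.Conv2d(64, 1, kernel_size=1),
        )

    def forward(self, scene: torch.Tensor, head_crop: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.scene_encoder(scene),
                           self.head_encoder(head_crop)], dim=1)
        return self.decoder(fused)  # (B, 1, H, W) gaze-location logits


# Usage: a batch of scene images and matching head crops.
scene = torch.randn(2, 3, 224, 224)
head = torch.randn(2, 3, 224, 224)
heatmap = GazeFollowNet()(scene, head)
print(heatmap.shape)  # torch.Size([2, 1, 64, 64])
```

Predicting a heatmap rather than a single 2D point is a common design choice in gaze-follow work because it tolerates the ambiguity of head-orientation-only cues, which is especially relevant for apes, whose eyes are often not clearly visible in naturalistic footage.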