Gaze Interactions in Shared Spaces
This project investigates gaze-based interactions between humans and artificial agents. First, we quantify how accurately humans can recognize other humans’ gaze direction at a distance. Second, we train an artificial system to reach human-like performance on this task. Third, we measure how humans read each other’s gaze to establish joint attention in real-world and virtual-reality settings. Fourth, we test the effectiveness and efficiency of various gaze displays for guiding human gaze. Finally, in collaboration with project D01, we assess the interplay of gaze and gestures in such settings.