Paper WeAT3.4
Kozinov, Andrey (Villanova University), Bersalona, Marielle (Villanova University), Clayton, Garrett (Villanova University)
Human As Sensor: Correlating Head Movements from a VR-Immersed Observer to Robot Orientation
Scheduled for presentation during the Invited Session "Design and control coupling in deformable mechatronic and robotic systems for physical interactions with humans" (WeAT3), Wednesday, July 16, 2025,
11:00−11:20, Room 107
Joint 10th IFAC Symposium on Mechatronic Systems and 14th Symposium on Robotics, July 15-18, 2025, Paris, France
This information is tentative and subject to change. Compiled on July 16, 2025
Keywords: Interfaces (Visual, Haptic, …), Sensors and Measurement Systems, Robot Navigation, Programming, and Vision
Abstract
In this work, we examine how unconscious head movements, evoked when a human experiences a robot’s point of view through virtual reality (VR), relate to the robot’s own motion. This "human as sensor" human-robot collaboration concept leverages human sensing to provide additional measurements to the robotic system. This paper presents preliminary experimental results in which a human views recorded stereo footage, obtained from a ground robot outfitted with a stereo camera, through a VR headset equipped with an inertial measurement unit. Preliminary analyses show a moderate negative correlation between the human head and robot pitch angles, while the roll and yaw directions exhibit weaker negative correlations. These results suggest that the participant’s head naturally moves in response to the robot’s observed motion, an unconscious vestibular-like response triggered by visual cues, which could be leveraged for robot orientation measurement.
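
The per-axis correlation analysis described in the abstract can be illustrated with a minimal sketch. The function name, array shapes, and synthetic data below are illustrative assumptions for time-aligned head-IMU and robot orientation logs, not the authors' actual pipeline.

import numpy as np

def per_axis_correlation(head_angles, robot_angles):
    """Pearson correlation between head and robot orientation, per axis.

    head_angles, robot_angles: arrays of shape (N, 3) holding time-aligned
    roll, pitch, and yaw samples (assumed already synchronized and in the
    same angular units).
    """
    head = np.asarray(head_angles, dtype=float)
    robot = np.asarray(robot_angles, dtype=float)
    results = {}
    for i, axis in enumerate(("roll", "pitch", "yaw")):
        # np.corrcoef returns the 2x2 correlation matrix; the off-diagonal
        # entry is the Pearson coefficient between the two series.
        results[axis] = np.corrcoef(head[:, i], robot[:, i])[0, 1]
    return results

if __name__ == "__main__":
    # Synthetic illustration only: the robot's angles oscillate and the head
    # responds with opposite sign plus noise, mimicking the negative
    # correlations reported in the abstract.
    t = np.linspace(0.0, 30.0, 3000)
    robot = np.column_stack([
        2.0 * np.sin(0.5 * t),   # roll
        5.0 * np.sin(1.0 * t),   # pitch
        3.0 * np.sin(0.3 * t),   # yaw
    ])
    rng = np.random.default_rng(0)
    head = -0.4 * robot + rng.normal(scale=1.5, size=robot.shape)
    print(per_axis_correlation(head, robot))

In practice, the head and robot series would first be resampled onto a common time base (and possibly lag-aligned) before computing the coefficients; the sketch assumes that step has already been done.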