Abstract
This paper proposes a novel framework for human-robot collaboration (HRC) that enables robots to collaborate effectively with humans on shared tasks in unstructured and dynamic environments. While prior research has focused primarily on safety-related aspects, such as collision avoidance in shared workspaces, the task-oriented aspects of HRC remain largely underexplored. To address this gap, our framework introduces Robot Assistance Primitives (RAPs): low-level robot actions that integrate both safety- and task-related behaviors, enabling the robot to act as a collaborative “third hand” across physical and contactless interactions. A key component is an extension of impedance control with virtual force fields that unifies task guidance and collision avoidance. We leverage a state-of-the-art visual perception pipeline for real-time 3D scene understanding and an AR-HMD interface for multimodal task programming. We validate the framework's feasibility through technical experiments and conduct a user study on collaborative soldering and assembly, demonstrating significant improvements in task efficiency and a reduction in cognitive load.
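To give a rough sense of how virtual force fields can be folded into an impedance law, a minimal sketch of a standard formulation is shown below; the specific symbols ($M$, $D$, $K$, $x_d$, and the field terms) are illustrative assumptions, not necessarily the exact controller developed in the paper:

\[
M\ddot{x} + D\dot{x} + K\left(x - x_d\right)
= F_{\text{ext}}
+ \underbrace{F_{\text{guide}}(x)}_{\text{task guidance}}
+ \underbrace{F_{\text{rep}}(x)}_{\text{collision avoidance}}
\]

Here $M$, $D$, and $K$ are the desired inertia, damping, and stiffness, $x_d$ is the task reference, and $F_{\text{ext}}$ is the measured external (human) force. An attractive field $F_{\text{guide}}$ pulls the end effector toward the task, while a repulsive field $F_{\text{rep}}$ pushes it away from obstacles, so both behaviors enter the same compliant control law.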