Continuous Gesture Control of a Robot Arm: Performance Is Robust to a Variety of Hand-to-Robot Maps

Khan, SE, Danziger, ZC. (2024). Continuous Gesture Control of a Robot Arm: Performance Is Robust to a Variety of Hand-to-Robot Maps. IEEE Transactions on Biomedical Engineering, 71(3), 944-953. doi: 10.1109/TBME.2023.3323601

cited authors

  • Khan, SE; Danziger, ZC

abstract

  • Objective: Despite advances in human-machine interface design, we lack the ability to give people precise and fast control over high degree-of-freedom (DOF) systems, like robotic limbs. Attempts to improve control often focus on the static map that links user input to device commands, hypothesizing that the user's skill acquisition can be improved by finding an intuitive map. Here we investigate which map features affect skill acquisition. Methods: Each of our 36 participants used one of three maps that translated their 19-dimensional finger movements into commands for the 5 robot joints and used the robot to pick up and move objects. Each map was constructed to maximize a different control principle to reveal which features are most critical for user performance: 1) Principal Components Analysis, to maximize the linear capture of finger variance; 2) our novel Egalitarian Principal Components Analysis, to maximize the equality of variance captured by each component; and 3) a Nonlinear Autoencoder, to achieve both high variance capture and less biased variance allocation across latent dimensions. Results: Despite large differences in the mapping structures, there were no significant differences in group performance. Conclusion: Participants' natural aptitude had a far greater effect on performance than the map. Significance: Robot-user interfaces are becoming increasingly common and require new designs to make them easier to operate. Here we show that optimizing the map may not be the appropriate target to improve operator skill. Therefore, further efforts should focus on other aspects of the robot-user interface, such as feedback or the learning environment.
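
For readers unfamiliar with this class of interface, the sketch below illustrates the general idea behind the first (standard PCA) map described in the abstract: project high-dimensional finger data onto 5 components, one per robot joint. This is not the authors' code; variable names, data shapes, and the scaling are assumptions, and the Egalitarian PCA and autoencoder variants are not reproduced here.

```python
# Illustrative sketch (not the authors' implementation): a PCA-based
# hand-to-robot map projecting 19-D finger-joint data onto 5 components,
# one per robot joint. Names, shapes, and gain are assumptions.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Stand-in calibration data: samples of 19 finger-joint angles (e.g., from a
# data glove). In the study this would be recorded hand movement, not noise.
hand_calibration = rng.standard_normal((2000, 19))

# Fit a linear map: the top 5 principal components capture the most finger
# variance, and each component drives one of the 5 robot joints.
pca = PCA(n_components=5)
pca.fit(hand_calibration)

def hand_to_robot(hand_pose, gain=1.0):
    """Map one 19-D hand pose to 5 robot joint commands."""
    latent = pca.transform(hand_pose.reshape(1, -1))  # shape (1, 5)
    return gain * latent.ravel()

# Example: stream a new hand pose through the map.
joint_commands = hand_to_robot(rng.standard_normal(19))
print(joint_commands.shape)  # (5,)
```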

publication date

  • March 1, 2024

Digital Object Identifier (DOI)

  • 10.1109/TBME.2023.3323601

start page

  • 944

end page

  • 953

volume

  • 71

issue

  • 3