Development of an Intuitive Glove-based Human Interface for Robotic Applications

Bio-Robotics

Research article

Liddell, H., & Secco, E. L. (2025). Development of an Intuitive Glove-based Human Interface for Robotic Applications. Bio-Robotics, 1(1), 16–31. https://doi.org/10.54963/br.v1i1.1303

Authors

  • Harrison Liddell

    School of Computer Science and the Environment, Liverpool Hope University, Hope Park, Liverpool L16 9JD, UK
  • Emanuele Lindo Secco

    School of Computer Science and the Environment, Liverpool Hope University, Hope Park, Liverpool L16 9JD, UK

Received: 9 April 2025; Revised: 20 May 2025; Accepted: 27 May 2025; Published: 7 June 2025

This paper presents an innovative glove that enables users to control robots through natural hand gestures. The primary goal is to simplify human-robot interaction so that individuals can communicate with robots without extensive training or technical knowledge. The glove integrates three types of sensors: an ultrasonic sensor for measuring distances, flex sensors for tracking finger movements, and a GY-521 accelerometer for monitoring the hand's position and motion. Together, these sensors translate simple hand gestures into precise commands for robotic systems, making the glove a practical tool for a range of applications.

The project began with a thorough review of existing research in human-robot interaction, which identified the features and methods best suited to the glove's design. Development then proceeded through design, prototyping, and rigorous testing to confirm that all sensors functioned correctly and that the glove was comfortable to wear. During testing, each sensor was evaluated both individually and in combination with the others, to verify that the sensors captured data accurately in real time and worked together seamlessly.

The expected outcomes are that the glove can effectively recognize hand movements, detect objects, and allow users to control robots with ease. By bridging the gap between human actions and robotic responses, the glove-based control system has the potential to transform how individuals interact with robots: simple hand gestures can drive complex tasks, which could be particularly beneficial in fields such as healthcare, manufacturing, and assistive technology. The research aims to pave the way for more effective and engaging robotic systems that cater to a wide range of needs, enhancing how people live and work alongside technology.
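To make the sensing pipeline concrete, the following sketch (not taken from the paper) shows the standard raw-to-physical conversions for the three sensor types named in the abstract, using datasheet constants for an HC-SR04-style ultrasonic module and for the MPU-6050 IMU found on the GY-521 breakout at its default ±2 g range. The function names and the 10-bit ADC assumption for the flex sensors are illustrative choices, not the authors' implementation.

```python
# Illustrative sensor conversions (assumed, not from the paper):
# HC-SR04 ultrasonic module, MPU-6050 (GY-521 breakout) at +/-2 g,
# and a flex sensor read through a 10-bit ADC (e.g., Arduino Uno).

SPEED_OF_SOUND_CM_PER_US = 0.0343   # ~343 m/s in air at room temperature
MPU6050_LSB_PER_G_2G = 16384.0      # MPU-6050 accel sensitivity at +/-2 g

def ultrasonic_distance_cm(echo_duration_us: float) -> float:
    """Convert an echo pulse width (microseconds) to distance (cm).

    The pulse spans the round trip to the object and back, hence the /2.
    """
    return echo_duration_us * SPEED_OF_SOUND_CM_PER_US / 2.0

def accel_raw_to_g(raw: int) -> float:
    """Convert a signed 16-bit MPU-6050 accelerometer sample to g units."""
    return raw / MPU6050_LSB_PER_G_2G

def flex_ratio(adc_value: int, adc_max: int = 1023) -> float:
    """Normalise a 10-bit ADC reading from a flex sensor to 0..1.

    Mapping this ratio to an actual bend angle needs per-sensor calibration.
    """
    return adc_value / adc_max
```

For example, an echo pulse of about 1166 µs corresponds to roughly 20 cm, and a raw accelerometer reading of 16384 corresponds to 1 g; gesture-recognition logic would then operate on these calibrated values rather than on raw counts.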

Keywords:

Wearable Technology; Human-Robot Interaction; Biologically-Inspired Design; Biomimetics

References

  1. Gunawardane, P.D.S.H.; Medagedara, N.T. Comparison of Hand Gesture Inputs of Leap Motion Controller & Data Glove Into a Soft Finger. In Proceedings of the 2017 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS), Ottawa, Canada, 5–7 October 2017.
  2. Xu, Z.; Yang, C.; Wu, W.; et al. Design of Underwater Humanoid Flexible Manipulator Motion Control System Based on Data Glove. In Proceedings of the 2020 6th International Conference on Mechatronics and Robotics Engineering (ICMRE), Barcelona, Spain, 12–15 February 2020.
  3. Boka, T.; Nikkhah, M.N.; Moosavian, S.A.A. KNTU-RoboGlove: Design of a Pneumatic Soft Robotic Glove. In Proceedings of the 2023 11th RSI International Conference on Robotics and Mechatronics (ICRoM), Tehran, Iran, 19–21 December 2023.
  4. Du, Q.; Zhao, W.; Cui, X.; et al. Design, Control and Testing of Soft Pneumatic Rehabilitation Glove. In Proceedings of the 2020 3rd World Conference on Mechanical Engineering and Intelligent Manufacturing (WCMEIM), Shanghai, China, 4–6 December 2020.
  5. Tran, P.; Jeong, S.; Herrin, K.; et al. Flexotendon Glove-III: Soft Robotic Hand Rehabilitation Exoskeleton for Spinal Cord Injury. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi'an, China, 30 May–5 June 2021.
  6. Lee, S.; Park, K.; Lee, J.; et al. User Study of VR Basic Controller and Data Glove as Hand Gesture Inputs in VR Games. In Proceedings of the 2017 International Symposium on Ubiquitous Virtual Reality (ISUVR), Nara, Japan, 27–29 June 2017.
  7. Ghate, S.; Yu, L.; Du, K.; et al. Sensorized Fabric Glove as Game Controller for Rehabilitation. In Proceedings of the 2020 IEEE Sensors, Rotterdam, Netherlands, 25–28 October 2020.
  8. TDK InvenSense. Available online: https://invensense.tdk.com/products/motion-tracking/6-axis/mpu-6050/ (accessed on 16 July 2024).
  9. Makerguides. Available online: https://www.makerguides.com/hc-sr04-arduino-tutorial/ (accessed on 16 July 2024).
  10. Flexpoint. Available online: https://flexpoint.com/product/bend-sensor/ (accessed on 16 July 2024).
  11. Arduino. Available online: https://forum.arduino.cc/t/problem-getting-gy-521-to-work-mpu-6050/692358/8 (accessed on 16 July 2024).
  12. Arduino. Available online: https://www.arduino.cc/en/Main/ArduinoBoardUno (accessed on 16 July 2024).
  13. SparkFun Electronics. Available online: https://www.sparkfun.com/products/9139 (accessed on 16 July 2024).
  14. Adafruit. Available online: https://www.adafruit.com/product/4871 (accessed on 16 July 2024).
  15. UCLA Newsroom. Available online: https://newsroom.ucla.edu/releases/glove-translates-sign-language-to-speech (accessed on 8 April 2025).
  16. Fels, S.S.; Hinton, G.E. Glove-Talk II – A Neural-Network Interface Which Maps Gestures to Parallel Formant Speech Synthesizer Controls. IEEE Trans. Neural Netw. 1997, 8, 977–984. DOI: https://doi.org/10.1109/72.623199
  17. Mosier, K.M.; Scheidt, R.A.; Acosta, S.; et al. Remapping Hand Movements in a Novel Geometrical Environment. J. Neurophysiol. 2005, 94, 4362–4372. DOI: https://doi.org/10.1152/jn.00380.2005
  18. Secco, E.L. Movement Control of a 3 D.O.F. Artificial Finger: Dynamic Learning and Execution of the Natural Movement. PhD Thesis, University of Pavia, Pavia, Italy, 2002.
  19. Buckley, N.; Sherrett, L.; Secco, E.L. A CNN Sign Language Recognition System with Single & Double-Handed Gestures. In Proceedings of the IEEE 45th Annual Computers, Software, and Applications Conference (COMPSAC), Madrid, Spain, 12–16 July 2021. DOI: https://doi.org/10.1109/COMPSAC51774.2021.00173
  20. Manolescu, D.; Mutinda, B.; Secco, E.L. Human–Robot Interaction via Wearable Device – A Wireless Glove System for Remote Control of 7-DoF Robotic Arm. Acad. Eng. 2024, 1, 1–8. DOI: https://doi.org/10.20935/AcadEng7350
  21. Latif, B.; Buckley, N.; Secco, E.L. Hand Gesture & Human-Drone Interaction. In Proceedings of the SAI Intelligent Systems Conference, Amsterdam, Netherlands, 1–2 September 2022.
  22. Chu, T.S.; Chua, A.Y.; Secco, E.L. A Study on Neuro Fuzzy Algorithm Implementation on BCI-UAV Control Systems. ASEAN Eng. J. 2022, 12, 75–81. DOI: https://doi.org/10.11113/aej.v12.16900
  23. Manolescu, D.; Al-Zu’bi, H.; Secco, E.L. Interactive Conversational AI with IoT Devices for Enhanced Human-Robot Interaction. J. Intell. Commun. 2025, 3, 74–92. DOI: https://doi.org/10.54963/jic.v3i1.317