OpenCV-Enabled Real-Time Object Tracking Robotic Manipulator with Eye-Gaze Control for Assistive Applications

International Journal of Emerging Research in Science, Engineering, and Management
Vol. 2, Issue 4, pp. 109-113, April 2026.

https://doi.org/10.58482/ijersem.v2i4.15

A Pushpa Latha, R Tharun Teja, B Vamsi, P Uha, P Varshini, M Gnana Priya

1-5 Department of ECE, Gokula Krishna College of Engineering, Sullurpet, AP, India.

6 Associate Professor, Department of ECE, Gokula Krishna College of Engineering, Sullurpet, AP, India.

Abstract: This paper presents the design and implementation of an OpenCV-enabled real-time object-tracking robotic manipulator intended to assist physically challenged individuals. The system integrates computer vision with embedded hardware to enable intuitive human-machine interaction through eye-gaze control. A USB camera captures live video, which is processed using OpenCV together with the MediaPipe framework for eye tracking and the YOLO model for object detection. Eye gestures, including gaze direction and blinking, are interpreted as control commands to select and track objects in the environment. The robotic manipulator is a four-degree-of-freedom (4-DOF) arm comprising base, shoulder, elbow, and gripper mechanisms. A Raspberry Pi 4 serves as the central processing unit, executing the image-processing pipeline and generating control signals for servo-motor actuation. Based on the detected object and the user's input, the system tracks and manipulates objects precisely in real time. Experimental results demonstrate reliable object detection, efficient tracking, and responsive, low-latency actuation. The proposed solution offers a cost-effective, portable, and user-friendly assistive technology that can significantly enhance independence and quality of life for physically challenged individuals.
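The gaze-interpretation step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the landmark coordinates are assumed to come from a face-mesh model such as MediaPipe (normalized (x, y) points), and the thresholds and helper names are illustrative placeholders.

```python
# Hypothetical sketch of interpreting eye gestures as control commands.
# Inputs are normalized (x, y) landmark points; thresholds are illustrative.

def eye_aspect_ratio(top, bottom, left, right):
    """Ratio of vertical to horizontal eye opening; small values mean a closed eye."""
    vertical = abs(top[1] - bottom[1])
    horizontal = abs(right[0] - left[0])
    return vertical / horizontal if horizontal else 0.0

def gaze_direction(iris, left_corner, right_corner, margin=0.35):
    """Classify horizontal gaze from the iris position between the eye corners."""
    span = right_corner[0] - left_corner[0]
    if span == 0:
        return "center"
    ratio = (iris[0] - left_corner[0]) / span
    if ratio < margin:
        return "left"
    if ratio > 1 - margin:
        return "right"
    return "center"

def interpret(ear, direction, blink_threshold=0.2):
    """Map the eye state to a command: a blink selects, sustained gaze steers."""
    if ear < blink_threshold:
        return "select"
    return f"move_{direction}" if direction != "center" else "hold"
```

For example, an iris sitting near the left eye corner with the eye open would yield `interpret(0.3, gaze_direction((0.30, 0.5), (0.2, 0.5), (0.6, 0.5)))` returning `"move_left"`, while a low eye-aspect ratio (a blink) returns `"select"` regardless of direction.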

Keywords: OpenCV, Real-Time Object Tracking, Robotic Manipulator, Eye-Gaze Control, Assistive Technology.
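The servo-actuation step on the Raspberry Pi amounts to mapping a commanded joint angle to a PWM signal. A minimal sketch of that mapping is shown below, assuming a typical hobby-servo timing of 0.5–2.5 ms pulses in a 20 ms (50 Hz) frame; the exact pulse range is servo-specific and not stated in the paper, and on real hardware the resulting duty cycle would be applied through a GPIO PWM library.

```python
# Hypothetical sketch of servo-motor command generation on a Raspberry Pi:
# converting a joint angle into a PWM duty cycle at 50 Hz.
# The 0.5-2.5 ms pulse range is typical for hobby servos but is an assumption.

PERIOD_MS = 20.0        # 50 Hz PWM frame
MIN_PULSE_MS = 0.5      # pulse width at 0 degrees (assumed)
MAX_PULSE_MS = 2.5      # pulse width at 180 degrees (assumed)

def angle_to_pulse_ms(angle_deg):
    """Linearly map a joint angle (clamped to 0-180 degrees) to a pulse width in ms."""
    angle = max(0.0, min(180.0, angle_deg))
    return MIN_PULSE_MS + (angle / 180.0) * (MAX_PULSE_MS - MIN_PULSE_MS)

def angle_to_duty_cycle(angle_deg):
    """Duty cycle (percent) of the 50 Hz PWM frame for the given joint angle."""
    return angle_to_pulse_ms(angle_deg) / PERIOD_MS * 100.0
```

Each of the four joints (base, shoulder, elbow, gripper) would use the same mapping on its own PWM channel; for instance, a 90-degree mid-position corresponds to a 1.5 ms pulse, i.e. a 7.5% duty cycle.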

  12. E. Iáñez, A. Úbeda, J. M. Azorín, and C. Perez-Vidal, “Assistive robot application based on an RFID control architecture and a wireless EOG interface,” Robotics and Autonomous Systems, vol. 60, no. 8, pp. 1069–1077, May 2012, doi: 10.1016/j.robot.2012.05.006.