
Camera-Based Human Tracking and Following Algorithm for Mobile Robots

Autonomous ground vehicle (AGV) person-following system using YOLOv11 and range imaging

ROS Python HTML JS OpenCV Gazebo Computer Vision Deep Learning YOLO HRI CUDA

Project Overview

Developed a camera-based human tracking pipeline to enable the safe and efficient relocation of airport robots. The algorithm detects and tracks a person's movement in real time using Ultralytics' YOLOv11 model and a 3D depth camera, then generates a dynamic path plan so the robot can follow them.

Technical Implementation
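The snippet below is a minimal sketch of the detect-and-follow idea described in the overview, not the project's actual code. It assumes an RGB frame and a pixel-aligned depth frame (in millimetres) from the 3D camera are already available; the helper names, gains, and the 1.5 m following distance are illustrative placeholders. Only the Ultralytics YOLO inference call reflects the real library API.

import numpy as np
from ultralytics import YOLO

model = YOLO("yolo11n.pt")  # Ultralytics YOLOv11 nano weights

def locate_person(rgb: np.ndarray, depth: np.ndarray):
    """Return (u, v, range_m) of the closest detected person, or None."""
    results = model(rgb, classes=[0], verbose=False)  # COCO class 0 = person
    best = None
    for box in results[0].boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        u, v = int((x1 + x2) / 2), int((y1 + y2) / 2)  # bounding-box centre pixel
        range_m = float(depth[v, u]) / 1000.0          # assumes depth given in mm
        if range_m > 0 and (best is None or range_m < best[2]):
            best = (u, v, range_m)                     # keep the nearest person
    return best

def follow_command(target, image_width, desired_range=1.5):
    """Simple proportional follower: steer toward the person, hold ~1.5 m gap."""
    u, _, range_m = target
    angular = -0.002 * (u - image_width / 2)  # turn the bbox centre to the image centre
    linear = 0.5 * (range_m - desired_range)  # close or open the gap to the target range
    return linear, angular

In practice, the linear and angular outputs would be published as ROS velocity commands (e.g. a geometry_msgs/Twist), with the gains tuned in the Gazebo simulation before deployment on the real robot.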

Key Features

Results & Future Work

The system performed successfully both in the Gazebo simulation and on the real robot. In the future, I plan to extend its capabilities by integrating gesture-based control, removing the dependence on a carried mobile device and enabling more intuitive human-robot interaction. Additionally, implementing facial recognition will allow the system to identify authorized personnel, ensuring that only approved users can be followed and enhancing overall security.
