Autonomous ground vehicle (AGV) following system using YOLOv11 and range imaging
Real-time tracking and following in simulation and on the real robot
Developed a camera-based human tracking pipeline to enable the safe and efficient relocation of airport robots. The algorithm detects and tracks human movement in real time using Ultralytics' YOLOv11 model and a 3D depth camera, and uses the tracked position to generate a dynamic path plan.
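The core of such a pipeline is a detect-locate step: run YOLOv11 on the RGB frame, read the depth at each person's bounding-box centre, and convert that into a follow target for the planner. The sketch below is a minimal illustration of that step, not the exact implementation; the model checkpoint name ("yolo11n.pt"), the camera intrinsics FX and CX, and the assumption that the depth image is aligned to the RGB frame and expressed in metres are all placeholders.

```python
# Minimal sketch: pick the closest tracked person and return a follow target.
# Assumptions (not from the original project): aligned RGB + depth (metres),
# pinhole intrinsics FX/CX, and the "yolo11n.pt" YOLOv11 checkpoint.
import numpy as np
from ultralytics import YOLO

model = YOLO("yolo11n.pt")   # assumed YOLOv11 nano checkpoint
FX, CX = 615.0, 320.0        # assumed camera intrinsics (pixels)

def follow_target(rgb_frame: np.ndarray, depth_m: np.ndarray):
    """Return (forward_distance_m, lateral_offset_m) of the nearest person, or None."""
    # Track only the COCO 'person' class (id 0); persist=True keeps IDs across frames.
    results = model.track(rgb_frame, classes=[0], persist=True, verbose=False)
    boxes = results[0].boxes
    if boxes is None or len(boxes) == 0:
        return None

    best = None
    for x1, y1, x2, y2 in boxes.xyxy.cpu().numpy():
        u, v = int((x1 + x2) / 2), int((y1 + y2) / 2)
        z = float(depth_m[v, u])             # depth at the bounding-box centre
        if z <= 0.0 or not np.isfinite(z):   # skip invalid depth readings
            continue
        x = (u - CX) * z / FX                # lateral offset via the pinhole model
        if best is None or z < best[0]:
            best = (z, x)                    # keep the closest person as the target
    return best
```

The returned (distance, offset) pair can then be turned into a goal pose that the robot's path planner replans toward on every frame, which is what makes the following behaviour dynamic.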
The system was validated both in simulation and on the real robot. In the future, I plan to enhance its capabilities by integrating gesture-based control, removing the dependence on a carried mobile device and enabling more intuitive human-robot interaction. Additionally, implementing facial recognition will allow the system to identify authorized personnel, ensuring that only approved individuals can be followed and improving overall security.