Active Object Tracking on DuckieDrone
A Comprehensive Two-Part Approach
Introduction
This project explores two main ideas within the field of autonomous aerial tracking systems. The first objective is to enable a real DuckieDrone to perform AprilTag-based visual tracking. Using its onboard RGB camera and ROS middleware, the drone detects an AprilTag marker in real time, estimates its 3D pose, and navigates accordingly. This allows the physical drone to follow a predefined target using visual feedback, laying the foundation for robust autonomous flight in controlled environments.

The second objective focuses on a simulated environment, where a virtual drone performs active object tracking using YOLOv5. In this setup, a deep learning-based object detection model recognizes and tracks a moving object, such as a Duckiebot, in real time. The tracking data is then used to guide the drone's movements dynamically, simulating real-world conditions and validating the performance of machine learning-based tracking algorithms.

By combining these two approaches, the project aims to demonstrate both traditional marker-based tracking on physical hardware and deep learning-driven tracking in simulation. The overall goal is to advance the development of autonomous drones capable of following dynamic targets in various scenarios, contributing to fields such as intelligent robotics, surveillance, and human–robot interaction.
Solution Methodology

AprilTag-Based Physical Tracking

In the physical setup, the DuckieDrone is equipped with a Raspberry Pi 3, an RGB camera, and the essential flight control components. ROS serves as the middleware that manages communication between the sensors, the flight controller, and the processing nodes. AprilTag markers are placed on the target (e.g., a Duckiebot), and the drone uses the apriltag_ros package to detect the tag and estimate its 3D pose in real time. A PID-based control system adjusts the drone's position and orientation to keep the tag in view and follow it smoothly. The system is tested in controlled indoor environments to evaluate tracking precision, stability, and responsiveness. A minimal sketch of such a follower node is shown below.
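The following sketch illustrates how apriltag_ros detections can drive a PID follower. The topic names (/tag_detections, /cmd_vel), the gains, and the fixed control rate are illustrative assumptions, not the project's actual configuration.

```python
#!/usr/bin/env python
# Minimal sketch of an AprilTag follower node. Topic names, PID gains,
# and the control rate are hypothetical and must be tuned on the drone.
import rospy
from apriltag_ros.msg import AprilTagDetectionArray
from geometry_msgs.msg import Twist

KP, KI, KD = 0.8, 0.0, 0.1      # illustrative PID gains
TARGET_DISTANCE = 1.0           # desired distance to the tag, in meters
DT = 1.0 / 30.0                 # assumed camera/control rate

class TagFollower:
    def __init__(self):
        self.prev_err = 0.0
        self.integral = 0.0
        self.cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/tag_detections', AprilTagDetectionArray, self.on_tags)

    def pid(self, err):
        self.integral += err * DT
        deriv = (err - self.prev_err) / DT
        self.prev_err = err
        return KP * err + KI * self.integral + KD * deriv

    def on_tags(self, msg):
        if not msg.detections:
            return  # tag lost; a real system would hover or search
        pose = msg.detections[0].pose.pose.pose  # tag pose in the camera frame
        cmd = Twist()
        # Forward/backward: hold the tag at TARGET_DISTANCE along camera z.
        cmd.linear.x = self.pid(pose.position.z - TARGET_DISTANCE)
        # Lateral: re-center the tag using its horizontal (camera x) offset.
        cmd.linear.y = -KP * pose.position.x
        self.cmd_pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('tag_follower')
    TagFollower()
    rospy.spin()
```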

Simulated Drone – YOLOv5-Based Active Object Tracking

A virtual drone is created using Gazebo and ROS to simulate realistic flight and camera behavior. The YOLOv5n model, optimized for lightweight performance, is trained to detect the target object (such as a Duckiebot) from simulated camera input. This detection output is processed to extract the object's location in the image frame, which is then used to control the drone's movement dynamically through velocity commands. The simulation allows testing under different motion patterns, lighting conditions, and levels of occlusion. This environment provides a safe and flexible platform to develop and validate deep learning-based tracking algorithms before real-world deployment.
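The sketch below shows this detection-to-velocity pipeline under stated assumptions: the weights file duckiebot_yolov5n.pt, the topic names, the gains, and the target apparent size are hypothetical placeholders rather than the project's actual configuration.

```python
#!/usr/bin/env python
# Minimal sketch of YOLOv5-based tracking in simulation. Weights path,
# topics, and gains are illustrative assumptions.
import rospy
import torch
from cv_bridge import CvBridge
from sensor_msgs.msg import Image
from geometry_msgs.msg import Twist

bridge = CvBridge()
# Load a custom-trained YOLOv5n model (hypothetical weights file).
model = torch.hub.load('ultralytics/yolov5', 'custom', path='duckiebot_yolov5n.pt')
cmd_pub = rospy.Publisher('/drone/cmd_vel', Twist, queue_size=1)
K_YAW, K_FWD = 0.005, 0.5  # illustrative proportional gains

def on_image(msg):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    results = model(frame)
    det = results.xyxy[0]  # (N, 6) tensor: x1, y1, x2, y2, conf, class
    if det.shape[0] == 0:
        return  # nothing detected; a real controller would search or hover
    x1, y1, x2, y2, conf, cls = det[0].tolist()  # highest-confidence box
    cx = (x1 + x2) / 2.0
    box_h = y2 - y1
    cmd = Twist()
    # Yaw toward the detection: error is the pixel offset from image center.
    cmd.angular.z = -K_YAW * (cx - frame.shape[1] / 2.0)
    # Approach until the box reaches an assumed target apparent size (40%).
    cmd.linear.x = K_FWD * (0.4 - box_h / frame.shape[0])
    cmd_pub.publish(cmd)

rospy.init_node('yolo_tracker')
rospy.Subscriber('/drone/camera/image_raw', Image, on_image)
rospy.spin()
```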

Project Videos

Project demonstration videos showcasing the AprilTag tracking and YOLOv5 simulation systems.

AprilTag Tracking on DD21 DuckieDrone
YOLOv5s Tracking Simulation Demo
Project Poster

Official project poster summarizing the research objectives, methodology, and key findings.

Active Object Tracking on DuckieDrone Poster
📥 Download Poster
Final Report

Comprehensive final report detailing the complete project development, implementation, and results.

Active Object Tracking on DuckieDrone Report
📥 Download Report
Team Members
Özgür Erkent
Project Advisor
Provides academic guidance and technical oversight for the autonomous drone tracking project. Expert in robotics, computer vision, and autonomous systems, offering strategic direction and ensuring research quality standards throughout the development process.
Abdullah Enes Yaman
AI Engineering Student
Student ID: 2200765012
🔗 GitHub Profile
Hikmet Mete Çelik
AI Engineering Student
Student ID: 2210765019
🔗 GitHub Profile
Acknowledgement
Special Thanks
The drones were provided by the project "Teaching SLAM with Autonomous Robots for Rescue Tasks", funded by the Bridge to Turkey Fund with sponsorship from NVIDIA.