This project focuses on designing a perception and navigation stack for the DJI Tello EDU quadcopter to autonomously navigate through the three stages of an obstacle course in an autonomous drone race. For more details about the race track and the different stages in it, please refer to project 5. This project is the culmination of the work done in project 3 and project 4, with an additional algorithm to handle the dynamic window in the third stage of the race.
- Install the Numpy, OpenCV, djitellopy, Ultralytics, torch, cudatoolkit, and matplotlib libraries before running the code.
- Install all the library dependencies mentioned here.
- Turn the drone on and connect to it (a minimal preflight sketch covering the dependency check and the connection follows this list).
- To run the main code, run the `Wrapper.py` file after installing all dependencies. This will save the final output folders in the `Code` folder itself.
- In the `Code` folder, run the following command (a sketch of how the `--model` flag is typically parsed follows this list):
  `python3 Wrapper.py --model=RAFT/models/raft-sintel.pth`
- In the `Code` folder we have the corresponding model weights for phase 1 and phase 2 in the `YOLO Model` and `RAFT` folders (a weight-loading sketch follows this list).
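A minimal preflight sketch for the setup steps above. It is not part of the repository; it only assumes the listed packages are installed and that the Tello is powered on with the computer joined to its Wi-Fi network. Package import names (e.g. `cv2` for OpenCV) follow their standard conventions.

```python
# Preflight sketch (illustrative, not part of the repo): verify the
# dependencies are importable, then confirm the Tello is reachable.
import importlib

for pkg in ["numpy", "cv2", "ultralytics", "torch", "matplotlib", "djitellopy"]:
    try:
        importlib.import_module(pkg)
        print(f"{pkg}: OK")
    except ImportError:
        print(f"{pkg}: missing - install it before running Wrapper.py")

# cudatoolkit is not an importable Python module; check GPU support via torch.
import torch
print("CUDA available:", torch.cuda.is_available())

# Connect to the drone after powering it on and joining its Wi-Fi network.
from djitellopy import Tello

tello = Tello()
tello.connect()                          # raises an exception if unreachable
print("Battery:", tello.get_battery(), "%")
```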
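The `--model` flag above points `Wrapper.py` at a RAFT checkpoint. The snippet below is only an illustration of how such a flag is typically parsed with `argparse`; the actual argument handling lives in `Wrapper.py` and may differ.

```python
# Hypothetical sketch of a --model flag like the one used above; the real
# parsing is done inside Wrapper.py and may look different.
import argparse

parser = argparse.ArgumentParser(
    description="Autonomous drone-race perception and navigation stack")
parser.add_argument("--model", default="RAFT/models/raft-sintel.pth",
                    help="path to the RAFT optical-flow checkpoint")
args = parser.parse_args()
print("Loading RAFT weights from:", args.model)
```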
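As a rough illustration of how those weights might be loaded, the sketch below uses the Ultralytics API for the YOLO weights and `torch.load` for the RAFT checkpoint. The YOLO filename `best.pt` is an assumption; check the `YOLO Model` folder for the actual name.

```python
# Illustrative loading of the two sets of weights; "best.pt" is an assumed
# filename, while the RAFT path matches the command shown above.
import torch
from ultralytics import YOLO

yolo_model = YOLO("YOLO Model/best.pt")   # detection weights (assumed filename)
raft_ckpt = torch.load("RAFT/models/raft-sintel.pth", map_location="cpu")

print("YOLO model loaded:", yolo_model is not None)
print("RAFT checkpoint entries:", len(raft_ckpt))
```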
For a detailed description, see the report here.
Link to a demo run for a random configuration of windows. Link to the run on the final race day.
Chaitanya Sriram Gaddipati - cgaddipati@wpi.edu
Shiva Surya Lolla - slolla@wpi.edu
Ankit Talele - amtalele@wpi.edu