
Robotic Wildfire Hotspotting


Over the past four years I have been working on a system that pairs an unmanned aerial vehicle (drone) with an unmanned ground vehicle (rover) to autonomously monitor loose embers (hotspots) during wildfires. Linked here is our paper published in IEEE. To see the codebase for the project, check the GitHub linked here.

This project was inspired by my father's experience as a forest firefighter in Venezuela. Wildfire hotspots are extremely dangerous because they can reignite a forest fire, yet current surveillance techniques are either highly manual or expensive. The goal of this project was to automate wildfire hotspotting using a drone, which provides an aerial view of the area, and a rover (in practice, a tractor or bulldozer) that investigates each of the aerially detected hotspots. The video to the right shows the fully functioning prototype. I built the two vehicles from scratch and wrote the obstacle-avoidance, pathfinding, and communication software that makes the system work together.

The Drone

The drone's design evolved over several months (iterations shown below). The first prototype was completely manual. One of the main challenges was finding parts that balanced weight, power, and communication capabilities while remaining compatible with the sensors needed to detect hotspots. The second and current prototype is fully autonomous, using two separate computers: a Pixhawk flight controller and a Raspberry Pi.
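As a sketch of how the Raspberry Pi can command the Pixhawk in this kind of two-computer setup, a DroneKit-Python takeoff sequence might look like the following (the serial port, baud rate, and target altitude are illustrative assumptions, not the project's exact values):

```python
import time


def reached_altitude(current_alt, target_alt, fraction=0.95):
    """True once the drone is within 95% of the requested takeoff altitude."""
    return current_alt >= fraction * target_alt


def takeoff(target_alt=20.0):
    # DroneKit speaks MAVLink to the Pixhawk; here it is assumed to be wired
    # to the Pi's UART at /dev/serial0 (a common but not universal setup).
    from dronekit import connect, VehicleMode

    vehicle = connect("/dev/serial0", baud=57600, wait_ready=True)

    vehicle.mode = VehicleMode("GUIDED")  # accept commands from the Pi
    vehicle.armed = True
    while not vehicle.armed:
        time.sleep(0.5)

    vehicle.simple_takeoff(target_alt)
    while not reached_altitude(vehicle.location.global_relative_frame.alt,
                               target_alt):
        time.sleep(1)
    return vehicle
```

Once the vehicle reaches altitude, the same connection can be reused to fly to the photo waypoint and trigger the camera.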


Manually Controlled Drone

Frame:  YoungRC F450 Drone Frame

Motors: ReadytoSky 2212 920KV Brushless

Motor Controller (ESC): iFlight SucceX-E 45A BLHeli_S

Battery:  6000mAh LiPo battery + XT60 plug

Remote Control:  FlySky FS-i6X

Remote Control Tx/Rx: FlySky RC Tx/Rx 


Current Autonomous Drone

Frame: YoungRC F450 Drone Frame

Motors: ReadytoSky 2212 920KV Brushless

Motor Controller (ESC): ReadytoSky 40A brushless speed controller

Batteries:

- Drone flight instruments: 4S 3300mAh 14.8V LiPo

- Raspberry Pi: 10000mAh USB power bank

Remote Control:  FrSky Taranis Q x7

Remote Control Tx/Rx: FrSky 2.4GHz Access R8 Pro

Telemetry: AeroSky Telemetry

Flight Computer/Controller: Pixhawk 2.4.8

Computer: Raspberry Pi 4 4GB (incl. case for protection); running Raspberry Pi OS

GPS: ReadytoSky M8N GPS

Camera: 1080p USB camera (connected to Raspberry Pi)

The Rover

The rover was also built to be autonomous, using a Pixhawk flight controller and a Jetson Nano running ROS to handle obstacle avoidance.
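A minimal sketch of the rover's safety-stop layer, assuming a standard ROS 1 setup in which the lidar driver publishes `sensor_msgs/LaserScan` on `/scan` and the drivetrain listens for `geometry_msgs/Twist` on `/cmd_vel` (the topic names, stop distance, and creep speed are assumptions, not the project's exact values):

```python
import math


def obstacle_ahead(ranges, threshold_m=0.5):
    """True if any valid lidar return in the scan is closer than threshold.

    Zero and non-finite readings (common in LaserScan messages for misses)
    are treated as "no return" and ignored.
    """
    return any(r < threshold_m
               for r in ranges
               if r and math.isfinite(r) and r > 0.0)


def main():
    # ROS imports are kept inside main() so the helper above stays usable
    # (and testable) on machines without ROS installed.
    import rospy
    from geometry_msgs.msg import Twist
    from sensor_msgs.msg import LaserScan

    rospy.init_node("rover_safety_stop")
    cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)

    def on_scan(scan):
        cmd = Twist()
        # Stop if anything is within half a meter; otherwise creep forward.
        cmd.linear.x = 0.0 if obstacle_ahead(scan.ranges) else 0.3
        cmd_pub.publish(cmd)

    rospy.Subscriber("/scan", LaserScan, on_scan)
    rospy.spin()


if __name__ == "__main__":
    main()
```

In the real system this layer would sit underneath the waypoint follower, overriding forward motion whenever the lidar sees something the planned path missed.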


Frame: 1/10 RC Crawler SCX 10 II incl motor

Motor Controller (ESC):

- Traction: Fullfun RC brushed speed controller (1/10 Traxxas)

- Steering: 20KG Digital servo High Torque, 4.8-6.8v

Battery:

- Ground motion instruments: 4S 3300mAh 14.8V LiPo

- NVIDIA Jetson Nano: 10000mAh USB power bank

Remote Control:  FrSky Taranis Q x7

Remote Control Tx/Rx: FrSky 2.4GHz Access R8 Pro

Telemetry: AeroSky Telemetry

Ground Motion Computer/Controller: Pixhawk 2.4.8

Computer: NVIDIA Jetson Nano 4GB (incl. wireless module and case for protection); running Linux4Tegra OS

GPS: ReadytoSky M8N GPS

LIDAR: Slamtec RPLidar A1M8, 2D 360-degree, 12 m range

Cables: assorted connectors and wiring

System Overview 

The system consists of three main components: a drone, a rover, and a ground station (the drone and rover are shown in the images below). All devices communicate through a server powered by Apache Kafka, to which data and commands are sent. The order of operations for the system is:

  1. The drone lifts off, travels to its designated waypoint, and takes a photo of the area in which the rover will be moving.

  2. The image is then sent to the ground station, which creates a path using a modified, pixel-weighted A* pathfinding algorithm. The path includes points of interest detected by the drone (suspected hotspots). Each pixel in the path is converted to GPS coordinates.

  3. The GPS coordinates are then sent to the rover. The rover follows the path and adjusts to any obstacles overlooked by A* with an obstacle-avoidance algorithm.
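Steps 2 and 3 above can be sketched in Python. The weighted A* below treats each pixel's weight as the cost of stepping onto it (with `None` marking an impassable obstacle), and the pixel-to-GPS helper assumes a north-aligned aerial photo with known corner coordinates. The function names, grid layout, and corner-coordinate mapping are illustrative assumptions, not the project's exact implementation:

```python
import heapq


def astar_weighted(weights, start, goal):
    """A* over a 2D grid where weights[r][c] >= 1 is the cost of stepping
    onto that pixel and None marks an impassable obstacle. 4-connected
    moves with a Manhattan-distance heuristic (admissible when all weights
    are at least 1). Returns a list of (row, col) cells, or None if the
    goal is unreachable."""
    rows, cols = len(weights), len(weights[0])

    def h(cell):  # cheapest possible remaining cost
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # (f, g, cell)
    came_from = {}
    best_g = {start: 0}

    while open_heap:
        _, g, cell = heapq.heappop(open_heap)
        if cell == goal:
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        if g > best_g.get(cell, float("inf")):
            continue  # stale heap entry
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and weights[nr][nc] is not None:
                ng = g + weights[nr][nc]
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    came_from[(nr, nc)] = cell
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None


def pixel_to_gps(px, py, img_w, img_h, nw, se):
    """Linearly map an image pixel to (lat, lon), assuming the photo is
    north-aligned with known NW and SE corner coordinates. The real system
    would also account for the drone's pose and camera geometry."""
    lat = nw[0] + (se[0] - nw[0]) * (py / img_h)
    lon = nw[1] + (se[1] - nw[1]) * (px / img_w)
    return lat, lon
```

Raising the weight of pixels near detected obstacles is what lets the planner steer the rover around them before the onboard lidar ever has to react.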

Publication

My friend Charles Jiang and I had the opportunity to publish our project with IEEE. Click here to see the presentation we gave.
