When Alex Battikha (11) walked along the San Diego shores with his dad last June, he noticed trash everywhere. What further concerned him was that the trash was not only on the beach, but also far out in the ocean, where it would be difficult to clean up. The observation prompted Battikha to wonder what technology currently exists to address pollution in different environments; his research revealed that existing solutions were extremely limited. So, he took it upon himself to create a solution of his own.
“I wanted to come up with a system that’s able to address environmental pollution in different types of environments, not just oceanic pollution, and can do it over a very large region,” Battikha said. “So, that made me start exploring drones.”
Battikha’s idea was to create an unmanned aerial vehicle (UAV) with an AI computer vision system that could determine the location of trash and send that location information back to its user in real time. He said that he aimed to revolutionize drone-based environmental monitoring while prioritizing cost-effectiveness and scalability, knowing that the process would not be simple. He had to build a drone, figure out the machine learning aspect, and collect data for it, all in one summer before the rush of his junior year started.
“I actually started completely from scratch with a single carbon fiber rod and my laptop,” Battikha said. “I’ve been doing robotics for about 10 years now, so I was able to apply a lot of past experience to my project.”
Battikha began his project by using computer-aided design software to design the UAV base frame in 3D, which he later 3D printed using carbon fiber filament. He also had to consider cost, balance, and compatibility. To ensure that his design was optimal, he sought input from professionals.
“I reached out to an engineer from NASA who reviewed the design before it was fabricated, and then I assembled it on my own during the summer,” Battikha said.
The drone took two months to design and build. When he wasn’t building his drone, Battikha worked on his code to detect pollution. The main problem he encountered was with his vision system.
“I think one of the biggest challenges was making the vision system robust and trying to adapt it to different types of environments,” he said.
Battikha initially used the YOLOv8 architecture, a deep-learning computer vision model that supports training for real-time object detection, classification, and segmentation. However, he found that it required too much training data to be efficient.
“The very first iteration required me to label 5,000 images, which was very tedious,” Battikha said. “I wanted to make [the system] very fast for other scientists and researchers to label and train it and then eventually deploy it on the drone, so I made a lightweight version of YOLOv8, which just augments the data and exposes it to a different type of environment.”
The newly designed vision system achieved an accuracy of 95.7% and reduced the number of images needed for training by 90%. Unlike other vision detection models, Battikha's required far less data and could be iterated on in a very short time.
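The article does not include Battikha's code, but the general technique he describes — stretching a small hand-labeled set by augmenting each image and exposing the model to varied conditions — can be sketched in a few lines. Everything below (function names, the toy image, the normalized YOLO-style box format) is an illustrative assumption, not his actual implementation:

```python
# Label-preserving data augmentation: each hand-labeled image is
# transformed (here, horizontally flipped) and its bounding boxes are
# adjusted to match, multiplying the effective training set without any
# extra labeling effort. This is a minimal stdlib-only sketch.

def hflip_image(pixels):
    """Horizontally flip an image given as a list of pixel rows."""
    return [list(reversed(row)) for row in pixels]

def hflip_box(box):
    """Mirror a normalized (x_center, y_center, w, h) box to match."""
    x, y, w, h = box
    return (1.0 - x, y, w, h)

def augment(sample):
    """Yield the original labeled sample plus a flipped copy."""
    pixels, boxes = sample
    yield pixels, boxes
    yield hflip_image(pixels), [hflip_box(b) for b in boxes]

image = [[0, 1, 2], [3, 4, 5]]       # toy 2x3 "image"
labels = [(0.25, 0.5, 0.2, 0.4)]     # one normalized bounding box
augmented = list(augment((image, labels)))
print(len(augmented))                # two training samples from one label
```

In practice, a pipeline like this would stack several transforms (flips, crops, color shifts simulating different environments), which is how a 90% reduction in labeling effort becomes plausible.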
“I was comparing my drone to one of the commercial products from a company called Parrot, which makes similar drones, but their whole system costs $4,500 and their processing speeds are about half the rate I was able to achieve,” Battikha said. “My total cost was $1,000.”
To use Battikha’s computer system and drone, its user would first go out into the designated environment to be monitored and record a one- to two-minute video of the different types of plastics they are trying to detect. The user would then process that data by showing the computer what they were trying to detect using bounding boxes: rectangles that surround an object and specify its location. Battikha’s vision model would then autonomously generate an algorithm to find where the objects were within the drone’s frame of reference.
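The bounding-box step above amounts to converting a rectangle the user draws in pixels into the normalized center/size coordinates that YOLO-family trainers consume. The article does not specify Battikha's exact format, so the function below is a hedged sketch of the standard convention:

```python
# Convert a pixel-coordinate rectangle (corners) into the normalized
# (x_center, y_center, width, height) form used by YOLO-style labels.
# All four outputs are fractions of the image dimensions, so labels stay
# valid if the video is later resized.

def to_yolo_box(x_min, y_min, x_max, y_max, img_w, img_h):
    """Convert pixel corners to normalized center/size coordinates."""
    x_center = (x_min + x_max) / 2 / img_w
    y_center = (y_min + y_max) / 2 / img_h
    width = (x_max - x_min) / img_w
    height = (y_max - y_min) / img_h
    return (x_center, y_center, width, height)

# A 100x50-pixel box drawn at the top-left of a 640x480 video frame:
box = to_yolo_box(0, 0, 100, 50, 640, 480)
print(box)
```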
“The vision model would be running on the drone, and when flying the drone, the vision model is simultaneously detecting those objects and sending the user back that data in real time,” he said.
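The real-time reporting he describes — the drone detecting objects and streaming the results back to the operator — might look like the sketch below. The article only says that location data is sent back in real time; the field names and JSON transport here are assumptions for illustration:

```python
# Package one frame's detections as a small JSON message, the kind of
# payload a drone could stream back to its operator over a radio or
# Wi-Fi link. Each detection carries a label, a confidence score, and a
# normalized bounding box.
import json

def detection_message(frame_id, detections):
    """Serialize one frame's detections (label, confidence, box)."""
    return json.dumps({
        "frame": frame_id,
        "detections": [
            {"label": label, "confidence": conf, "box": box}
            for label, conf, box in detections
        ],
    })

msg = detection_message(42, [("plastic_bottle", 0.93, [0.4, 0.6, 0.1, 0.2])])
print(msg)
```

Keeping the per-frame payload this small is what makes real-time reporting feasible on a budget telemetry link.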
Battikha sees incredible promise in his technology and hopes to continue improving it.
“I think it’s a revolution of environmental monitoring using aerial vehicles because they can see such a large distance and go so far, but I think it’s unique in the sense that it can also be adapted to other regions, so I think the potential is very high,” Battikha said. “The end goal is to use this drone with an autonomous submarine to go out, detect plastic, and then actually bring it back to land.”