
Developer uses drone and AI to find and recognize SOS messages

Here’s a cool story of drones being used for good! A developer is using aerial images taken with a drone, a DJI Mavic Pro, in combination with artificial intelligence (AI) software to recognize SOS messages painted on the streets of Puerto Rico.

Finding and recognizing SOS messages by drone

A developer’s journey from attending a Call for Code hackathon to open-sourcing drone tech as one of Code and Response’s first projects

By Pedro Cruz, IBM Developer Advocate and Founder of DroneAid

On September 20, 2017, Hurricane Maria struck my home, Puerto Rico. After surviving the record-breaking Category 5 storm and being personally affected by its aftermath, I decided I was going to make it my mission to create technology that could help mitigate the impact hurricanes have on our island.

Inspired by Call for Code

Can you imagine trying to plan relief efforts for more than three million people? People in rural areas, including a community in Humacao, Puerto Rico, suffered the most. The people in this community were frustrated that help was promised but never came. So, the community came together and painted “water” and “food” on the ground as an SOS, in the hope that helicopters and planes would see their message. For me, it was sad and frustrating to see that the reality outside of the metro area was so different. Lives were at risk.

Fast-forward to August 2018. Less than a year after the hurricane hit, I attended the Call for Code Puerto Rico Hackathon in Bayamón, Puerto Rico. I was intrigued by this global challenge that asks developers to create sustainable solutions to help communities prepare for, respond to, and recover from natural disasters.

The SOS messages after the hurricane inspired me to develop DroneAid, a tool that uses visual recognition to detect and count SOS icons on the ground from drone video streams overhead, and then automatically plots the emergency needs captured via video on a map for first responders. I thought drones could be the perfect solution for rapidly assessing damage from the air, capturing images that could then be processed by AI computer vision systems. At first, I thought of using OCR (optical character recognition) technologies to detect letters. The problem with that approach is that everyone has different handwriting, and making it work across other languages would be very complex.

After a few hours of coding, I pivoted and decided to simplify the visual recognition to work with a standard set of icons. These icons could be drawn with spray paint, chalk, or even placed on mats. Drones could detect those icons and communicate a community’s specific needs for food, water, and medicine to first responders. I coded the first iteration of DroneAid at that hackathon and won first place. This achievement pushed me to keep going. In fact, I joined IBM as a full-time developer advocate.

DroneAid is so much more than a piece of throwaway code from a hackathon. It’s evolved into an open-source project that I am excited to announce today. I’m thrilled that IBM is committed to applying our solution through Code and Response, the company’s unique $25 million program dedicated to the creation and deployment of solutions powered by open source technology to tackle the world’s biggest challenges.

Open-sourcing DroneAid through Code and Response

DroneAid leverages a subset of standardized icons released by the United Nations. These symbols can either be provided in a disaster preparedness kit ahead of time or recreated manually with materials someone may have on hand. A drone can survey an area for these icons placed on the ground by individuals, families, or communities to indicate various needs. As DroneAid detects and counts these images, they are plotted on a map in a web dashboard. This information is then used to prioritize the response of local authorities or organizations that can provide help.
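To make that data flow concrete, here is a minimal sketch, in TypeScript, of the kind of detection record such a survey could produce and send to a map dashboard. The icon set, field names, and endpoint below are illustrative assumptions for this article, not DroneAid’s actual schema or API.

```typescript
// Illustrative sketch only: the icon set, fields, and endpoint are assumptions,
// not DroneAid's actual data model.

type NeedIcon = "SOS" | "Water" | "Food" | "Medicine";

interface IconDetection {
  icon: NeedIcon;      // which standardized symbol was recognized
  count: number;       // how many instances were seen at this location
  latitude: number;    // approximate ground position, e.g. from drone telemetry
  longitude: number;
  detectedAt: string;  // ISO timestamp of the video frame
}

// Post a batch of detections to a (hypothetical) dashboard endpoint so they
// can be plotted on the responders' map.
async function reportDetections(detections: IconDetection[]): Promise<void> {
  await fetch("/api/detections", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(detections),
  });
}
```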

From a technical point of view, that means that a visual recognition AI model is trained on the standardized icons so that it knows how to detect them in a variety of conditions (e.g., when they are distorted, faded, or in low-light conditions). IBM’s cloud annotations tool makes it straightforward to train the model using IBM Cloud Object Storage. This model is applied to a live stream of images coming from the drone as it surveys the area. Each video frame is analyzed to see whether any icons are present. If they are, their locations are captured and they are counted. Finally, this information is plotted on a map indicating the location and number of people in need.
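In code, the per-frame step might look roughly like the sketch below. The Detection shape and the Detector callback are placeholders for whatever the trained model’s inference wrapper actually returns; the point here is only the thresholding and per-icon tallying.

```typescript
import * as tf from "@tensorflow/tfjs";

// Minimal sketch of per-frame analysis. Detection and Detector are assumed
// placeholders, not the real output format of the trained model.
interface Detection {
  label: string;   // e.g. "SOS", "Water", "Food"
  score: number;   // confidence in [0, 1]
  bbox: [number, number, number, number]; // [x, y, width, height] in pixels
}

type Detector = (frame: tf.Tensor3D) => Promise<Detection[]>;

// Analyze one video frame: keep confident detections and tally them per icon type.
async function analyzeFrame(
  detect: Detector,
  frame: tf.Tensor3D,
  threshold = 0.6
): Promise<Map<string, number>> {
  const detections = await detect(frame);
  const counts = new Map<string, number>();
  for (const d of detections) {
    if (d.score >= threshold) {
      counts.set(d.label, (counts.get(d.label) ?? 0) + 1);
    }
  }
  return counts;
}
```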

The system can be run locally by following the steps in the source code repository, starting with a simple Tello drone example. Any drone that can capture a video stream can be used, since the machine learning model leverages TensorFlow.js in the browser. This way we can capture the stream from any drone and apply inference to that stream. This architecture can then be applied to larger drones, different visual recognition types, and additional alerting systems.
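To illustrate that browser-side loop, here is a rough sketch that samples frames from an HTML video element assumed to already be playing the drone’s stream and converts each one to a tensor with TensorFlow.js. How the stream reaches the page (for example, a local relay for the Tello’s feed) is outside the sketch, and the onFrame callback is where an icon detector like the one above would run.

```typescript
import * as tf from "@tensorflow/tfjs";

// Sketch of the browser-side capture loop, assuming the drone's stream is
// already playing in an HTML <video> element on the page.
function surveyLoop(
  video: HTMLVideoElement,
  onFrame: (frame: tf.Tensor3D) => Promise<void>,
  intervalMs = 500
): void {
  // Sample a frame every few hundred milliseconds rather than on every render,
  // since icons on the ground change slowly relative to the video frame rate.
  setInterval(async () => {
    const frame = tf.browser.fromPixels(video); // HxWx3 tensor from the current frame
    try {
      await onFrame(frame); // e.g. run analyzeFrame from the sketch above
    } finally {
      frame.dispose();      // release the tensor's memory after each frame
    }
  }, intervalMs);
}
```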

Calling all developers to collaborate in the DroneAid open source community

It’s been quite a journey so far, and I feel like we’re just getting started. Let’s unite to help reduce loss of life, get victims what they need in a timely manner, and lessen the overall effects a natural disaster has on a community.

Our team decided to open-source DroneAid because I feel it’s important to make this technology available to as many people as possible. The standardized icon approach can be used around the world in many natural disaster scenarios (e.g., hurricanes, tsunamis, earthquakes, and wildfires), and having developers contribute by training the software on an ongoing basis can help increase our efficiency and expand how the symbols can be used together. We built the foundation for developers to create new applications, and we envision using this technology to deploy and control a fleet of drones as soon as a natural disaster hits.

Now that you understand how DroneAid can be applied to help communities in need, join us and contribute here: https://github.com/code-and-response/droneaid

This article was also posted on IBM’s Developer blog.



