Projects

The 2018 California wildfires created havoc: in just a couple of months, a few incidents destroyed a total of 1,250,467 acres, with the Mendocino Complex Fire being the worst of them, according to a report by the California Department of Forestry. With a substantial increase in annual disasters, especially climate-related ones, our disaster relief and emergency response techniques are being tested, and they call for improvement. Four technology enthusiasts, Tyler Gragg, Josh Dittmar, Michael Henley, and David Beaudette, set out to find a solution to this problem using drones at AUVSI’s Xbuild Hackathon.

Their solution, Drone Doctor, aims to conduct rapid remote triage from a drone video feed at the scene of a mass casualty. The team considered use cases such as a mass shooting, a train wreck with hazardous materials, or a chemical attack, and found an innovative way to control a drone despite the limited dexterity common to hazmat and similar anti-exposure suits.

What did they build?

When disaster strikes, chaos and panic erupt, making it challenging for first-responder teams to act effectively. More often than not, they walk into dangerous situations with little idea of what to expect. The aim of the solution is to send a fleet of autonomous drones as the first response to the scene of a disaster. The drones collect information about the severity of the scene (the number of injured people, their age, gender, and other basic details) and send the recorded data to the first responders so that they can act in a decisive and efficient manner.

Gesture Controlled Drones

Alternatively, drones can be used alongside hazmat and anti-exposure suits for on-site assessment and triage. Because these suits limit dexterity, controlling a drone with a conventional remote controller is difficult. To overcome this hurdle, the team integrated a Myo armband, a wearable gesture- and motion-control device, making it easier to maneuver the drone despite the limited dexterity imposed by hazmat suits.
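The gesture layer can be thought of as a simple lookup from recognized poses to drone commands. The following is an illustrative sketch, not the team's actual code: the pose names are the Myo SDK's standard gestures, while the command strings are hypothetical placeholders.

```python
# Hypothetical mapping of standard Myo poses to drone commands.
GESTURE_TO_COMMAND = {
    "fist": "land",
    "fingers_spread": "take_off",
    "wave_in": "yaw_left",
    "wave_out": "yaw_right",
    "double_tap": "hold_position",
}

def gesture_to_command(pose: str) -> str:
    """Translate a recognized Myo pose into a drone command string."""
    # Unrecognized poses fall back to a safe hover, so stray EMG noise
    # never triggers an unintended maneuver.
    return GESTURE_TO_COMMAND.get(pose, "hover")
```

A design choice like the "hover" fallback matters in this setting: a gloved, suited operator cannot quickly grab a backup controller if a misread gesture sends the drone somewhere unexpected.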

Following are the three steps their system goes through to conduct a successful remote triage at the scene of a mass casualty.

  • First, the team launches and controls the drone using the Myo armband interfaced with FlytBase, guiding it to the disaster site despite the operator’s limited dexterity.
  • The photo and video feed of the mass casualty scene is tagged with location metadata and sent from the onboard mission computer (a Raspberry Pi) via data links to the IBM Cloud (Watson).
  • IBM Watson uses a generic classifier and a custom remote-triage classifier to assess each image for initial triage and to identify whether a person is prone, sitting, or standing. Finally, scene intelligence is pushed to the on-scene commander for allocating the resources needed to manage the mass casualty situation.
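The second step above, tagging a captured frame with location metadata before upload, can be sketched as follows. This is a minimal illustration of the payload structure, not the team's implementation; the field names are assumptions.

```python
import base64
import time

def build_triage_payload(image_bytes: bytes, lat: float, lon: float) -> dict:
    """Bundle one captured frame with location metadata for cloud upload."""
    return {
        "timestamp": time.time(),              # capture time (epoch seconds)
        "location": {"lat": lat, "lon": lon},  # drone GPS fix at capture
        # Base64-encode the raw frame so it can travel in a JSON body.
        "image_b64": base64.b64encode(image_bytes).decode("ascii"),
    }
```

On the Raspberry Pi, a payload like this would be serialized to JSON and posted over the data link to the cloud classifier.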

Challenges faced:

The team focused on the most likely mass casualty situations, such as mass shootings and chemical attacks. To handle such situations, the team modeled its triage workflow on the SALT protocol: Sort, Assess, Lifesaving Interventions, Treatment/Transport.

In addition, the Sort step included the following assessments:

Assess 1 – Still or obvious life threat
Assess 2 – Wave or purposeful movement
Assess 3 – Ambulatory

The main challenge the team faced was the seamless integration of three different technologies, viz. drones, gesture sensor, and AI engine, into an automated workflow. First, the movements of the drone needed to be interfaced with the Myo armband, enabling remote control of the drones while wearing hazmat suits. Second, the captured data had to be sent to the cloud for post-processing, which required integrating IBM Watson into their workflow.

How did FlytBase help?

FlytBase provided the platform to integrate the drones with the Myo armband and to seamlessly capture and send data to the cloud, where it is fed into IBM Watson for AI-based scene recognition. The team used FlytAPIs to achieve this integration, to send navigation commands, and to view live telemetry of the drones. FlytSim, the virtual drone simulator, made it easy to simulate the drones in a safe, secure, and reliable way.
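A navigation command such as takeoff is typically issued as a REST call against the onboard FlytOS server. The sketch below only builds the request; the endpoint path follows the documented FlytOS REST convention, but the host, namespace, and altitude are placeholder values for this example.

```python
def takeoff_request(host: str, namespace: str, altitude_m: float):
    """Build the URL and JSON body for a FlytOS-style take_off REST call."""
    # FlytOS exposes navigation APIs under /ros/<namespace>/navigation/...
    url = f"http://{host}/ros/{namespace}/navigation/take_off"
    body = {"takeoff_alt": altitude_m}
    return url, body
```

In practice this request would be sent with an HTTP client (e.g. `requests.post(url, json=body)`), with the gesture layer deciding when to fire it.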


“Working with FlytBase was super easy for us. I click a button and instantly I have a simulated aircraft on a cloud-based platform. We used the FlytSim software to launch a simulated drone and used navigation and telemetry APIs to control the drone using a Myo armband interfaced with FlytBase to arrive at the disaster site.” – Team Drone Doctor.

In spite of all the challenges, the team configured the entire system within the hackathon’s 24-hour duration. They showcased an impressive live demo of the solution, exhibiting an automated workflow, and Team Drone Doctor was declared second runner-up at the Xbuild Hackathon.

Wish to conduct a drone hackathon and need FlytBase as a partner/sponsor? Visit flytbase.com/flytcode and write to us.

Build automated and scalable drone applications with FlytBase, better and faster. Sign up to download the FlytBase eBrochure.

Write to us at info@flytbase.com or schedule a demo with our expert to learn more.
