Project Details

The Challenge | Spot That Fire V2.0

Your challenge is to create an application that leverages NASA's near-real-time and archival wildfire datasets along with other tools to support firefighting and fire mitigation efforts. This challenge builds on last year’s challenge of the same name by calling for innovative ideas and apps that focus on how to engage and enable citizens to assist with the entire firefighting and fire mitigation process.

FireTracker

Leverage crowdsourcing to mitigate wildfires, providing safety measures to citizens and relevant data to authorities.

FireTracker

In a time of climate change, in a world facing numerous ecological challenges, solutions can't come from individuals alone; they must come from the community.



Over the past few years, wildfires have received growing media coverage and public awareness is rising, as demonstrated by the global interest in this summer's Amazon wildfires.
On the other hand, fields such as big data, data mining, and crowdsourcing provide efficient tools to make every small individual action matter.

This is what FireTracker does: leveraging data to let citizens take part in firefighting efforts, involving them in the world they live in.

But how does FireTracker do it?

  • Allow citizens to report and photograph any wildfire they encounter, with a single touch in the FireTracker app.
  • Ensure the safety of the individual by showing them the route to the closest safe place.
  • Using the report's GPS coordinates, cross-check it against NASA's FIRMS database (Fire Information for Resource Management System); see the sketch after this list.
  • Use machine learning to filter out spam (detection of fire and smoke in the photograph).
  • Notify firefighters and give them additional information, such as near-real-time satellite imagery from ESA's Sentinel database and the exact position of the fire.
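
A minimal sketch of the FIRMS cross-check step, assuming the active-fire detections are fetched as a CSV export (the URL below is a placeholder, not the endpoint actually used by the app): a report is considered corroborated when a satellite detection lies within a few kilometres of the reported GPS position.

    import csv
    import io
    import math

    import requests

    # Placeholder URL: the real FIRMS export endpoint and API key used by the app differ.
    FIRMS_CSV_URL = "https://firms.modaps.eosdis.nasa.gov/path/to/active_fires.csv"

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two WGS84 points, in kilometres."""
        r = 6371.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def cross_check_report(report_lat, report_lon, radius_km=5.0):
        """Return FIRMS detections within radius_km of the citizen report."""
        rows = csv.DictReader(io.StringIO(requests.get(FIRMS_CSV_URL, timeout=30).text))
        return [
            row for row in rows
            if haversine_km(report_lat, report_lon,
                            float(row["latitude"]), float(row["longitude"])) <= radius_km
        ]

    if __name__ == "__main__":
        matches = cross_check_report(-9.47, -56.10)  # example coordinates in the Amazon basin
        print(f"{len(matches)} FIRMS detection(s) near the reported position")

The latitude/longitude column names match FIRMS CSV exports; the 5 km matching radius is an arbitrary choice for illustration.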

What is the underlying magic?

The FireTracker app relies solely on open-source APIs and benefits from the open-data policies of ESA and NASA, bringing together the best of both worlds!
Moreover, it uses state-of-the-art imagery and machine learning techniques to process data accurately.
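
As a rough illustration of the machine-learning side (an assumption based on the AlexNet reference listed below, not the team's actual training code), a torchvision AlexNet can be adapted into a binary fire/smoke classifier used to reject spam photographs:

    import torch
    import torch.nn as nn
    from PIL import Image
    from torchvision import models, transforms

    def build_fire_classifier():
        """AlexNet backbone with a 2-class head: fire/smoke vs. no fire."""
        # Newer torchvision versions use weights=AlexNet_Weights.DEFAULT instead of pretrained=True.
        model = models.alexnet(pretrained=True)
        model.classifier[6] = nn.Linear(4096, 2)
        return model

    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],   # standard ImageNet statistics
                             std=[0.229, 0.224, 0.225]),
    ])

    def looks_like_fire(model, photo_path):
        """Return True when the classifier believes the photo shows fire or smoke."""
        model.eval()
        x = preprocess(Image.open(photo_path).convert("RGB")).unsqueeze(0)
        with torch.no_grad():
            probs = torch.softmax(model(x), dim=1)
        return probs[0, 1].item() > 0.5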

How did we build it?

Servers: Firebase / Cloudinary
App development: Kodular
Processing code language: Python
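
As an example of how the pieces could fit together (the database node, field names, and credentials below are placeholders, not the project's real configuration), the Python processing side might pull citizen reports from Firebase and download the associated Cloudinary photos like this:

    import firebase_admin
    import requests
    from firebase_admin import credentials, db

    # Placeholder credentials and URL: the actual project configuration is not part of this write-up.
    cred = credentials.Certificate("serviceAccountKey.json")
    firebase_admin.initialize_app(cred, {"databaseURL": "https://<project-id>.firebaseio.com"})

    def fetch_pending_reports():
        """Read citizen reports pushed by the Kodular app into a 'reports' node."""
        reports = db.reference("reports").get() or {}
        return [r for r in reports.values() if not r.get("processed")]

    def download_photo(report, dest="photo.jpg"):
        """Download the report photo that the app uploaded to Cloudinary."""
        resp = requests.get(report["photo_url"], timeout=30)
        resp.raise_for_status()
        with open(dest, "wb") as f:
            f.write(resp.content)
        return dest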


Some references!

  • Sentinel Hub: API to access Sentinel imagery.
  • SentinelHub-py: Python access to the Sentinel Hub API (a usage sketch follows this list).
  • FIRMS: Access to the FIRMS API.
  • AlexNet: Literature on the neural network used for fire recognition.
  • Kodular: Documentation on Kodular, used to design the app.
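
To give an idea of what the Sentinel access looks like, the snippet below requests a recent true-colour Sentinel-2 tile around a reported fire position with SentinelHub-py; it follows the SentinelHubRequest interface of recent releases, and the credentials, dates, and coordinates are placeholders rather than project values.

    from sentinelhub import (
        SHConfig, BBox, CRS, DataCollection, MimeType,
        SentinelHubRequest, bbox_to_dimensions,
    )

    # Placeholder credentials: a Sentinel Hub account is required.
    config = SHConfig()
    config.sh_client_id = "YOUR_CLIENT_ID"
    config.sh_client_secret = "YOUR_CLIENT_SECRET"

    # Small bounding box around a reported fire position (placeholder coordinates).
    fire_lat, fire_lon = -9.47, -56.10
    bbox = BBox([fire_lon - 0.05, fire_lat - 0.05, fire_lon + 0.05, fire_lat + 0.05], crs=CRS.WGS84)
    size = bbox_to_dimensions(bbox, resolution=20)  # roughly 20 m per pixel

    # Simple true-colour evalscript (Sentinel-2 bands B04, B03, B02).
    evalscript = """
    //VERSION=3
    function setup() {
      return { input: ["B04", "B03", "B02"], output: { bands: 3 } };
    }
    function evaluatePixel(s) { return [s.B04, s.B03, s.B02]; }
    """

    request = SentinelHubRequest(
        evalscript=evalscript,
        input_data=[SentinelHubRequest.input_data(
            data_collection=DataCollection.SENTINEL2_L1C,
            time_interval=("2019-08-01", "2019-08-15"),
        )],
        responses=[SentinelHubRequest.output_response("default", MimeType.PNG)],
        bbox=bbox,
        size=size,
        config=config,
    )

    image = request.get_data()[0]  # numpy array with the requested tile
    print(image.shape)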


GitHub, because sharing is caring!

GitHub