Last week the Hague Data Science Initiative was in New York City, attending UN-OCHA Centre for Humanitarian Data’s workshop on Predictive Analytics and the Future of Humanitarian Response.
The workshop included speakers from UN-OCHA, the World Bank and the Red Cross – all of whom have current, active humanitarian predictive analytics projects. Also presenting at the workshop was writer Cathy O’Neil, author of the seminal book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.
So why are we interested in Predictive Analytics for Humanitarian Response?
Predictive analytics is the use of quantitative data to make predictions. It isn’t a new field – statisticians have been making such predictions for a long time – but the power and accuracy of these predictions have improved greatly with advances in computer science. As well as giving us access to much larger amounts of data, computer science has introduced algorithms that learn, improve and refine the models which process and analyse this data, and therefore make more reliable and robust predictions.
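In code terms, even the simplest predictive model follows the pattern described above: fit parameters to historical data, then use them to forecast. The sketch below is a minimal illustration using an ordinary least-squares trend line on invented displacement figures – real humanitarian models draw on far richer data and machine-learning methods.

```python
# Minimal sketch of predictive analytics: fit a model to historical
# observations, then use it to forecast the next value.
# All figures below are invented, purely for illustration.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

def predict(a, b, x):
    """Forecast y at a new point x using the fitted line."""
    return a + b * x

# Hypothetical monthly counts of displaced people (months 1-5).
months = [1, 2, 3, 4, 5]
counts = [1000, 1200, 1350, 1600, 1750]

a, b = fit_line(months, counts)
forecast = predict(a, b, 6)  # projected count for month 6
```

The "learning" that modern algorithms add amounts to automating and generalising this fitting step across many more variables and much larger datasets.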
The private sector has been using predictive analytics for decades in areas such as transportation (route optimisation), virtual assistants, chatbots and fraud detection.
The humanitarian sector has recently recognised the potential of predictive analytics for humanitarian response. In March 2018, UN Under-Secretary-General Mark Lowcock publicly spoke of the humanitarian sector’s interest in the topic:
“What we need to do is to move from today’s approach, where we watch disaster and tragedy build, gradually decide to respond and then mobilize money and organizations to help; to an anticipatory approach where we plan in advance for the next crises, putting the response plans and the money for them in place before they arrive, and releasing the money and mobilizing the response agencies as soon as they are needed.”
This mobilised the humanitarian sector to do further work on predictive analytics, particularly in areas such as humanitarian financing, impact assessment, predicting movements of people (refugees, migrants, IDPs), improving supply chains, identifying poor performers (and therefore allocating funding more appropriately) and re-thinking labour costs (i.e. how many field staff need to be where).
The Hague-based UN-OCHA Centre for Humanitarian Data has made several notable steps towards utilising predictive analytics for humanitarian response. In April this year it held a two-day workshop in The Hague on the topic of predictive analytics. The aim of the workshop was to “exchange information about predictive analytics initiatives and to identify gaps, challenges and opportunities related to the application of predictive models in crises.” Presentations were made by six of the most advanced predictive analytics projects in the sector:
- Project Jetson by UNHCR
- Artemis (Famine Action Mechanism) by the World Bank
- Pilot model on funding needs for food insecurity by UN-OCHA
- Forecast-based Financing by the Red Cross
- Migration and Displacement Initiative by Save the Children
- Global Disaster Displacement Risk Model by Internal Displacement Monitoring Centre.
For more information about these projects, see the full report from the Hague workshop published by the Centre for Humanitarian Data.
Why the City of The Hague is involved:
The City of The Hague is a proud supporter of Artificial Intelligence for Good and hosts the Centre for Humanitarian Data. The city is currently funding the Centre’s work on predictive analytics.
The Data Science Initiative is also working with the UN Migration Agency (IOM) and other major UN organisations and universities to create an ethical framework for Artificial Intelligence projects in humanitarian response. More information on this project coming soon!
By Kate Dodgson, September 2019