Global AI and Data Science


Disaster Response and Management: IBM Journal of Research and Development

By Susan Malaika posted Thu March 12, 2020 01:30 PM

IJRD-64 includes 16 articles on Disaster Response and Management

Volume 64 of IBM Journal of Research and Development
Disaster Response and Management


Volume 64 of the IBM Journal of Research and Development (IJRD-64) was published in February 2020. The issue focuses on the effects of disasters and on measurements that help in understanding and mitigating them, as well as on more efficient use of scarce disaster relief funds. It contains 16 articles on disaster response and management. I highlight and enclose one of the articles, which I had the pleasure of writing with my excellent colleague Daniel Krook, entitled "Call for Code: Developers tackle natural disasters with software".

The article describes the first Call for Code hackathon in 2018, a long-running contest that took place from May to September 2018 and focused on building solutions for disaster preparedness and recovery. The Call for Code effort arose from an understanding of the global need for disaster response and the altruistic desire of many developers and data scientists to deploy their skills for humanitarian good. Another significant idea Call for Code builds on is that working under constraints, such as in a contest or hackathon, generates innovation. The effort also appeals to developers and data scientists because it fosters the use of new technologies and collaboration. You can download the PDF for the Call for Code article. Call for Code took place again in 2019, with over 180,000 participants from 165 nations. Call for Code 2020 was kicked off in February 2020 at the Office of the United Nations High Commissioner for Human Rights in Geneva. I was fortunate to spend time at the Palais Wilson on Lake Geneva working on Call for Code 2020 with outstanding colleagues like Shari Chiara, Liz Klipp, Daniel Krook, Binu Midhun, Henry Nash, Stephanie Parkin, Developer Steve, Sarah Storelli, and Josh Zheng, and with a fantastic team including Amanda Kron and David Zervaas from the United Nations, Markus Eisele from Red Hat, Antoine Marin from Nearform, and John Walicki from IBM - when a rainbow unexpectedly appeared! The rainbow was appropriate, as we are the Call for Code Water Sustainability team. You can join us in the Call for Code 2020 initiative here: https://callforcode.org/

Below, I share the abstracts for all the articles in the Disaster Response and Management issue. These articles will likely be of interest to Call for Code 2020 participants. You can get full copies of the articles and the journal from the IEEE website for a fee, unless you are an IEEE subscriber or member. IBMers have access to digital copies.
I will update this blog soon if I find other ways of getting digital copies of the articles or physical copies of the journal.


Call for Code: In Geneva at the Palais Wilson with a rainbow - February 2020



Titles and Abstracts of Articles in Volume 64 of the IBM Journal of Research and Development published February 2020:


  • Disaster management in the digital age - J. W. Talley
    Abstract: The United States is one of the most natural disaster-prone countries in the world. Since 1980, there have been 246 weather and climate disasters exceeding $1.6 trillion in remediation. Within the last decade, the frequency of disaster events and their costs are on the rise. Complicating the impact of natural disasters is the population shift to cities and coastal areas, which concentrate their effects. The need for governments and communities to prepare for, respond to, and recover from disasters is greater than ever before. Disaster management is a big data problem that requires a public private partnership solution. Technology is the connection that can link end-to-end capabilities across multiple organizations for disaster management in the digital age. But how can technologies like cloud, artificial intelligence (AI), and predictive analytics be leveraged across all aspects of the disaster management life cycle? This article briefly addresses these questions and more. Two case studies and technology spotlights are used to reinforce discussion around traditional and new approaches to the management of natural disasters.

  • A unique approach to corporate disaster philanthropy focused on delivering technology and expertise - R. E. Curzon ; P. Curotto ; M. Evason ; A. Failla ; P. Kusterer ; A. Ogawa ; J. Paraszczak ; S. Raghavan

    Abstract: The role of corporations and their corporate social responsibility (CSR)-related response to disasters in support of their communities has not been extensively documented; thus, this article attempts to explain the role that one corporation, IBM, has played in disaster response and how it has used IBM and open-source technologies to deal with a broad range of disasters. These technologies range from advanced seismic monitoring and flood management to predicting and improving refugee flows. The article outlines various principles that have guided IBM in shaping its disaster response and provides some insights into various sources of useful data and applications that can be used in these critical situations. It also details one example of an emerging technology that is being used in these efforts.

  • Quantitative modeling in disaster management: A literature review - A. E. Baxter ; H. E. Wilborn Lagerman ; P. Keskinocak

    Abstract: The number, magnitude, complexity, and impact of natural disasters have been steadily increasing in various parts of the world. When preparing for, responding to, and recovering from a disaster, multiple organizations make decisions and take actions considering the needs, available resources, and priorities of the affected communities, emergency supply chains, and infrastructures. Most of the prior research focuses on decision-making for independent systems (e.g., single critical infrastructure networks or distinct relief resources). An emerging research area extends the focus to interdependent systems (i.e., multiple dependent networks or resources). In this article, we survey the literature on modeling approaches for disaster management problems on independent systems, discuss some recent work on problems involving demand, resource, and/or network interdependencies, and offer future research directions to add to this growing research area.

  • Call for Code: Developers tackle natural disasters with software

    Abstract:  Natural disasters are increasing as highlighted in many reports including the Borgen Project. In 2018, David Clark Cause as creator and IBM as founding partner, in partnership with the United Nations Human Rights Office, the American Red Cross International Team, and The Linux Foundation, issued a “Call for Code” to developers to create robust projects that prepare communities for natural disasters and help them respond more quickly in their aftermath. This article covers the steps and tools used to engage with developers, the results from the first of five competitions to be run by the Call for Code Global Initiative over five years, and how the winners were selected. Insights from the mobilization of 100,000 developers toward this cause are described, as well as the lessons learned from running large-scale hackathons.

  • Next-generation geospatial-temporal information technologies for disaster management - C. M. Albrecht ; B. Elmegreen ; O. Gunawan ; H. F. Hamann ; L. J. Klein ; S. Lu ; F. Mariano ; C. Siebenschuh ; J. Schmude

    Abstract: Traditional geographic information systems (GIS) have been disrupted by the emergence of Big Data in the form of geo-coded raster, vector, and time-series Internet-of-Things data. This article discusses the application of new scalable technologies that go far beyond relational databases and file-based storage on spinning disk or tape to incorporate both storage and processing of data in the same platform. The roles of the Apache Hadoop Distributed File System and NoSQL key-value stores such as Apache HBase are discussed, along with indexing schemes that optimally support geospatial-temporal use. We highlight how this new approach can rapidly search multiple GIS data layers to obtain insights in the context of early warning, impact evaluation, response, and recovery to earthquake and wildfire disasters.
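As a rough illustration of how a key-value store can be made geospatial-temporal aware, the sketch below (plain Python, no HBase client) builds a composite row key from interleaved latitude/longitude bits plus a coarse time bucket, so that records close in space and time sort near each other. The bit widths, bucket size, and key layout are my own assumptions for illustration and are not the indexing scheme described in the article.

```python
# Illustrative sketch: build a composite row key that interleaves latitude/longitude
# bits (a Z-order, geohash-style curve) with a coarse time bucket, so that records
# close in space and time sort near each other in a key-value store such as HBase.
# Bit widths, bucket size, and key layout are assumptions for illustration only.
from datetime import datetime, timezone

def _quantize(value: float, lo: float, hi: float, bits: int) -> int:
    """Map a coordinate into an integer of `bits` bits."""
    scaled = (value - lo) / (hi - lo)
    return min(int(scaled * (1 << bits)), (1 << bits) - 1)

def _interleave(x: int, y: int, bits: int) -> int:
    """Interleave the bits of x and y (Z-order curve)."""
    z = 0
    for i in range(bits):
        z |= ((x >> i) & 1) << (2 * i)
        z |= ((y >> i) & 1) << (2 * i + 1)
    return z

def geo_temporal_key(lat: float, lon: float, ts: datetime,
                     bits: int = 20, bucket_seconds: int = 3600) -> bytes:
    """Return a sortable row key: <z-order spatial cell><time bucket>."""
    z = _interleave(_quantize(lon, -180.0, 180.0, bits),
                    _quantize(lat, -90.0, 90.0, bits), bits)
    bucket = int(ts.replace(tzinfo=timezone.utc).timestamp()) // bucket_seconds
    return z.to_bytes(8, "big") + bucket.to_bytes(8, "big")

# Example: keys for two nearby, near-simultaneous observations share a long common
# prefix, which keeps them in the same region of the key space for fast range scans.
k1 = geo_temporal_key(37.77, -122.42, datetime(2019, 10, 27, 14, 0))
k2 = geo_temporal_key(37.78, -122.41, datetime(2019, 10, 27, 14, 30))
print(k1.hex(), k2.hex())
```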

  • Predicting impacts of weather-driven urban disasters in the current and future climate - L. Treinish ; A. Praino ; M. Tewari ; B. Hertell

    Abstract: Effective city operations depend on local weather conditions at the scale of critical urban infrastructure such as power and water distribution systems. This includes both routine and severe weather events. For example, with precipitation events, local topography and weather influence water runoff and infiltration, which directly affect flooding as well as drinking water quality and availability. The impact of such events creates issues of public safety. Thus, the availability of highly localized weather model predictions focused on public safety and operations of infrastructure can mitigate the impact of severe weather. This is especially true if the lead time for the availability of such predictions enables proactive allocation and deployment of resources to minimize recovery time from severe events. Typically, information at such a scale is simply not available. Hence, the ability of municipalities to proactively respond to these events is limited. Available continental- or regional-scale weather models are not appropriately matched to the temporal or spatial scale of such operations. While near-real-time assessment of observations of current weather conditions may have the appropriate geographic locality, by its very nature it is only directly suitable for reactive response. To address this gap, we use state-of-the-art physical weather models at the spatial scale of the city's infrastructure to avoid this mismatch in predictability. Model results are coupled to data-driven stochastic models to represent the actionable prediction of weather (business) impacts. In some cases, an intermediate physical model may be required to translate predicted weather into the phenomena that lead to such impacts. We have applied these ideas to several cities with a diversity of impacts and weather concerns and show how this coupled model methodology enables prediction of storm impacts on local infrastructure. We also discuss how this concept can be extended to a climate scale in order to evaluate the potential localized impacts of a warming planet and the effectiveness of strategies being used to mitigate such impacts.
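As a hedged illustration of the coupling idea (a physical forecast feeding a data-driven impact model), the sketch below shows only the second, stochastic stage: a toy Poisson regression of outage counts on forecast precipitation and wind. The variables, data, and coefficients are synthetic and are not the models used in the article.

```python
# Illustrative second stage of a coupled forecast -> impact pipeline:
# fit a Poisson regression of historical outage counts on weather drivers,
# then score a new high-resolution forecast grid. All data here are synthetic.
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(0)

# Synthetic "history": precipitation (mm/h), wind gust (m/s), observed outages per cell.
precip = rng.gamma(2.0, 3.0, size=500)
wind = rng.gamma(3.0, 4.0, size=500)
outages = rng.poisson(np.exp(0.03 * precip + 0.05 * wind - 1.0))

X = np.column_stack([precip, wind])
model = PoissonRegressor(alpha=1e-3).fit(X, outages)

# Score a new forecast for a handful of grid cells covering critical infrastructure.
forecast = np.array([[12.0, 20.0],   # heavy rain, strong gusts
                     [2.0, 5.0]])    # routine conditions
expected_outages = model.predict(forecast)
print(expected_outages)  # expected outage counts per cell, used for proactive crew allocation
```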

  • A machine learning approach to scenario analysis and forecasting of mixed migration - R. Nair ; B. S. Madsen ; H. Lassen ; S. Baduk ; S. Nagarajan ; L. H. Mogensen ; R. Novack ; R. Curzon ; J. Paraszczak ; S. Urbak

    Abstract:  The development of MM4SIGHT, a machine learning system that enables annual forecasts of mixed-migration flows, is presented. Mixed migration refers to cross-border movements of people that are motivated by a multiplicity of factors to move including refugees fleeing persecution and conflict, victims of trafficking, and people seeking better lives and opportunity. Such populations have a range of legal status, some of which are not reflected in official government statistics. The system combines institutional estimates of migration along with in-person monitoring surveys to establish a migration volume baseline. The surveys reveal clusters of migratory drivers of populations on the move. Given macrolevel indicators that reflect migratory drivers found in the surveys, we develop an ensemble model to determine the volume of migration between source and host country along with uncertainty bounds. Using more than 80 macroindicators, we present results from a case study of migratory flows from Ethiopia to six countries. Our evaluations show error rates for annual forecasts to be within a few thousand persons per year for most destinations.
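To make the ensemble-with-uncertainty idea concrete, here is a minimal sketch (not the MM4SIGHT implementation) that fits quantile gradient-boosting models to synthetic macro indicators and reports a central forecast with lower and upper bounds.

```python
# Illustrative ensemble forecast with uncertainty bounds: three gradient-boosting
# models fit to the 10th, 50th, and 90th percentiles of synthetic migration volumes
# given macro-level indicators. Data and indicator names are invented.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)
n = 300
X = np.column_stack([
    rng.normal(0, 1, n),   # e.g., conflict-intensity index (synthetic)
    rng.normal(0, 1, n),   # e.g., rainfall anomaly (synthetic)
    rng.normal(0, 1, n),   # e.g., unemployment rate (synthetic)
])
y = 20000 + 5000 * X[:, 0] - 3000 * X[:, 1] + rng.normal(0, 2000, n)

models = {
    q: GradientBoostingRegressor(loss="quantile", alpha=q, n_estimators=200).fit(X, y)
    for q in (0.1, 0.5, 0.9)
}

x_new = np.array([[1.2, -0.5, 0.3]])  # next year's indicator values (synthetic)
low, mid, high = (models[q].predict(x_new)[0] for q in (0.1, 0.5, 0.9))
print(f"forecast: {mid:.0f} persons (80% interval {low:.0f}-{high:.0f})")
```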

  • Emergencies do not stop at night: Advanced analysis of displacement based on satellite-derived nighttime light observations - M. Enenkel ; R. M. Shrestha ; E. Stokes ; M. Román ; Z. Wang ; M. T. M. Espinosa ; I. Hajzmanova ; J. Ginnetti ; P. Vinck

    Abstract:  Around 68.5 million people are currently forcibly displaced. The implementation and monitoring of international agreements, which are linked to the 2030 agenda (e.g., the Sendai Framework), require a standard set of metrics for internal displacement. Since nationally owned, validated, and credible data are difficult to obtain, new approaches are needed. This article aims to support the monitoring of displacement via satellite-derived observations of nighttime lights (NTL) from NASA's Black Marble product suite along with a short message service (SMS)-based emergency survey after Cyclone Idai had made landfall in Beira, Mozambique, in March 2019. Under certain conditions, the spatial extent of power outages can serve as a proxy for disaster impacts and a potential driver for displacement. Hence, information about anomalies in NTL has the potential to support humanitarian decision-making via estimations of people affected or the coordination of rapid response teams. Despite initial issues related to cloud cover, we find that around 90% of Beira's power grid had been affected. In collaboration with the Internal Displacement Monitoring Center, we use these findings to establish a framework that links NTL observations with existing humanitarian decision-making workflows to complement ground-based survey data and other satellite-derived information, such as flood or damage maps.
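The core computation behind using NTL as an outage proxy is an anomaly map: compare post-event radiance with a pre-event baseline, mask out cloudy pixels, and count how much of the normally lit grid has gone dark. The sketch below runs that calculation on synthetic rasters; the arrays and thresholds are illustrative assumptions, not the Black Marble processing chain.

```python
# Illustrative nighttime-light anomaly calculation on synthetic rasters:
# compare post-event radiance to a pre-event baseline and estimate the share
# of the normally lit grid affected by an outage. Thresholds are assumptions.
import numpy as np

rng = np.random.default_rng(7)
baseline = rng.gamma(5.0, 2.0, size=(100, 100))             # mean pre-event radiance
post_event = baseline * rng.uniform(0.0, 0.3, (100, 100))   # synthetic widespread outage
cloud_mask = rng.random((100, 100)) < 0.1                   # pixels obscured by cloud

lit = baseline > 1.0                                  # only consider normally lit pixels
valid = lit & ~cloud_mask                             # ignore cloud-contaminated pixels
relative_drop = 1.0 - post_event[valid] / baseline[valid]

affected_fraction = np.mean(relative_drop > 0.8)      # >80% radiance loss counts as outage
print(f"estimated share of the lit grid affected: {affected_fraction:.0%}")
```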

  • Improving humanitarian needs assessments through natural language processing - T. Kreutzer ; P. Vinck ; P. N. Pham ; A. An ; L. Appel ; E. DeLuca ; G. Tang ; M. Alzghool ; K. Hachhethu ; B. Morris ; S. L. Walton-Ellery ; J. Crowley ; J. Orbinski

    Abstract: An effective response to humanitarian crises relies on detailed information about the needs of the affected population. Current assessment approaches often require interviewers to convert complex, open-ended responses into simplified quantitative data. More nuanced insights require the use of qualitative methods, but proper transcription and manual coding are hard to conduct rapidly and at scale during a crisis. Natural language processing (NLP), a type of artificial intelligence, may provide important new opportunities to capture qualitative data from voice responses and analyze it for relevant content to better inform more effective and rapid humanitarian assistance operational decisions. This article provides an overview of how NLP can be used to transcribe, translate, and analyze large sets of qualitative responses with a view to improving the quality and effectiveness of humanitarian assistance. We describe the practical and ethical challenges of building on the diffusion of digital data collection platforms and introducing this new technology to the humanitarian context. Finally, we provide an overview of the principles that should be used to anticipate and mitigate risks.
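As a very small, hedged illustration of the analysis step only (leaving transcription and translation aside), the sketch below codes free-text survey responses into humanitarian need categories with simple keyword matching; the categories, keywords, and responses are invented for illustration and are far simpler than the NLP described in the article.

```python
# Illustrative sketch of the analysis step only: code free-text survey responses
# into humanitarian need categories with simple keyword matching. A real pipeline
# would add speech-to-text and machine translation upstream; those steps are
# represented here only by already-transcribed English strings.
from collections import Counter

NEED_KEYWORDS = {               # category -> indicative terms (assumed for illustration)
    "water": ["water", "thirst", "well"],
    "food": ["food", "hunger", "rice"],
    "shelter": ["shelter", "roof", "tent"],
    "health": ["clinic", "medicine", "sick"],
}

def code_response(text):
    """Return the set of need categories mentioned in a response."""
    lowered = text.lower()
    return {cat for cat, terms in NEED_KEYWORDS.items()
            if any(term in lowered for term in terms)}

responses = [
    "The well is contaminated and the children are getting sick.",
    "We lost the roof of our house and have no tent.",
    "There is no rice left in the market.",
]

tally = Counter(cat for r in responses for cat in code_response(r))
print(tally.most_common())  # rough prioritization signal for assessment teams
```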

  • Understanding requirements and issues in disaster area using geotemporal visualization of Twitter analysis - A. Murakami ; T. Nasukawa ; K. Watanabe ; M. Hatayama
    Abstract: During disasters, requirements and situations on the ground change very rapidly. Moreover, they depend on timing and location; thus, it is very hard to understand them in a timely manner. Social media may contain such information, together with the posting time and location. However, it is difficult to extract situational requirements from the large number of conflicting sources. In this article, we propose a system that enables us to find such useful information in social media and visualize it so that the data can be understood easily. The system is divided into two steps. The first step is to extract requirements and issues from textual data, such as “We cannot buy gas here” or “We are short of batteries,” using natural language processing (NLP) technologies. The system also uses NLP to extract geolocation information, such as city names and location landmarks. The second step is to visualize the results in a timely and geolocated manner. We show the system results using real Twitter data from the Kumamoto Earthquake in 2016. By visualizing the information, the personnel in the disaster area, such as local governments and/or volunteer organizations, can use this information very effectively. For instance, they can decide how to distribute food and water in the disaster area and also how to implement and respond to their logistics.
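The two steps the abstract describes can be illustrated with a toy sketch: extract a need phrase and a place name from post text with simple patterns and a small gazetteer, then bucket the results by hour and location for mapping. The patterns, gazetteer, and sample posts below are invented and much simpler than the NLP used in the article.

```python
# Illustrative sketch of the two steps described above: (1) extract a need and a
# place name from post text with simple patterns and a small gazetteer, and
# (2) bucket the results by hour and location for mapping. The patterns,
# gazetteer, and sample posts are invented for illustration.
import re
from collections import defaultdict
from datetime import datetime

GAZETTEER = {"Mashiki", "Kumamoto", "Nishihara"}              # assumed place names
NEED_PATTERN = re.compile(r"(?:short of|cannot buy|need)\s+(\w+)", re.I)

def extract(post):
    """Return (place, need) found in a post, either of which may be None."""
    place = next((p for p in GAZETTEER if p.lower() in post.lower()), None)
    match = NEED_PATTERN.search(post)
    need = match.group(1) if match else None
    return place, need

posts = [
    ("2016-04-16 08:10", "We cannot buy gas here in Mashiki"),
    ("2016-04-16 08:45", "We are short of batteries near Kumamoto station"),
]

buckets = defaultdict(list)                                   # (hour, place) -> needs
for ts, text in posts:
    place, need = extract(text)
    if place and need:
        hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").strftime("%Y-%m-%d %H:00")
        buckets[(hour, place)].append(need)

for key, needs in buckets.items():
    print(key, needs)   # feed to a map/timeline visualization
```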

  • Culture and cognition: Understanding public perceptions of risk and (in)action -  T. Allen ; E. Wells ; K. Klima
    Abstract:  Much is known about the effects of risk on behavior and communication, yet little research has considered how these risks influence modes of cultural and cognitive processing dynamics that underlie public perceptions, communications, and social (in)action. This article presents a psychological model of risk communications that demonstrates how cognitive structure, cultural schema, and environment awareness could be combined to improve risk communication. We illustrate the model's explanatory value on two qualitative case studies: one on decision-makers facing extreme heat, and another on homeowners facing flood events. Consistent with the model predictions, we find that cognitive structure, cultural schema, and environment awareness dynamics are not only necessary determinants to strengthen risk communications, but also important for understanding perceptions of risk and people's (in)action to engage in mitigation and adoption efforts. This suggests that decision-makers hoping to reduce disaster risk or improve disaster resilience may wish to consider how these three dynamics exist and interact.

  • Quantifying supply chain network synergy for humanitarian organizations - A. Nagurney ; Q. Qiang

    Abstract: Both the number of disasters and the number of people affected by disasters are growing, creating a great need for resilient disaster management. In this article, we construct multiproduct supply chain network models for multiple humanitarian organizations. The models capture uncertainty associated with costs of their supply chain activities, including procurement, storage, and distribution, under multiple disaster scenarios, along with uncertainty associated with the demand for the disaster relief products at the demand points. The models reflect the organizations’ operations, without and with cooperation, with the humanitarian organizations seeking to determine the disaster relief multiproduct flows that minimize their expected total cost and risk, subject to expected demand satisfaction. We utilize a mean-variance approach to capture the risk associated with cost uncertainty and propose a synergy measure for the assessment of the potential strategic advantages of cooperation for resilient disaster management. We also identify the role of technology in helping to parameterize the models and illustrate the analytical framework with numerical examples, accompanied by managerial insights.
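One common way to express such a synergy measure is the relative expected-cost saving of cooperating versus operating independently; the tiny sketch below computes that quantity for toy cost numbers. The formula and values are my own illustrative assumptions and may differ from the article's exact definition.

```python
# Illustrative synergy calculation: relative expected-cost saving when humanitarian
# organizations cooperate versus operate independently. The cost numbers are toy
# values, and the exact definition in the article may differ.

def synergy(individual_costs, cooperative_cost):
    """Fraction of total stand-alone expected cost saved through cooperation."""
    total_individual = sum(individual_costs)
    return (total_individual - cooperative_cost) / total_individual

# Expected total costs (e.g., in millions of dollars) under a given disaster scenario.
standalone = [4.2, 3.1, 2.7]      # three organizations operating independently
together = 8.4                    # the same organizations cooperating

print(f"synergy: {synergy(standalone, together):.1%}")   # positive => cooperation pays off
```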

  • Online optimization of first-responder routes in disaster response logistics

    Abstract: After a disaster, first responders should reach critical locations in the disaster-affected region in the shortest time. However, road network edges can be damaged or blocked by debris. Since response time is crucial, relief operations may start before knowing which edges are blocked. A blocked edge is revealed online when it is visited at one of its end-nodes. Multiple first-responder teams, who can communicate the blockage information, gather initially at an origin node and are assigned to target destinations (nodes) in the disaster-affected area. We consider multiple teams assigned to one destination. The objective is to find an online travel plan such that at least one of the teams finds a route from the origin to the destination in minimum time. This problem is known as the online multi-agent Canadian traveler problem. We develop an effective online heuristic policy and test it on real city road networks as well as randomly generated networks leading to instances with multiple blockages. We compare the performance of the online strategy with the offline optimum and obtain an average competitive ratio of 1.164 over 70,100 instances with varying parameter values.
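To give a feel for the online setting, here is a minimal single-team replanning heuristic on a toy graph: follow the current shortest path and, when a blocked edge is revealed at one of its end-nodes, remove it and replan. This is only an illustration of the problem, not the multi-team policy developed in the article.

```python
# Illustrative single-agent replanning heuristic for the online Canadian traveler
# problem: follow the current shortest path and, whenever a blocked edge is revealed
# at one of its end-nodes, delete it and replan. Toy illustration only.
import networkx as nx

def online_route(graph, blocked, origin, destination):
    g = graph.copy()
    route, current = [origin], origin
    while current != destination:
        path = nx.shortest_path(g, current, destination, weight="weight")
        nxt = path[1]
        if (current, nxt) in blocked or (nxt, current) in blocked:
            g.remove_edge(current, nxt)      # blockage revealed while standing at an end-node
            continue
        current = nxt
        route.append(current)
    return route

G = nx.Graph()
G.add_weighted_edges_from([("O", "A", 1), ("A", "D", 1), ("O", "B", 2), ("B", "D", 2)])
blocked_edges = {("A", "D")}                 # unknown to the team until it reaches A or D

print(online_route(G, blocked_edges, "O", "D"))   # -> ['O', 'A', 'O', 'B', 'D']
```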

  • Deep analytics for workplace risk and disaster management - S. Dalal ; D. Bassu

    Abstract: We discuss dynamic real-time analysis from multimodal data fusion for contextual risk identification to generate “risk maps” for the workplace, resulting in timely identification of hazards and associated risk mitigation. It includes new machine/deep learning, analytics, methods, and its applications that deal with the unconventional data collected from pictures, videos, documents, mobile apps, sensors/Internet of Things, Occupational Safety and Health Administration (OSHA) rules, and Building Information Model (BIM) Models. Specifically, we describe a number of advances and challenges in this field with applications of computer vision, natural language processing, and sensor data analysis. Applications include automated cause identification, damage prevention, and disaster recovery using current and historical claims data and other public data. The methods developed can be applied to any given situation with different groups of people, including first responders. Finally, we discuss some of the important nontechnical challenges related to business practicality, privacy, and industry regulations.

  • Elderly care through unusual behavior detection: A disaster management approach using IoT and intelligence
    Abstract: This article attempts to provide a minimal disaster management framework for the elderly who are living alone. Elderly people are generally vulnerable to hazards and emergency situations. The proposed framework aims at developing an Internet of Things (IoT)-based intelligent, protective ecosystem for the elderly that calls for help in emergencies such as floods, earthquakes, home fires, volcanic eruptions, and storms. This disaster system makes use of a range of calamity sensors in conjunction with the in-house activities of the elderly subject. In the case of a mishap, the disaster relief authorities, community members, and other stakeholders will be instantly informed. All these sensors are powered through an uninterrupted electricity supply system, which will continue to work even in the case of power outages. The work carried out in this article is deeply inspired by the need to have holistic platforms that ensure a low-cost, robust, and responsive disaster alert system for the elderly (DASE) is in place. Our objective is to overcome many of the shortcomings in existing systems and offer a reactive disaster recovery technique. Additionally, this article also considers numerous important factors, for instance, the elderly individual's physical and cognitive limitations, ergonomic requirements, and spending capacity.
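As a hedged sketch of the kind of alerting logic such a framework implies, the snippet below combines hazard-sensor readings with an inactivity check and notifies registered contacts when a threshold is crossed. The sensor names, thresholds, and notify() hook are invented for illustration and are not the DASE design.

```python
# Illustrative alerting loop for an IoT-based elderly disaster-alert system:
# combine hazard-sensor readings with an inactivity check and notify registered
# contacts when a rule fires. Sensor names, thresholds, and the notify() hook
# are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class Reading:
    smoke_ppm: float           # smoke concentration
    water_level_cm: float      # in-home flood sensor
    seismic_intensity: float   # local shaking intensity
    minutes_since_motion: int  # time since last in-house activity

THRESHOLDS = {"smoke_ppm": 300.0, "water_level_cm": 10.0,
              "seismic_intensity": 4.0, "minutes_since_motion": 720}

def notify(contacts, message):
    # Placeholder: a real system would send SMS/push alerts over a UPS-backed gateway.
    for c in contacts:
        print(f"ALERT to {c}: {message}")

def check(reading, contacts):
    for field, limit in THRESHOLDS.items():
        value = getattr(reading, field)
        if value >= limit:
            notify(contacts, f"{field} at {value} exceeds {limit}")

check(Reading(smoke_ppm=420.0, water_level_cm=2.0,
              seismic_intensity=1.0, minutes_since_motion=30),
      contacts=["caregiver", "community-responder"])
```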

  • Creating a water risk index to improve community resilience - K. Klima ; L. El Gammal ; W. Kong ; D. Prosdocimi

    Abstract: Flood risk reduction is an existent discourse and agenda in policy and insurance. Existing approaches such as linking hydrological models to economic loss models may be highly inequitable between areas of different socio-economic vulnerability. To our knowledge, no one has tried to adapt the more advanced known heat risk theory by first informing flood risk with the socio-economic vulnerability, and then investigating the sensitivity of risk reduction policies to that flood risk. In this article, we demonstrate two methods to combine water hazard data with a derived water vulnerability index to characterize water risk. We then compare the costs of two potential government policies: buyout of the home versus funding for foundation elevation. We use the case study area of Pittsburgh, PA, which faces severe precipitation and riverine flooding hazards. We find that while small differences in characterizing flood risk can result in large differences between flood risk maps, the cost of the flood risk reduction policy is not sensitive to the method of representing the socio-economic vulnerability. This suggests that while validation of flood risk incorporating socio-economic data is needed, for some policies, policymakers can prioritize environmental justice with little to no additional cost.
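A minimal way to express the "hazard combined with vulnerability" idea is a normalized product or weighted sum per area unit; the sketch below computes both variants on toy data. The values, weights, and normalization are illustrative assumptions, not the derivation used in the article.

```python
# Illustrative water risk index: combine a flood-hazard score with a socio-economic
# vulnerability index per tract, using either a product or a weighted sum after
# min-max normalization. Values and weights are toy assumptions.
import numpy as np

hazard = np.array([0.2, 0.9, 0.5, 0.7])         # e.g., modeled flood depth/frequency score
vulnerability = np.array([0.8, 0.3, 0.6, 0.9])  # e.g., derived socio-economic index

def minmax(x):
    """Rescale an array to the [0, 1] range."""
    return (x - x.min()) / (x.max() - x.min())

h, v = minmax(hazard), minmax(vulnerability)

risk_product = h * v                              # risk as hazard x vulnerability
risk_weighted = 0.6 * h + 0.4 * v                 # alternative: weighted sum

for tract, (rp, rw) in enumerate(zip(risk_product, risk_weighted)):
    print(f"tract {tract}: product={rp:.2f} weighted={rw:.2f}")
```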



#GlobalAIandDataScience
#GlobalDataScience
#Highlights-home