Reallocate your resources safely, quickly, and with no money wasted
Yana Ageeva*, Senior Data Scientist and AI Evangelist, IBM Data Science and AI Elite Team
Anthony Casaletto*, Program Director, IBM Data Science and AI Elite Team

Photo by Edwin Hooper on Unsplash
Introducing IBM Industry Accelerators
The IBM Data Science and AI Elite (DSE) team was created over two years ago to work with clients across industries, helping them bring value to the business by harnessing data science and AI. After more than 160 engagements with clients worldwide, the DSE team distilled what it learned into templated packages for some of the top use cases. We call these Industry Accelerators.
The accelerators are great learning assets, but they are so much more. They are usable components that help kickstart an implementation project, enabling an organization to implement the solution on their own data and get to productive use in an accelerated timeframe.
The Emergency Response accelerator is designed for a wide range of audiences, from executive decision makers to data scientists to application developers. Glossaries and Terms provide the information architecture you need to effectively catalog and analyze your data. The data science project includes assets covering data wrangling, machine learning models, and decision optimization, enabling the data scientists on the team to collaborate and extend the template models based on the available data. The accelerator also covers how to operationalize the implemented data science models, including a sample application that demonstrates how an application developer can embed the deployed models within a business workflow.
What is Emergency Response?
At the time of writing, we are deep into the COVID-19 pandemic. Hospitals, supermarkets, and government organizations alike are overwhelmed by massive demand for services while trying to manage their limited resources and assets and struggling to keep people safe. How should resources be allocated now so that they are exactly where patients and doctors need them the most? How can ventilators be reallocated over time based on evolving demand and geographical location, subject to safety regulations and travel constraints? And how do we safely return to work once things start getting back to “normal”, whatever that might mean?
Because an event like the COVID-19 pandemic is unprecedented, automating decisions around it is complex: it requires careful analysis and accurate data that may not be readily available, not to mention political and administrative considerations.
This real, large-scale disruption is consuming the world at this moment, yet it is only one example of an emergency that requires a speedy and accurate response: managing limited resources while making sure operational costs do not skyrocket. As we will see, efficient emergency response requires a combination of AI technologies, in particular machine learning and decision optimization.
The methodology discussed in this article can be applied to many types of large-scale disruptions. We will focus specifically on weather emergencies, such as snowstorms, because their impact can be devastating. According to the National Centers for Environmental Information and NOAA Climate.gov, the U.S. has sustained 258 weather and climate disasters since 1980 in which damages reached or exceeded $1 billion, with a cumulative cost exceeding $1.75 trillion.
Example: Responding to a massive weather event

Photo by Erik Mclean on Unsplash
When a severe event is about to strike, be it a snowstorm, a flood, or an earthquake, an operations manager for a state or region must quickly decide how to best meet the demand for resources and assets such as snowplows, water, police, and rescue vehicles.
Let’s look specifically at responding to snowstorms. The demand for snowplows depends on the snowfall forecast, and some regions may be more heavily impacted than others, requiring more snowplows than are available when the storm strikes. A machine learning model can predict demand for trucks in each county, taking into account all relevant information, such as the snowfall forecast, safety guidelines, and past snowstorm history (Figure 1).
First things first — we need an accurate demand forecast
Figure 1: Machine Learning predicts demand for snowplows in each county in each time period
We then need to decide how to reallocate these assets from regions with low impact to those highly impacted, e.g. how many to move from counties C1 and C2 to C3, C4, and C5, making sure there are enough snowplows to cover the demand in each county.
However, let’s not forget that the snowfall forecast will change every few hours, as the snowstorm moves through the state, so we need to keep moving the trucks around, taking into account the weather forecast and the level of backlog at the time (accumulation of snow).
We need an efficient way to reallocate snowplows and other assets
Why is predicting demand for resources key, but not enough to solve the emergency response problem?
Predictions generated by our machine learning model may be sufficient to make a decision if we consider one county at a time and there are no conflicting priorities. However, once we start introducing some real-life complexity, dependencies, and limited assets, determining the optimal reallocation plan becomes more challenging.
For example, what if two counties on opposite sides of the state are expected to receive the most snowfall at the beginning of the storm? Where do we send the snowplows, and how many, if we don't have enough to cover all the demand?
A forecast doesn’t tell us how to allocate limited resources
What about a situation in which county C1 needs resources the most at the beginning of the storm, but a few hours later the resources are needed in C2 and C3 on the other side of the state? Do we move the snowplows to C1, even if it makes it impossible to get them to C2 and C3 before they get hit by the storm? How would this delay affect the backlog of snow and road safety?
The decisions we make are interdependent; therefore we must make them in conjunction with each other
To create a feasible plan, the decisions we make about how to allocate assets in the beginning of the storm need to be made in conjunction with the reallocation decisions later on during the storm. Additionally, we must not make decisions about one county in isolation from the decisions for other counties, as they all compete for the same limited resources.
Finally, how many decisions are there? As an example, consider 2,000 snowplows and 60 counties, with the option to reallocate every 4 hours during a 24-hour snowstorm. How many reallocation possibilities are there? In any one of the 6 time periods we can decide to move 0–2,000 snowplows from any one of 60 counties to any other. As a very rough estimate, without considering distance, availability, snow backlog, demand constraints, or multiple types of assets, the number of options we may need to consider is 60 × 60 × 2,000 × 6 = 43,200,000! No human can possibly determine manually which of these decisions is optimal.
A decision involving reallocation can easily require evaluation of 43,000,000+ options! No human can possibly determine manually which one of these options is best.
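To put the arithmetic in code, here is a quick back-of-the-envelope count using the rough numbers from the example above:

```python
# Rough count of reallocation options, using the estimate from the text:
# in each of 6 time periods, up to 2,000 snowplows can be moved between
# any ordered pair of 60 counties (distance, availability, backlog,
# demand, and asset types are deliberately ignored).
time_periods = 6
counties = 60
max_trucks = 2000

options = counties * counties * max_trucks * time_periods
print(f"{options:,} options")  # 43,200,000 options
```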
How can Decision Optimization help?

Decision Optimization uses insights from Machine Learning to make optimal decisions
Decision optimization uses very specialized algorithms and techniques to efficiently search through and evaluate millions of potential solutions, without enumerating each one of them.
While machine learning can take into account all available data and past history to predict the demand for snowplows and other resources in each county or region at any given time, decision optimization (DO) can take it a step further and generate a plan that is optimal for the entire state over the entire planning horizon, subject to limited resources (e.g. snowplows), other constraints and dependencies (available types and quantity of resources, current location of each snowplow, travel distance and time), and optimization metrics (minimizing total cost, maximizing safety/customer satisfaction, minimizing total truck travel distance). Not only does it offer us valuable insights, but it also generates an actionable schedule or plan (Figure 2).

Figure 2: Combining the Power of Machine Learning & Decision Optimization for Emergency Response
As is the case with any decision support solution, it is not enough to create a model. Our ultimate goal should be a solution delivered into the hands of business users and decision makers. In general, building such a solution requires several key pieces (Figure 3):
- a powerful mathematical optimization engine, such as CPLEX, that can run and solve decision optimization models, finding an optimal solution
- an efficient modeling environment to build AI models (machine learning and decision optimization)
- what-if scenario analysis and dashboards to test models, analyze scenarios, and prototype visualizations for business users
- a mechanism to deploy the models as web services to be embedded in your decision support application so that planners can run scenarios in real time

Figure 3: Decision Optimization application development and deployment using IBM Cloud Pak for Data
Applying the above to our example of implementing an emergency response planning solution, the process can be summarized in the following three steps:
Step 1: Build a machine learning model to predict demand for resources (snowplows) by location and time period.
Step 2: Build a decision optimization model to determine how to best reallocate resources, based on demand and availability, before and during the snowstorm. The model translates the business requirements into terms that can be consumed by a mathematical optimization engine.
Step 3: Deploy the models and embed them in your planning application.
Let’s take a look at these three steps in more detail. No matter what technology is being used, the methodology will be similar. Here, as an example, we outline how IBM Cloud Pak for Data can be used to build an Emergency Response solution.
Step 0: Gather and prepare available data
Gather all required data. Glossaries and Terms provide the information architecture that is needed to effectively catalog and analyze your data. IBM Cloud Pak for Data enables the data transformation step through a variety of tools including open-source scripts or visual drag-and-drop tooling.
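As an illustration, here is a minimal data-preparation sketch in Python with pandas. The file name and column names are hypothetical placeholders for your own data, not part of the accelerator:

```python
import pandas as pd

# Hypothetical input: one row per county and time period with the raw
# snowfall forecast and historical snowstorm records.
df = pd.read_csv("county_weather_history.csv", parse_dates=["timestamp"])

# Basic cleaning: remove duplicate records and fill gaps in the forecast.
df = df.drop_duplicates(subset=["county", "timestamp"])
df["forecast_snowfall_in"] = df["forecast_snowfall_in"].interpolate()

# Example feature engineering: rolling snowfall over the last 24 hours
# per county (six 4-hour periods).
df = df.sort_values(["county", "timestamp"])
df["snowfall_last_24h"] = (
    df.groupby("county")["forecast_snowfall_in"]
      .transform(lambda s: s.rolling(window=6, min_periods=1).sum())
)
```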
Step 1: Build a machine learning model to predict demand
This can be done in a variety of ways, depending on your preferences. IBM Cloud Pak for Data includes AutoAI, a tool that can automatically select appropriate algorithms, build machine learning pipelines, perform hyperparameter optimization (HPO) and feature selection, and identify the best model based on the specified evaluation metric. For this accelerator, AutoAI turned out to be a perfect fit, as it allowed us to significantly cut down on development time (Figure 4). Another option is to use SPSS Modeler's drag-and-drop interface to create your own machine learning pipeline. If you prefer to build your models from scratch, you can implement them in R or Python using open-source packages such as scikit-learn. Once the model has been trained, validated, and tested, it can be deployed as an online model accessible through REST APIs.
Figure 4: AutoAI experience: automatically generating the best model to predict demand for snowplows
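If you take the open-source route, a minimal scikit-learn sketch might look like the following. It continues from the hypothetical data frame in the Step 0 sketch, and the feature and target columns are assumptions; AutoAI or SPSS Modeler would replace this step entirely:

```python
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Hypothetical features per (county, time period); the target column
# "snowplow_demand" would come from historical snowstorm records.
features = ["forecast_snowfall_in", "snowfall_last_24h"]
X = df[features]
y = df["snowplow_demand"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Train a simple regressor and check its error on held-out data.
model = GradientBoostingRegressor(random_state=42)
model.fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```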
Step 2: Build a decision optimization model to reallocate resources
The key elements of a decision optimization model are decision variables, optimization metrics, and constraints.
The decisions that need to be made (modeled as decision variables) determine the type and quantity of assets (e.g. snowplows) to reallocate between every pair of counties during each time period. These values are not part of our input data and will instead be automatically determined by the optimization engine.
Optimization metrics define what we are optimizing for (minimizing or maximizing). In the case of emergency response, these metrics could be any combination of the following:
- minimize the cost of reallocating trucks (based on the total number of reallocations, as well as the variable cost based on miles traveled)
- minimize unplowed snow
- maximize safety
- maximize customer/commuter satisfaction
Finally, the optimization model needs to take into account a number of constraints, including the following (a minimal modeling sketch follows the list):
- the plowing capacity of the assets available in a location in a given time period, together with any existing snow backlog, must be balanced against the predicted demand for snowplows (the output of the ML model)
- any reallocation decisions must be feasible, i.e. we are not moving more trucks than are available at any given time and location
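To make these elements concrete, here is a minimal sketch of such a model using docplex, the Python modeling API for the CPLEX engine mentioned earlier. All data values (counties, demand, costs, penalty weight) are toy assumptions, the flow balance is simplified so that trucks moved in one period arrive at the start of the next, and running it requires the docplex package with a CPLEX engine:

```python
from docplex.mp.model import Model

# Toy dimensions; in practice, demand comes from the deployed ML model.
counties = ["C1", "C2", "C3"]
periods = range(2)  # two 4-hour periods, for illustration

# Hypothetical inputs: predicted demand per (county, period), starting
# truck allocation, and cost/penalty weights.
demand = {("C1", 0): 5, ("C2", 0): 2, ("C3", 0): 0,
          ("C1", 1): 1, ("C2", 1): 4, ("C3", 1): 6}
initial = {"C1": 2, "C2": 3, "C3": 5}
move_cost = 100           # assumed flat cost per truck moved
shortfall_penalty = 1000  # assumed penalty per unit of unmet demand (backlog)

m = Model(name="snowplow_reallocation")

# Decision variables: trucks moved from county i to county j in period t.
keys = [(i, j, t) for i in counties for j in counties for t in periods if i != j]
move = m.integer_var_dict(keys, lb=0, name="move")

# Shortfall variables allow demand to go unmet, at a penalty.
short = m.continuous_var_dict([(c, t) for c in counties for t in periods],
                              lb=0, name="short")

def available(c, t):
    # Trucks on hand in county c at the start of period t; trucks moved
    # in period s arrive at the start of period s+1 (simplification).
    inbound = sum(move[i, c, s] for i in counties if i != c for s in range(t))
    outbound = sum(move[c, j, s] for j in counties if j != c for s in range(t))
    return initial[c] + inbound - outbound

for c in counties:
    for t in periods:
        # Feasibility: cannot move out more trucks than are on hand.
        m.add_constraint(m.sum(move[c, j, t] for j in counties if j != c)
                         <= available(c, t))
        # Coverage: trucks on hand plus shortfall must meet predicted demand.
        m.add_constraint(available(c, t) + short[c, t] >= demand[c, t])

# Objective: trade off total reallocation cost against total unmet demand.
m.minimize(move_cost * m.sum(move.values())
           + shortfall_penalty * m.sum(short.values()))

sol = m.solve()
if sol:
    for (i, j, t) in keys:
        qty = sol.get_value(move[i, j, t])
        if qty > 0.5:
            print(f"Period {t}: move {int(round(qty))} trucks {i} -> {j}")
```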
Decision Optimization dashboards display inputs and outputs of our DO model. The key inputs are the demand prediction and initial allocation of snowplows (Figure 5):

Figure 5: Decision Optimization Dashboard scenario inputs: predicted demand and initial allocation of snowplows
Here we see the demand for snowplows by time and location (output of the ML model), and the current distribution of assets, i.e. quantity by location.
We then create three different what-if scenarios:
- Scenario S1 (Baseline Scenario): does not perform any reallocation (no optimization, compute the KPIs only)
- Scenario S2 (Reallocate Before Storm): reallocate assets once in preparation for the snowstorm (partial optimization)
- Scenario S3 (Reallocate During Storm): reallocate assets every few hours as the snowstorm progresses (full optimization)
After solving these three scenarios we compare the results side-by-side in the DO Dashboard (Figure 6):

Figure 6: Decision Optimization Dashboard: what-if analysis
As we can see clearly, the most optimized scenario, Scenario S3, results in the smallest backlog (1/4 of the baseline Scenario S1 without optimization, or about 1/2 of the partially optimized Scenario S2). This, of course, comes at a cost that needs to be evaluated. Scenario S3 is the costliest ($176,324) due to the highest number of reallocations, vs. $51,061 for Scenario S2 and $0 for Scenario S1 (no reallocations).
Comparing the three scenarios, the planner would want to weigh the benefit of a smaller snow backlog against the higher reallocation cost. If desired, the total cost could also be presented as a weighted sum of the two optimization metrics used here, i.e. a combination of total backlog and reallocation cost for each scenario. The planner may also want to run a couple of additional scenarios before making the final decision; for example, she could check whether borrowing a few snowplows from a nearby state would result in a significantly better solution.
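For instance, a weighted comparison of the three scenarios could be computed as follows. The reallocation costs are the ones reported above, while the backlog values and the weight are hypothetical numbers a planner would tune:

```python
# Reallocation costs from the dashboard comparison above; backlog values
# are hypothetical but consistent with the reported ratios (S3 is 1/4 of
# S1 and 1/2 of S2). The weight converts backlog into dollar terms and is
# purely an assumption.
scenarios = {
    "S1": {"realloc_cost": 0,      "backlog": 400},
    "S2": {"realloc_cost": 51061,  "backlog": 200},
    "S3": {"realloc_cost": 176324, "backlog": 100},
}
backlog_weight = 500  # assumed $ per unit of backlog

for name, kpi in scenarios.items():
    total = kpi["realloc_cost"] + backlog_weight * kpi["backlog"]
    print(f"{name}: weighted total = ${total:,}")
```

Depending on the weight chosen, a different scenario comes out ahead, which is exactly the trade-off the planner needs to evaluate.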
Step 3: Deploy the models and embed them in your planning application
Once the models are ready and tested, they can be deployed as web services and embedded into our planning application. When using IBM Cloud Pak for Data, deploying models in Watson Machine Learning is a matter of a couple of clicks. Each model then becomes available through a REST API endpoint. The final step is to build and deploy a web application (e.g. Node.js or R Shiny), or embed the services in an existing one, and generate our optimized reallocation plan by accessing the deployed ML and DO models.
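For illustration, here is how a deployed model could be invoked from Python. The host, deployment ID, token, field names, and values are placeholders, and the exact endpoint path and payload shape depend on your Watson Machine Learning version:

```python
import requests

# Placeholders: substitute your cluster URL, deployment ID, and bearer token.
URL = ("https://<cluster-host>/ml/v4/deployments/"
       "<deployment_id>/predictions?version=2020-09-01")
HEADERS = {"Authorization": "Bearer <token>", "Content-Type": "application/json"}

# Hypothetical scoring payload: one row of features per (county, time period).
payload = {
    "input_data": [{
        "fields": ["county", "forecast_snowfall_in", "snowfall_last_24h"],
        "values": [["Chautauqua", 14.0, 9.5]],
    }]
}

response = requests.post(URL, headers=HEADERS, json=payload, timeout=30)
response.raise_for_status()
print(response.json())  # predicted snowplow demand for the given county/period
```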
This demo was implemented using an R Shiny app (Figure 7):
Figure 7: Sample Emergency Response planning application infused with ML and DO models
When Alex, the snow operations manager, logs into her emergency planning portal, she needs all essential information at her fingertips to make decisions. She looks over New York state and quickly assesses the snowfall forecast, updated in real time and indicated by the color on the map. The darker the color, the more snowfall the corresponding county is expecting. The slider in the top right corner can be used to select a different time period to see how the situation changes over time.
Alex notices that Chautauqua is expecting the most snow at the beginning of the storm and selects the county on the map to view more detailed information, such as the exact snowfall in inches, the number of snowplows available vs. the predicted demand, and the expected backlog. The predicted demand is obtained in real time from our deployed ML model.
Alex then selects one of the incoming arrows and observes that 20 trucks are being reallocated from Livingston county. This makes perfect sense as the snowfall is minimal there for now.
Alex moves the time slider to the right to see the recommended truck reallocations, which correspond to the changing demand as the snowstorm moves across the state. The same truck reallocation recommendations are also available in the table below the map. They were obtained in real time using our deployed decision optimization model.
If Alex is happy with the recommended plan, she can go ahead and approve it. She might also choose to run some what-if scenarios, experimenting with potentially making a few additional snowplows available or specifying some of the reallocations manually. After solving these additional scenarios and comparing the resulting moves and the corresponding KPIs, Alex will make her final decision on how to proceed…
Of course, this is just a simple example of a demo implemented using IBM Cloud Pak for Data and R Shiny. The possibilities are endless as far as what you can do and how you can present information in your own app.
Summary
Responding to large-scale emergencies and disruptions is a critical challenge that requires immediate action. The benefits of an effective emergency response solution can be significant, including but not limited to:
- Reduced planning/scheduling time and effort
- Improved safety and regulatory compliance
- Lower cost of operations
- Improved resource utilization
- Improved customer satisfaction
Creating an optimal asset reallocation plan is a challenging problem that is best tackled using the combined power of machine learning and decision optimization. While machine learning can take into account all available data and past history to predict the demand for resources for each location at a given time, decision optimization can take it a step further and generate a plan that is optimal for a set of locations, subject to limited resources, other constraints and dependencies, and optimization metrics. Not only does optimization offer valuable insights, but it also generates an actionable schedule or plan.
Effectively responding to large-scale disruptions requires effective decision-making tools for business users. Even if we build great AI models, their power can only be leveraged by putting them in the hands of planners and decision makers. Applications infused with embedded AI models allow decision makers to experiment with various scenarios and compare the benefits side by side before implementing their final decisions, all without needing to understand the implementation details behind the AI models.
There are many Industry Accelerators available today, with more coming throughout the year. The best part: they are absolutely FREE and available on the IBM Data Science Community.
Industry Accelerators run on the IBM Cloud Pak for Data platform. To find out more about the capabilities of the platform and to start a free trial, visit: https://www.ibm.com/products/cloud-pak-for-data
Interested in learning how to kick-start your data science project with the right expertise, tools and resources? The DSE team can plan, co-create and prove the project with you based on our proven Agile AI methodology.
Request a free consultation: ibm.co/DSE-Consultation
Visit ibm.co/DSE-Community to connect with us, explore our resources and learn more about Data Science and AI Elite.
*Credits
This story wouldn’t have been possible without the hard work of several colleagues and team members who are part of the IBM Data Science Elite team and most of whom worked on the original client engagement that this accelerator is based on. They are, in alphabetical order:
@Aakanksha Joshi
@Aleksandr Petrov
@David Thomason
@Elie Paul
@Rakshith Dasenahalli Lingaraju
@Tim Bohn
@Victor Terpstra
Vinay Rao
@Wanting Wang