By themselves, the numbers associated with Hurricane Florence are mind-boggling:
- Forecasters estimated that the storm dumped “about 18 trillion gallons of rain over a week over North Carolina, South Carolina, Virginia, Georgia, Tennessee, Kentucky and Maryland.” That alone is enough to cover all of Manhattan in 3,800 feet of water, and more water than the Chesapeake Bay holds.
- In terms of insured losses, Florence is estimated to cost $2.5 billion – and remember that does not include the losses of people WITHOUT flood insurance.
- Over a million people lost power, and some will be without it for weeks.
- At least 2,600 swift-water rescues were made by first responders.
- Record numbers of poultry and swine – two of the largest agricultural products of the region – were lost, costing farmers millions of dollars.
- And that is all from JUST ONE HURRICANE.
Year after year, the costs of disasters – fire, flood, tornado or even volcano – keep going up. In 2017, the U.S. experienced more than $300 billion in weather-related disaster losses. 2018 is already challenging that record, and it’s only September.
That says nothing of the costs and complexity of responding to each of those incidents, or of the efforts to mitigate future risks from Mother Nature’s wrath. Ownership of those tasks and responsibilities resides with FEMA and, while the agency is inundated with a multitude of hazards, it also has phenomenal amounts of data to help it operate in this mission space.
But too much of any one thing, be it fire, water, wind, or even data, can become a bad thing. All of those elements can become destructive, all-consuming and overwhelming, but if honed, disciplined and channeled, each can become an incredible tool to improve lives and decision-making.
The recently released FEMA strategic plan provides the agency the necessary honing, discipline and channeling if it is fully implemented and executed. While previous versions of FEMA’s strategic plans have called for better use of data to inform decisions, this latest take is far more prescriptive in its assigned actions, defined owners, and goal expectations. Having leadership that believes and acts on that promise helps a lot, too. Almost from the beginning of his tenure as FEMA administrator, Brock Long has passionately described the need for data-driven recovery as well as data-driven investments in operations, mitigation and grant dollar awards.
Adding to the preparedness and recovery mix are the vigorous public education efforts by FEMA Deputy Administrator for Resilience Dan Kaniewski, encouraging people to increase their insurance coverage (especially flood insurance) so they can recover faster and more fully.
Both leaders and all of their FEMA counterparts are all about the mission of helping people in any way they can, but they are also all about getting more prepared for the next time they are called upon to render aid. That increased preparedness comes only from treating data as an asset and a priority on FEMA’s operational menu. It is also important to note that all of the data FEMA collects about the events it responds to, plans for, invests in, and forecasts is ultimately about one thing: RISK.
That powerful four-letter word can and does shape everything the agency does every day. That’s part of the reason the “why” and “how” of FEMA’s managing, collecting, accessing, sharing, and utilizing data in all its forms are so important.
If you speak to anyone at FEMA about any of the deployments they’ve been on, you will always hear spectacular stories about what they saw and experienced, who they worked with, the relationships that were built or frayed and, of course, the lessons learned. While those experiences are valuable in shaping every one of them as professionals and as teammates, without data to inform them, FEMA’s people would be reliant only on history, habits and instincts to do their jobs. Those are certainly great qualities to possess and refine, but they will never give you the full perspective that data and analytics can offer.
The increase of data has created an ever-growing challenge for FEMA: managing and navigating all of the data that comes its way. When your partners are literally everyone (e.g., federal, state, local and tribal governments, NGOs, private-sector members, etc.), it is important to recognize that they all run their business their own way. The same holds true for data collection – everyone has their own format, preference and means of collecting and deciphering it. Even with imposed standards (which would provide some much-needed discipline and structure to better enable and enhance data and analytic processes), FEMA is never going to get all of its partners to operate in the same manner. The only remedy for all of this diversity is an enterprise framework capable of translating all of those details into actionable intelligence for the network.
Having as much information as possible can and always will be of assistance to decision-makers at any stage in the response and recovery process. While these details will never make an event free from politics or outside interference, they provide facts and context to situations when emotions can be at their highest. Reducing the number of decisions made in any fog-of-war circumstance is paramount. Data and analytics hold that promise for FEMA, key stakeholders and affected communities.
Having all that information easily accessible and usable also makes after-action reports, as well as real-time performance assessments, much more accurate and accountable for emergency managers, Congress, OMB, government auditors, the media and taxpayers. That’s a resource that works not only for managing FEMA, but also for every one of its partners that needs to better understand what is happening in planning for, preparing for, responding to and mitigating disasters big and small – before, during and after they strike. There are lots of amazing people in the emergency management community who can take in many of those details and use them accordingly, but only the technological solutions that come with data and analytics can absorb it all.
The same holds true for watching social media during a disaster. FEMA and much of the emergency management community have enthusiastically embraced social media to better understand what is happening and being said during an emergency. The recent efforts by the U.S. Coast Guard, as well as the Cajun Navy, in monitoring social media feeds led to the successful rescue of hundreds of people during hurricanes Florence and Harvey. There is positive power and potential that comes from keeping an eye on the digital world.
As much as social media offers real-time intelligence and situational awareness on disaster needs, it can also be where myths, mistruths and disinformation are shared and widely distributed. That only makes response and recovery operations even more difficult for the people trying to make things right. FEMA and its partners have aggressively worked to combat these efforts, but with information changing all the time, the agency and its teammates need every possible technological resource to aid those efforts.
There are critics who are quick to condemn any type of government monitoring of social media as “Orwellian” and “Big Brother” in action. While it’s healthy and warranted to be suspicious of government listening in on what the public is saying, wholesale restrictions on its ability to hear, see and assess what the citizenry is saying in times of greatest need are shortsighted. As a populace, we can’t expect FEMA to respond to our needs if we prevent it from hearing, seeing and analyzing what we’re saying and doing. Effective privacy policies and civil-rights protections are fundamental to those efforts.
There are more rewards to using data and analytics at FEMA beyond establishing a wider and more focused common operating picture, creating more timely and accurate reports, and understanding what the public is saying on Twitter, Instagram and Facebook during emergencies. Real-time, real-world applications available today include applying data and analytics to:
- enable mitigation planning and investments;
- review grant performance and ROI;
- obtain a 360-degree view of disaster victims and affected communities to better understand specific needs during response and recovery efforts;
- inform and shape better policies, operations and logistics to be in place for that “next time”;
- model prospective costs and impacts in various disaster scenarios; and
- detect fraud and identity theft in disaster assistance programs.
There is nothing that FEMA does that cannot be enhanced by the application of data and analytic tools. Driving and implementing those solutions will take FEMA and its emergency management partners to that “next level” of public service and engagement. The only thing missing from FEMA’s Strategic Plan is a chief data officer to marshal the forces that can help the agency down that unfolding path, but at least it has leadership at the top of the organization empowering it to get there. And that’s a good start.
The views expressed here are the writer’s and are not necessarily endorsed by Homeland Security Today, which welcomes a broad range of viewpoints in support of securing our homeland. To submit a piece for consideration, email [email protected]. Our editorial guidelines can be found here.