
HSTRisk: How to Build a Solid Business Case for Your Cloud Smart Migration

Any agency looking to update its cloud program in 2020 will want to pay attention to the finalized Cloud Smart policy. Cloud Smart builds on the 2010 Cloud First policy and brings much-needed clarity about what counts as cloud services and how to plan for cloud migration.

Many private-sector organizations are paying Big Four consulting firms to help them conduct “application rationalization” projects: a process of determining which applications they actually need (duplication is common), where those applications should live (on premises or in the cloud), and in what order they should migrate. To jump-start federal agencies’ work in this space, the Federal CIO Council has prepared the Application Rationalization Playbook, a document that lays out a detailed process workflow with additional guidance for agencies working toward compliance with the Cloud Smart policy.

The CIO Council built a repeatable framework that agencies can use to develop a custom plan for moving applications to the cloud. It focuses on developing the scope and governance around the process, building a solid application inventory, assessing business value and technical fit, evaluating total cost of ownership (TCO), scoring the applications on these values, and then determining whether each belongs on premises or in the cloud.
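
To make that last step concrete, here is a minimal sketch in Python of how the placement decision might be automated. The recommend_disposition rule, its thresholds, and the sample scores are hypothetical illustrations, not something the playbook prescribes.

```python
# A minimal, hypothetical sketch of the framework's final placement step.
# The recommend_disposition rule and its thresholds are illustrative only;
# the playbook does not prescribe specific cutoffs.

def recommend_disposition(business_value: float, technical_fit: float,
                          cloud_tco_savings: float) -> str:
    """Map 1-5 scores and an annual TCO delta (USD) to a disposition."""
    if business_value < 2 and technical_fit < 2:
        return "retire or consolidate"
    if technical_fit >= 4 and cloud_tco_savings > 0:
        return "migrate to cloud"
    if business_value >= 4:
        return "modernize, then reassess for cloud"
    return "retain on premises for now"

# A hypothetical application scoring 4.2 / 4.5 with $150k annual savings.
print(recommend_disposition(4.2, 4.5, 150_000))  # -> migrate to cloud
```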

Inventory is a critical part of this process: you have to know the full universe of applications an agency has. But once that inventory is compiled, the third step in the process, which the Council calls Assess Business Value and Technical Fit, carries a unique note about risk that has a large impact on the overall effort.

The Council’s guidance says that, at a minimum, the following factors need to be considered when assessing business value: effectiveness, mission criticality, utilization, complexity, and usability. Technical fit has its own list of five elements, one of which is security standards. Under this factor, agencies should consider whether the application is vulnerable to attack and where it fits in the agency’s risk tolerance model.
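
As a concrete illustration of the scoring step, the sketch below averages the playbook’s five business value factors into a single score. The 1-to-5 scale, the equal weighting, and the sample application are assumptions made for illustration; agencies would substitute their own weights and scales.

```python
# A sketch of the business value scoring step, assuming each factor is
# scored 1-5 and weighted equally; agencies would tune both assumptions.

BUSINESS_VALUE_FACTORS = ["effectiveness", "mission_criticality",
                          "utilization", "complexity", "usability"]

def business_value_score(app: dict) -> float:
    """Average the five playbook factors into one 1-5 score."""
    return sum(app[f] for f in BUSINESS_VALUE_FACTORS) / len(BUSINESS_VALUE_FACTORS)

# A hypothetical grants-management application.
grants_app = {"effectiveness": 4, "mission_criticality": 5,
              "utilization": 3, "complexity": 2, "usability": 4}
print(business_value_score(grants_app))  # -> 3.6
```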

At this point, the document includes a special section on weaving risk into these factors. The playbook does not expressly use risk as its own scoring metric; rather, it acknowledges that risk must be applied throughout. Specifically, it asks agencies to use probability impact analysis to identify applications with a high likelihood of being targeted by attacks, of affecting mission risk, and of suffering internal errors during change management events.
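
One minimal reading of probability impact analysis is expected annual loss: the likelihood of a damaging event multiplied by its impact. The sketch below ranks a few applications that way; the application names, probabilities, and dollar impacts are invented for illustration.

```python
# A minimal probability-impact sketch: rank applications by expected
# annual loss (probability of a damaging event x impact if it occurs).
# All names and figures below are hypothetical.

applications = {
    # name: (annual probability of a damaging event, impact in USD)
    "legacy_case_mgmt": (0.30, 2_000_000),
    "public_portal":    (0.10, 5_000_000),
    "internal_wiki":    (0.50,    50_000),
}

ranked = sorted(applications.items(),
                key=lambda kv: kv[1][0] * kv[1][1],  # probability x impact
                reverse=True)

for name, (prob, impact) in ranked:
    print(f"{name}: expected annual loss ${prob * impact:,.0f}")
# legacy_case_mgmt ranks first: 0.30 x $2M = $600k/yr beats the portal's
# $500k/yr even though the portal's single-event impact is larger.
```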

This kind of analysis is nuanced and will challenge many organizations. The term “probability impact analysis” alone will send many running for the hills, as statistical literacy remains a rare skill. Thinking through mission impact can be equally complicated, even though the playbook advises reusing existing business impact analyses to inform the risk strategy.

This is where a clear, step-by-step guide can help: one that explains how to make sense of these variables, how to employ them in an application rationalization plan for cloud migration, and how to align that plan with the agency’s mission.

Many agencies will look to NIST and FedRAMP for ways to implement these new requirements. NIST provides much of the security control baselines and risk management processes, and FedRAMP accelerates Cloud Service Provider (CSP) onboarding. NIST-based programs can also pull in other industry standards through the cross-references in the NIST Cybersecurity Framework (CSF) Informative References, enabling a best-of-breed mix of rock-solid security principles and advanced risk management practices to inform application portfolio rationalization and ready applications for cloud migration.

One advanced risk management practice in the NIST CSF Informative References program is the FAIR standard. The FAIR risk management methodology gives practitioners what they need to make sense of the combination of business and technology factors, determining what is truly mission risk and what can be deprioritized.

FAIR is the only purpose-built quantitative risk management standard mapped to the NIST CSF, helping agencies understand which control requirements are necessary to safeguard their mission. It allows organizations to evaluate the factors associated with mission failure, such as outages, cyber-attacks, and change control errors.

Further, FAIR expresses concepts such as probability impact analysis in plain language that agency leadership can better understand. This shift in vocabulary relies on accessible ideas like frequency, loaded cost, and activity-based costing, a method of accounting for impact based on the activities that follow an attack or failure.
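
As a sketch of how those terms fit together, the example below builds the loss magnitude of a single outage activity by activity (activity-based costing), then annualizes it with a frequency estimate. The activities, headcounts, hours, and loaded rates are assumed figures, not values from the FAIR standard.

```python
# A sketch combining the three concepts: loss magnitude built activity by
# activity (activity-based costing), then annualized by event frequency.
# Activities, headcounts, hours, and loaded rates are assumed figures.

def activity_cost(headcount: int, hours: float, loaded_hourly_rate: float) -> float:
    """Cost of one response activity: people x hours x loaded hourly cost."""
    return headcount * hours * loaded_hourly_rate

outage_magnitude = (
    activity_cost(6, 8, 95)      # incident response bridge
    + activity_cost(2, 16, 120)  # forensics and root-cause analysis
    + activity_cost(40, 4, 70)   # staff productivity lost during the outage
)

loss_event_frequency = 0.5  # assumed: one qualifying outage every two years
print(f"per event: ${outage_magnitude:,.0f}; "
      f"annualized: ${loss_event_frequency * outage_magnitude:,.0f}")
# per event: $19,600; annualized: $9,800
```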

For example, an agency looking to rationalize cloud migration for its critical applications would want to consider what impact an outage would have on mission success. FAIR provides the means to walk an agency through mission impact analysis using the narrative, storytelling approach articulated in activity-based costing, combined with FAIR’s six loss forms.
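
FAIR’s six forms of loss are productivity, response, replacement, fines and judgments, competitive advantage, and reputation. The sketch below uses them as a checklist for a hypothetical outage scenario; the dollar estimates attached to each form are illustrative assumptions.

```python
# FAIR's six forms of loss used as a checklist for an outage narrative.
# The loss forms come from the FAIR standard; the dollar estimates for
# this hypothetical scenario are illustrative assumptions.

FAIR_LOSS_FORMS = ["productivity", "response", "replacement",
                   "fines_and_judgments", "competitive_advantage", "reputation"]

outage_scenario = {
    "productivity": 250_000,      # staff and constituents unable to work
    "response": 40_000,           # triage, communications, recovery labor
    "replacement": 15_000,        # rebuilt infrastructure components
    "fines_and_judgments": 0,     # assumed no regulatory exposure here
    "competitive_advantage": 0,   # rarely applies to a public agency
    "reputation": 100_000,        # assumed cost of restoring public trust
}

assert set(outage_scenario) == set(FAIR_LOSS_FORMS)  # every form considered
print(f"total scenario loss: ${sum(outage_scenario.values()):,}")  # $405,000
```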

If an outage were to occur, it’s possible to walk through a narrative of the impact on the community, constituents, and taxpayers. What response costs would there be? Consider agency costs such as materials and personnel (number of people impacted x hourly loaded cost). Would there be an impact on the larger constituency? Some loss forms, such as replacement cost, productivity, and reputation, would need to be reinterpreted to capture these outside impacts. Economic costs to impacted businesses and nonprofits reverberate through a community and can be detrimental.
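
Those outside impacts can be approximated with the same people-times-hours-times-rate arithmetic, applied to the constituency rather than the agency. The figures in the sketch below (headcount, duration of disruption, and the loaded hourly value of constituents’ time) are all assumptions.

```python
# A hypothetical extension of the response-cost formula to the community:
# constituents impacted x hours of disruption x an assumed loaded hourly
# value of their time. All three inputs are illustrative assumptions.

constituents_impacted = 5_000
hours_of_disruption = 2.0
loaded_hourly_value = 28.0  # assumed blended hourly value for the community

community_productivity_loss = (constituents_impacted
                               * hours_of_disruption
                               * loaded_hourly_value)
print(f"${community_productivity_loss:,.0f}")  # -> $280,000
```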

Other, more damaging costs, such as loss of life, can be interpreted under replacement cost. Many agencies, such as the EPA, have established values for human life that can be included in such calculations. Although it may appear unseemly on its face, excluding these costs means you are not appropriately weighing the full impact of the risk scenario under examination.

Lastly, account for reputational cost by considering the economic impact of a regime change. If the risk scenario under evaluation could result in someone losing their elected or appointed office, what impact would that have on the momentum of the agency’s strategic plans?

The tools for building a rock-solid business case for cloud migration, as required by OMB’s Cloud Smart, are already available to agencies through the NIST CSF Informative References catalog. Further, a large community of risk practitioners is using these modern, quantitative methods, supported by the non-profit FAIR Institute, which has a specialized Federal Government Chapter. Complying with these new requirements does not need to be onerous; it simply requires applying the right methodology to the right problem. And no prioritization exercise is complete without a quantitative model to truly distinguish mission-risk necessities from luxuries.

Jack Freund
Dr. Jack Freund is a leading voice in cyber risk measurement and management. He is an expert at building relationships across the business to collaborate, persuade, and sell information risk and security programs. Jack currently serves as Director, Risk Science at RiskLens and previously worked for TIAA as Director, Cyber Risk. He holds a PhD in Information Systems, a master’s in telecom and project management, and a BS in CIS, along with the CISSP, CISA, CISM, CRISC, CIPP, and PMP designations. Jack has been named a Senior Member of the ISSA, IEEE, and ACM; a Visiting Professor; an Academic Advisory Board member; and an IAPP Fellow of Information Privacy. He is the 2018 recipient of ISACA’s John W. Lainhart IV Common Body of Knowledge Award and the FAIR Institute’s 2018 FAIR Champion Award, and was presented with Nova Southeastern University’s Distinguished Alumni Award. Jack’s book on quantifying risk, Measuring and Managing Information Risk: A FAIR Approach, was inducted into the Cybersecurity Canon in 2016. His writings have appeared in the ISSA Journal, and he currently writes a column for the ISACA newsletter.
