To say that the recent Office of Personnel Management (OPM) data breach has had wide-reaching effects is an understatement. With 21.5 million records exposed, the incident left our national security and millions of people vulnerable. The massive breach underscored the necessity for radical change in the approach to dealing with the rapidly evolving security threat landscape.
Unfortunately, this radical change — in an environment with the size and complexity of the Federal Government — cannot be accomplished with the immediacy required to strengthen our cyber defense. What can be realized is a more vigilant approach focused on practical steps that can be taken now to improve the security posture of our networks.
While reactive measures are a necessity in a standard security incident response, a disciplined and proactive approach to security compliance and compliance management is equally necessary and vastly more impactful to the information security lifecycle. Unless security compliance is viewed as a marathon and not a sprint, sensitive data will continue to be vulnerable.
Data security’s Achilles heel
Many organizations have adopted the notion that compliance ensures sensitive data is safe. The definition of compliance suggests that laying out a set of goals, instructions, procedures, etc., and adhering to them ensures security. This “check-the-box” compliance mentality is quickly becoming data security’s Achilles heel. As threats to sensitive data continue to evolve, systems and system security are constant moving targets, and the compliance processes used to protect them should be flexible enough to adapt quickly and effectively. Too often in today’s model, by the time compliance processes are tested, approved and released, they are already obsolete.
While compliance is an important function and should be supported, it’s seldom the best gauge by which to judge the security posture of an organization.
Making the achievement of compliance percentages at the lowest possible cost the primary goal undermines the more important goal of demonstrable improvement in security posture. The definition of compliance needs to be broader and less focused on specifics and checked boxes.
The federal government has begun to realize this in recent years and has started to shift away from strict compliance toward more of a “continuous improvement” model. This shift signals the growing acceptance of a well-known security axiom: “good enough” security now is better than “perfect” security never. The recognition that perfect security is an impossibility should not be viewed as the acceptance of defeat, but rather as the motivator for the dogged perseverance necessary to get as close as possible to perfection, and then improve on the process to get even closer the next time.
Two sides of the security coin
In order to meet the objective of overall security posture improvement, it is essential that offensive and defensive security processes within an organization work collaboratively as cogs in the same wheel. However, that isn’t always the case. Often, that continuous process is hindered by a lack of effective coordination and collaboration between the security team (responsible for overseeing compliance) and the patch and release management (P&RM) team (responsible for getting the systems into compliance). While the security team may be wondering, “Why can’t P&RM just patch the systems?” the P&RM team may be asking, “How do we get this done ASAP?”
The more important question is, “How do we bring both sides of the security coin together?”
The offensive side of the coin
Security tools are written and designed by security companies for security professionals. These tools are primarily focused on the “offensive” side of the coin: they are designed to find vulnerabilities within systems. They scan, search for and exploit known vulnerabilities using definitions derived from a known vulnerability database. Many of the more advanced tools are even capable of some rudimentary artificial intelligence, allowing them to find zero-day vulnerabilities based on known attack vectors, i.e., methods used by previously known exploits.
The reputation of a vulnerability assessment tool rests squarely on a single factor — accuracy. Insufficient accuracy produces false positives, or worse, false negatives. False positives are reported vulnerabilities that do not exist, while false negatives are unreported or missed vulnerabilities that do exist. It is clear that the consequence of a false negative is potentially far more severe than that of a false positive.
The widely accepted solution employed by security tool vendors to address this risk is to err on the side of caution in their attempt to prevent false negatives. This produces a higher rate of false positives, which have to be investigated and disproved as part of an organization’s vulnerability remediation procedures. In smaller networks, this may be relatively inconsequential, but in large enterprise environments consisting of tens of thousands of scanned endpoints, even a one percent false positive rate can consume significant resources to investigate and dispel.
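A quick back-of-the-envelope calculation shows why even a one percent false positive rate matters at enterprise scale. All of the figures below (endpoint count, findings per endpoint, triage time) are illustrative assumptions, not measured data:

```python
# Back-of-the-envelope triage cost for false positives in a large scan.
# Every figure here is a hypothetical assumption for illustration only.

endpoints = 50_000          # endpoints scanned in a large enterprise
findings_per_endpoint = 20  # average reported findings per endpoint
false_positive_rate = 0.01  # "only" one percent of findings are false
hours_to_disprove = 0.5     # analyst time to investigate and dismiss one

false_positives = endpoints * findings_per_endpoint * false_positive_rate
analyst_hours = false_positives * hours_to_disprove

print(f"False positives to triage: {false_positives:,.0f}")  # 10,000
print(f"Analyst hours consumed:    {analyst_hours:,.0f}")    # 5,000
```

Under these assumptions, a single enterprise-wide scan generates 10,000 spurious findings, roughly 5,000 analyst hours of investigation, before a single real vulnerability is remediated.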
The defensive side of the coin
On the other side of the coin — the defensive side — are the tools that are primarily designed by either system manufacturers or by companies that have created products to sell to the operations and maintenance (O&M) departments. Everyone in the O&M world knows that their systems will have vulnerabilities discovered within them that will need to be fixed — such is the nature of the beast. Enterprise client management tools are widely used in the federal government to ease the process of applying patches and fixes to systems. These tools can only apply patches and fixes to systems once they have been written, tested and released by manufacturers, therefore they are seen as defensive or reactive in nature.
The need for a continuous process
Clearly, there are potential gaps between finding “possible” holes using an offensive mindset and patching known holes with new software fixes from manufacturers. The biggest issue at hand is that manufacturers are always going to be a step behind, so defense is going to be in a permanent state of “catch up” with the offense.
The other big problem is that different security tools produce different results. A security team member may scan 100 systems with three different security tools and get three different results. The same is sometimes true of the tools created for system administrators, although usually to a lesser extent, because they are looking for an installed piece of software that generally is or is not present.
This disparity in collected data leads not only to finger pointing between security and P&RM teams, but worst of all, it leads to management never having a clear picture of the security posture of their networks and systems. Because the data doesn’t always correlate, they are left questioning the state of their safety, security and compliance.
Bring both sides of the security coin together
Closer integration of the security and P&RM functions has the potential to positively affect both short- and long-term trends in our security posture. The necessity of this integration is more important than the methods by which it is achieved. One possible solution might be the creation of a compliance group that exists between the security and P&RM teams. This group could, in theory, work together to analyze and normalize the data between their various reports and systems, addressing issues on a case-by-case basis if necessary.
In turn, the integrated team would have the capability to produce meaningful data for management about the true state of the network. In addition, the integrated team could develop a standardized process to normalize data produced by the various tools used to monitor and manage compliance, leverage data analytics to ensure accurate reporting and an intelligent response, and implement a standardized continuous improvement model that precipitates a more effective P&RM process.
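One way to picture the normalization step is a simple merge that deduplicates findings from multiple scanners on a common key, such as host and CVE identifier, and resolves disagreements with an explicit rule. This is a minimal sketch under assumed inputs; the tool reports, records and severity policy below are hypothetical, not any specific vendor’s format:

```python
# Minimal sketch: normalize vulnerability findings from two scanners
# into one deduplicated view keyed on (host, CVE). The reports below
# are hypothetical; real tools emit far richer, vendor-specific data.

tool_a = [{"host": "srv01", "cve": "CVE-2015-1635", "severity": "critical"},
          {"host": "srv02", "cve": "CVE-2014-0160", "severity": "high"}]
tool_b = [{"host": "srv01", "cve": "CVE-2015-1635", "severity": "high"},
          {"host": "srv03", "cve": "CVE-2014-6271", "severity": "critical"}]

SEVERITY_RANK = {"low": 1, "medium": 2, "high": 3, "critical": 4}

def normalize(*reports):
    """Merge reports, keeping the worst severity when tools disagree."""
    merged = {}
    for report in reports:
        for finding in report:
            key = (finding["host"], finding["cve"])
            current = merged.get(key)
            if current is None or (SEVERITY_RANK[finding["severity"]]
                                   > SEVERITY_RANK[current["severity"]]):
                merged[key] = finding
    return merged

merged = normalize(tool_a, tool_b)
print(f"{len(merged)} unique findings across both tools")  # 3
```

Here the two tools disagree on srv01’s severity, and the policy (keep the worst rating) resolves the conflict deterministically; in practice the integrated team would agree on such rules case by case, which is exactly the kind of standardization described above.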
By encouraging both sides of the security coin to treat compliance as a continuous process, a marathon rather than a sprint, and to work from an integrated approach, organizations can make results far more impactful to the information security lifecycle.
Shane Van Wyngaardt is an executive vice president and co-founder of Indigo IT. He has over 20 years of IT management and consulting experience, with specialized expertise in enterprise collaboration and data analytics. He has a degree in Electronic Data Analysis and Processing from Boston Computer College in Cape Town, South Africa and holds certifications that include: CCNA, MCSE, Symantec Enterprise Vault Technical Specialist (STS), Symantec Clearwell eDiscovery Platform Sales Expert Plus (SSE+) and ITIL v3 Foundation.
Norm Frazier is a solutions architect at Indigo IT with more than 19 years of experience. He specializes in open source technologies, messaging and DNS infrastructure and information security. He graduated from the University of Virginia with a degree in Cognitive Science and holds several certifications that include: CCNA, MCSE, CISSP, GIAC Security Essentials Certification (GSEC), GIAC Certified Enterprise Defender (GCED), ITILv3 Foundation.