As the world continues to generate a flood of information, governments, businesses and individuals understand that nefarious players are everywhere, waiting in the wings to wreak havoc on their organizations' data assets.
Whether the motive is profit on the black market, undermining national or international security, or stealing intellectual secrets, the threat of a cyberattack is real and inevitable, making cybersecurity one of the most important issues of our day.
Yet, despite massive spending on cybersecurity solutions, the frequency and complexity of breaches continue to increase, and the cost in the aftermath of a breach has skyrocketed. With increasingly sophisticated blended threats, combined with massive data volumes that create scalability and speed problems, it's understandable why detecting a breach has been so elusive, and why detection times are so long. According to security firm Mandiant's 2014 threat detection report, the median number of days a threat was present on a victim's network before detection was 229.
Simply put, organizations don't know what they don't know, and they can't see what they can't see. As a result, they cannot detect, much less respond to, an anomaly or potential threat they don't even realize is headed toward them or already inside their systems.
Immediate threat detection requires the ability to take in all the information emanating from an organization's many data feeds throughout its entire enterprise infrastructure, all at the same time, all the time, and into one simplified, unified view. That data must then be acted upon the moment it becomes available, while it is still moving through the system (what we call "data in motion"), in order to surgically pinpoint an anomaly or behavior that could pose a threat.
Technology that is architecturally designed to process "data in motion" can ingest everything: all the data, along with its context, from all of the disparate data feeds found in a typical organization's IT infrastructure, including routers, firewalls, IPS, antivirus, SIEM, syslogs, NetFlow, switches and more. Then, leveraging advances in artificial intelligence and machine learning, analytics are performed on the fly, creating correlations that can identify a potential threat to the network. Every component of such a system is built to act on the data before it ever comes to rest.
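The in-motion idea described above can be illustrated with a minimal sketch: merge events from several feeds into one stream and run a correlation over a short sliding window as each event arrives, before anything is written to a database. The feed names, event fields and window size here are illustrative assumptions, not any vendor's actual design.

```python
import json
from collections import deque

def merge_feeds(*feeds):
    """Interleave events from multiple feed iterators in arrival order."""
    iterators = [iter(f) for f in feeds]
    while iterators:
        for it in list(iterators):  # copy so we can remove exhausted feeds
            try:
                yield next(it)
            except StopIteration:
                iterators.remove(it)

def correlate(events, window_size=4):
    """Flag a source IP the moment it shows up in more than one feed
    within the sliding window -- acting on data while it is in motion."""
    window = deque(maxlen=window_size)
    for event in events:
        window.append(event)
        feeds_seen = {e["feed"] for e in window if e["src"] == event["src"]}
        if len(feeds_seen) > 1:
            yield {"src": event["src"], "feeds": sorted(feeds_seen)}

# Toy events standing in for firewall and NetFlow feeds.
firewall = [{"feed": "firewall", "src": "10.0.0.9", "action": "deny"}]
netflow  = [{"feed": "netflow",  "src": "10.0.0.9", "bytes": 48_000_000}]

for alert in correlate(merge_feeds(firewall, netflow)):
    print(json.dumps(alert))
```

Note that the alert fires during ingestion itself; there is no store-then-query step between the event arriving and the correlation being made.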
This is very different from the majority of systems in use today, which rely on an "analytics-after-storage" model: data is gathered and stored in a database, and analytics is performed in batches only after the data has persisted, or come to rest.
That model claims to be real time, but only the querying of the data repository is real time; by the time a user receives results, the analytics has become forensic in nature. Data that has already come to rest carries inherent latency and does not give organizations the rapid-detection edge they need to stop an attack at its onset.
It takes extraordinary computing power to handle the unprecedented volumes and types of data flowing into the typical enterprise infrastructure, where organizations are geographically distributed, with people and offices around the world operating devices that let them enter the network from anywhere, at any time.
It also requires new advances in correlation and algorithmic technology so that correlated results are available as the data arrives. This provides the ability to identify patterns and detect anomalies without any rules or signatures. To date, most existing systems have depended heavily on identifying known threats, those matching a signature or rule, even though many of today's threats are never-before-seen attacks: the "unknown unknowns."
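One common way to detect anomalies without any rule or signature is to learn a statistical baseline from observed behavior and flag values that deviate sharply from it. The sketch below is a deliberately simple illustration of that idea; the metric (login attempts per minute) and the three-standard-deviation threshold are assumptions for the example, not the algorithm of any particular product.

```python
import statistics

def detect_anomalies(baseline, observations, threshold=3.0):
    """Return observations more than `threshold` standard deviations
    from the baseline mean -- no rule or signature required."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return [x for x in observations if abs(x - mean) > threshold * stdev]

# Baseline: normal login attempts per minute over a quiet hour.
baseline = [12, 15, 11, 14, 13, 12, 16, 14, 13, 15]
# New observations: one burst no signature would describe in advance.
observations = [14, 13, 250, 15]

print(detect_anomalies(baseline, observations))  # prints [250]
```

Because the detector compares behavior to a learned baseline rather than to a catalog of known attacks, it can surface the "unknown unknowns" that signature-based systems miss, at the cost of needing a representative baseline.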
Technology that can ingest, analyze, index and correlate massive amounts of data from disparate data feeds, while that data is still in motion, will enable organizations to rapidly detect anomalies, identify whether or not they are a threat, and then automate client-determined remedial actions.
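The final step in that sequence, automating client-determined remedial actions, can be sketched as a policy lookup: each classified anomaly is mapped to the response the client has configured, with unclassified anomalies falling back to human review. The anomaly classes and action names below are hypothetical.

```python
# Client-determined remediation policy: anomaly class -> automated action.
POLICY = {
    "exfiltration": "block_source_ip",
    "brute_force": "lock_account",
}

def respond(anomaly, policy=POLICY):
    """Return the client-configured action for a classified anomaly;
    anything unrecognized is routed to an analyst instead."""
    action = policy.get(anomaly["class"], "queue_for_analyst")
    return {"src": anomaly["src"], "action": action}

anomalies = [
    {"src": "10.0.0.9", "class": "exfiltration"},
    {"src": "10.0.0.7", "class": "novel_behavior"},
]
for a in anomalies:
    print(respond(a))
```

Keeping the policy as data rather than code is what makes the response "client-determined": each organization can decide how aggressive the automation should be without changing the detection pipeline.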
Data is a fluid, moving entity, constantly flowing into, out of, and within an organization. To protect and defend data assets, companies need the ability to see what is happening to data while it is in transit.
Technology advances being implemented today will begin to dramatically shift the breach-detection paradigm by giving companies the ability to detect a potential breach faster than ever before possible. And while we may never be able to thwart every potential threat, we most certainly have the technological capability to drastically reduce detection times from months to just a few minutes. We just have to embrace these advances. When we do, the balance of power will finally shift in our favor.
Dr. Dan Nieten is chief technology officer for Red Lambda, a cybersecurity technology company. The company’s flagship solution, MetaGrid, is an advanced, software-based cybersecurity system designed to protect commercial and government enterprises by identifying anomalies and threats “in motion” at detection speeds never before possible, without rules or signatures.