Analytic Evolution

Event management has evolved over the years from ICMP probes, SNMP traps, and syslog data into a sophisticated marketplace where a myriad of products await prospective technology shoppers. The underlying technology behind these products can be described, in essence, as analytics, a vital part of any security environment. Read on to understand the evolution and where we are today.

The most secure environment in the world is not viable if there is no mechanism for the security environment to alert operators to malicious or suspicious activity!

Technology is available today that reliably provides what I would describe as 2nd Generation Analytics for the health and success of deployed security systems. I say 2nd generation because Network Management and Systems Management software that collected alerts and flagged hard and soft failures for specific vendor devices has been available for years, but technology to determine relevancy and severity has run hot and cold.

My all-time favorite network monitoring program was simple and reliable; it worked so well that Cisco included it in one of its management products: CastleRock SNMP. You could tell instantly whether a device, a link, or a service was UP or DOWN. Now everything from Cisco is Java-based.
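The core of that kind of tool is trivial to sketch. Here is a minimal, hypothetical example of UP/DOWN polling in Python, using a plain TCP connection attempt as a stand-in for the SNMP or ICMP probe a real product would use; the target list and names are illustrative assumptions, not anything from CastleRock itself.

```python
import socket

def check_up(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout.

    A TCP connect stands in for an SNMP/ICMP probe here; the principle
    is the same either way: poll the target, report UP or DOWN.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Poll a list of (name, host, port) targets and report status.
targets = [("web", "127.0.0.1", 80), ("dns", "127.0.0.1", 53)]
for name, host, port in targets:
    status = "UP" if check_up(host, port) else "DOWN"
    print(f"{name}: {status}")
```

That is the whole of 1st-generation interrogation: ask each device directly, one at a time, and report what it says.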

The original method of event processing was limited to specific failure modes and had no means of identifying a security breach beyond a Denial-of-Service attack. This interrogation style of event management gave way to distributed event monitoring and event consolidation because the masses were sold on the idea that, to determine the disposition of an anomalous event, one must incorporate more security devices and more event data.

The approach of 2nd Generation Analytics is to gather as much data as you can to determine the extent and impact of various events, thereby achieving rudimentary correlation. 2nd Generation Analytics provided a means to blend events from an IDS with data from a deployed vulnerability system to determine whether a particular event is relevant, then optionally compare access logs on the webserver to determine the success or failure of the attempt.
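The IDS-plus-vulnerability-plus-access-log blend described above can be sketched in a few lines. This is a toy illustration, not any vendor's actual product logic; the record shapes, field names, and sample data are all assumptions made up for the example.

```python
# Hypothetical event records; field names are illustrative assumptions.
ids_alerts = [
    {"host": "web01", "signature": "CVE-2004-0001", "src": "10.0.0.5"},
]
vuln_scan = {"web01": {"CVE-2004-0001"}}  # host -> known-vulnerable CVEs
access_log = [
    {"src": "10.0.0.5", "path": "/admin", "status": 200},
]

def correlate(alert: dict, vulns: dict, log: list) -> tuple:
    """2nd-generation-style correlation: an IDS alert is relevant only if
    the target host is actually vulnerable to that signature; success or
    failure of the attempt is judged from the webserver access logs."""
    relevant = alert["signature"] in vulns.get(alert["host"], set())
    succeeded = any(e["src"] == alert["src"] and e["status"] == 200
                    for e in log)
    return relevant, succeeded

for alert in ids_alerts:
    relevant, succeeded = correlate(alert, vuln_scan, access_log)
    print(alert["host"],
          "relevant" if relevant else "irrelevant",
          "succeeded" if succeeded else "blocked")
```

Even this toy version makes the point in the next paragraph: three separate data sources had to be deployed and integrated before a single alert could be judged.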

Products are available that can consolidate alerts and alarms for supported platforms and events, as well as push events in semi-standardized formats, but as the above example suggests, an organization has to implement multiple technologies in order to know whether the data on the webserver is safe and secure.

Alongside 2nd Generation Analytics is the notion of management systems for management systems: a Manager of Managers, or MOM if you will, where the muscle of technology is wielded to integrate disparate management systems in the hope of creating a single cockpit-style view of events. The key difference between the two is that MOM requires the underlying management systems to push processed event data into yet another management system, while 2nd Generation Analytics solutions interact directly with the security environment or other infrastructure.

1st Generation Analytics brought us device interrogation, and 2nd Generation Analytics brought us consolidation and limited correlation, so what's next? Minority Report? Shut down users or devices before they break the network? (Sounds like a great commercial idea for Cisco's Self Defending Networks.)

3rd Generation Analytic technology is entering the marketplace, but it still needs to mature. Statistical Anomaly Detection and Behavioral Analysis are the current incarnations, and there will no doubt be others. These technologies seek to apply complex mathematical techniques to events occurring within the security environment and the rest of an organization's technology infrastructure in order to make alerting and correlation decisions. The intent is to answer the question 'What is normal?' by looking at questions such as: How is this resource accessed? When is this resource accessed? Where is the user accessing this resource from? The hope is that those answers add up to an answer to 'Is my data safe and secure?'
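To make the statistical flavor of this concrete, here is a deliberately simple sketch of anomaly detection: flag a value as abnormal when it sits far outside the historical distribution. Real products use far richer models; the threshold, the login-count data, and the z-score approach are all assumptions chosen for illustration.

```python
from statistics import mean, stdev

def is_anomalous(history: list, value: float, threshold: float = 3.0) -> bool:
    """Toy statistical anomaly test: flag `value` if it lies more than
    `threshold` standard deviations from the historical mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return value != mu
    return abs(value - mu) / sigma > threshold

# Daily login counts for a user; a sudden spike should stand out.
history = [12, 10, 11, 13, 12, 11, 12]
print(is_anomalous(history, 12))   # a typical day
print(is_anomalous(history, 90))   # a suspicious spike
```

Notice that "normal" here is defined entirely by the history fed in, which is exactly why, as the next paragraph argues, the same behavior can be normal for company A and abnormal for company B.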

The validation for this technology will come from its ability to interpret multiple data sources accurately and to "learn" what is normal for company A but abnormal for company B. There is a saturation threshold with this approach: when too many event sources are being analyzed, the result is something worse than False Positives, namely False Negatives. The organization is back to thinking it is secure because the analytics are unable to correlate some or all of the event sources and fail to process critical event information from an actual breach.

The evolution of analytics has brought improved monitoring and alerting for the security environment, but analytics still suffer from a fidelity problem, one that becomes more important with each successive leap in analytic technology. If the analytics are not trusted and not acted upon, then the security environment cannot fulfill its purpose, and the analytics become irrelevant.

Stay tuned, more to come…