Thursday, November 13, 2014

Data Losses Could be Drastically Reduced with Network Baselining
Posted by - Paula Wheeldon, Marketing Team

The spate of recent large-scale data breaches, in which hundreds of millions of records were lost, makes it quite clear that cyber criminals are adept at keeping pace with security technology. Beyond the inventive malware they create, they also exploit the inevitable software vulnerabilities that crop up. Companies are investing in beefing up inbound gateway security, but even if you can block 99% of malware, the remaining 1% still gets through, and that could be all it takes.

Surprisingly, a Ponemon research study found that only 20% of companies employ continuous infection monitoring as part of their security strategies. Yet continuous outbound monitoring and network anomaly detection are your best bet for mitigating the risk posed by the evasive malware that is bound to make it past your gateway security. And while many vendors advise adding sandboxing to boost security, cyber criminals use sophisticated tactics to create signatureless malware that can evade sandboxes: for instance, malware delivered in harmless pieces that are reassembled inside your network, or malware that detects when it is running in a sandbox and goes dormant, activating only after it has been allowed to pass.

Another damaging aspect of data breaches, perhaps the worst, is the time that elapses between when a breach occurs and when it is discovered. In fact, victim companies are often notified of a breach by a third party. Research varies on the average number of days before discovery, anywhere from 43 to 229, but regardless of the time span, days or weeks with no remediation means a lot of private data leaving the network. That's why reducing malware dwell time is just as important to limiting data loss as stopping breaches in the first place.

Network anomaly detection through baselining begins by establishing a baseline for network traffic using historical data logs. Depending on your industry, multiple baselines may be needed to account for predictable variations in traffic, such as on weekends. Once you have baselines, measurement parameters, and alert triggers established, you must continuously monitor outbound traffic across the full Web stream. When an anomaly is detected, you are alerted and the traffic is stopped, giving you time to investigate and remediate any problems. The concept is simple (see the sketch below), and with the right solution in place you will dramatically reduce data loss without generating a flood of false positives. Imagine if Target had lost only 1,000 records instead of 70 million. The story would have centered on how they avoided a devastating data loss instead of how they incurred one, and they might even have avoided the global publicity.
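To make the idea concrete, here is a minimal sketch in Python of threshold-based baselining on outbound traffic volume. The traffic figures, the three-sigma alert trigger, and the is_anomalous helper are illustrative assumptions for this post, not a description of any particular product, which would baseline far richer signals than byte counts alone.

import statistics

# Hypothetical hourly outbound-traffic totals (bytes) drawn from
# historical logs; in practice these would come from your flow or
# proxy logs, with separate baselines for weekends if traffic differs.
historical_hourly_bytes = [
    52_000_000, 48_500_000, 51_200_000, 49_800_000,
    50_400_000, 53_100_000, 47_900_000, 50_600_000,
]

# Establish the baseline: mean and standard deviation of normal traffic.
baseline_mean = statistics.mean(historical_hourly_bytes)
baseline_stdev = statistics.stdev(historical_hourly_bytes)

# Alert trigger (assumed): flag any hour whose outbound volume deviates
# from the baseline by more than three standard deviations.
SIGMA_THRESHOLD = 3

def is_anomalous(hourly_bytes: int) -> bool:
    """Return True if this hour's outbound volume deviates from baseline."""
    return abs(hourly_bytes - baseline_mean) > SIGMA_THRESHOLD * baseline_stdev

# Continuous monitoring loop (sketch): check each completed hour and
# alert on anomalies so traffic can be stopped and investigated.
for observed in (50_900_000, 49_700_000, 142_000_000):
    if is_anomalous(observed):
        print(f"ALERT: {observed:,} bytes outbound deviates from baseline "
              f"({baseline_mean:,.0f} +/- {SIGMA_THRESHOLD} sigma)")

In this toy run, only the 142,000,000-byte hour trips the alert, which is the behavior you want: normal fluctuation passes silently, while a sudden exfiltration-scale spike is caught within the hour rather than weeks later.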

Here are examples of some huge data breaches involving hacker attacks, where baselining would have reduced dwell time and data loss. Considering that Ponemon Research puts the average cost of a data breach at $188 per record, the financial impact for some of these companies could run into the billions of dollars, as the rough calculation after the table illustrates:

Company            Time from breach to discovery    Records lost
Home Depot         20 to 24 weeks                   56M
JP Morgan Chase    4 to 6 weeks                     76M
eBay               4 to 8 weeks                     145M
Adobe              2 to 10 weeks                    152M
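As a back-of-the-envelope illustration, this short Python snippet multiplies the Ponemon $188-per-record average against the record counts in the table. Treat the output as a rough exposure estimate, not a reported loss figure.

# Rough breach-cost estimate: records lost x $188 per record (Ponemon).
COST_PER_RECORD = 188  # USD, Ponemon Research average

records_lost = {
    "Home Depot": 56_000_000,
    "JP Morgan Chase": 76_000_000,
    "eBay": 145_000_000,
    "Adobe": 152_000_000,
}

for company, records in records_lost.items():
    cost = records * COST_PER_RECORD
    print(f"{company}: ${cost / 1e9:.1f}B estimated exposure")

Even the smallest breach in the table works out to roughly $10.5 billion at that rate, which is why every week shaved off dwell time matters.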


Preventing breaches from occurring is the ideal, but even diminishing dwell time with a technique such as network anomaly detection would drastically reduce the financial and brand damage that affects large and small organizations alike.

Learn more: Watch a new on-demand webcast featuring iboss CEO Paul Martini and guest, Forrester Research analyst Rick Holland: Security for 2015 and Beyond: The Role of Network Baselining in Advanced APT Defense