Tuesday, January 6, 2015

Are We Asking the Right Questions
in the Wake of the Sony Pictures Breach?

Posted by Paul Martini, CEO & Co-Founder


Much has been written about the Sony Pictures data breach, and no doubt more will be revealed as time goes on. It is the latest in a string of high-profile attacks, one that adds Sony, for the second time in recent years, to a litany of marquee breaches that includes eBay, Target, Home Depot, JPMorgan Chase and others. Incidents like these seem to surface daily, and although the origin and details of most breaches are eventually determined, both the origin of the Sony breach and its perpetrators remain up for debate. The US government claims the incident was a nation-state attack launched by North Korea, while many cyber security experts seem less sure that's the case. One security intelligence firm investigating the incident argues for the involvement of an embittered ex-Sony employee with in-depth knowledge of Sony's network infrastructure. In any case, theories abound, and they seem to grow more diverse as time passes. Most of them focus on the nature of the attack itself. What malware was used? How did it operate? When did it first enter the network? And, more importantly, how can we keep it from getting into our own organizations?

But are these the only questions we should be asking?

One question surrounding the Sony incident, as well as the other breaches mentioned, seems to be receiving less focus: how did the loss of terabytes of data go unnoticed? What was happening as huge amounts of data left the network? It doesn't seem to be a case of too many alerts, as in the Target breach, where IT became inured to the noise. In fact, according to many reports, Sony's security processes were somewhat lacking, and simple policies that could have strengthened password protection or restricted network access were never deployed. Yet regardless of Sony's possibly lax policies, or whether or not its inbound defenses were robust, the fact remains that the data had to leave through the network's outbound gateway, and moving a terabyte of it did not happen instantaneously.
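To make that point concrete, here is a minimal sketch of the kind of check the question implies: comparing each interval's outbound byte count at the gateway against a rolling baseline and flagging sustained spikes. The traffic figures, window size and threshold below are illustrative assumptions, not details from the Sony investigation.

```python
from statistics import mean, stdev

def find_outbound_spikes(byte_counts, window=24, sigma=3.0):
    """Flag intervals whose outbound volume exceeds a rolling baseline.

    byte_counts: outbound bytes per interval (e.g., hourly gateway totals).
    window: number of prior intervals used as the baseline.
    sigma: standard deviations above the mean that count as a spike.
    """
    spikes = []
    for i in range(window, len(byte_counts)):
        baseline = byte_counts[i - window:i]
        mu, sd = mean(baseline), stdev(baseline)
        if byte_counts[i] > mu + sigma * sd:
            spikes.append((i, byte_counts[i]))
    return spikes

# Illustrative data: a steady ~5 GB/hour pattern, then a sustained burst
# standing in for a bulk exfiltration.
hourly_bytes = [5_000_000_000 + (i % 7) * 100_000_000 for i in range(48)]
hourly_bytes += [60_000_000_000] * 4  # hypothetical bulk transfer

for hour, volume in find_outbound_spikes(hourly_bytes):
    print(f"hour {hour}: {volume / 1e9:.1f} GB outbound, investigate")
```

Even a crude baseline like this would surface a terabyte-scale transfer long before it finished; real products refine the statistics, but the underlying question, "is more data leaving than usual?", is exactly this simple.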

The growing number of data breach incidents highlights the reality of today's threat landscape: no matter how sophisticated inbound security technology becomes, it's unrealistic to believe you can stop all threats. Network administrators have traditionally been advised to focus on inbound security, and it certainly remains a critical piece of the puzzle. Many security vendors are bolstering their inbound defenses by adding sandboxing to detect the evasive malware that standard signature-based defenses may miss. These measures are important, but they still address only half of the problem. And while sandboxing will strengthen security, many of today's threats are designed to evade sandboxes, and it takes only one piece of malware getting through to compromise your network.

The key is to focus on data movement

The other half of the security equation must be answered by addressing data movement across the network, particularly outbound. In the wake of so much data loss, some security analysts now advise organizations to focus on outbound security as diligently as they focus on blocking inbound threats. That means more than detecting callbacks, such as bots trying to communicate with command and control (C&C) servers outside the network. Monitoring data movement effectively requires visibility across all data channels, and the ability to correlate data movements against high-risk factors such as the machines involved and the geo-location of the destination. Solutions that deliver comprehensive visibility across the full Web stream, plus the tools to detect anomalies as they occur and stop data transfers in real time, can spell the difference between minor data exposure and catastrophic loss. Imagine if Sony had been alerted to a spike in outbound traffic, or had been able to see data moving to or from a high-risk location. Wouldn't it have been better off losing a few kilobytes of data rather than a terabyte?

Of course, this raises the specter of being flooded with alerts. In the Target breach there were so many alerts that they were ultimately ignored, likely because they were considered false positives. This is a significant issue, and vendors are scrambling to reduce the noise, but reducing alerts isn't enough on its own. Today's security also needs the ability to translate events and data logs into actionable intelligence that gives you the complete context of a threat: when and where it entered the network, which machines are infected, and how to remediate the problem quickly.
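As an illustration of what actionable intelligence might look like at its simplest, the sketch below aggregates hypothetical outbound flow records per machine, correlates them against a hand-maintained list of high-risk destinations and an assumed per-host volume ceiling, and emits one contextual alert per offending host instead of a raw event stream. The record format, country codes and thresholds are all invented for the example and are not any vendor's actual schema.

```python
from dataclasses import dataclass

@dataclass
class Flow:
    src_host: str      # internal machine that sent the data
    dst_country: str   # geo-location of the destination IP
    bytes_out: int

# Hypothetical policy inputs; a real product would derive these from
# threat intelligence feeds and learned per-host baselines.
HIGH_RISK_COUNTRIES = {"XX", "YY"}      # placeholder country codes
PER_HOST_DAILY_LIMIT = 2_000_000_000    # illustrative 2 GB/day ceiling

def correlate(flows):
    """Aggregate outbound flows per host and yield contextual alerts."""
    totals, risky = {}, {}
    for f in flows:
        totals[f.src_host] = totals.get(f.src_host, 0) + f.bytes_out
        if f.dst_country in HIGH_RISK_COUNTRIES:
            risky.setdefault(f.src_host, []).append(f)
    for host, total in totals.items():
        over_limit = total > PER_HOST_DAILY_LIMIT
        if over_limit or host in risky:
            yield {
                "host": host,                        # which machine to remediate
                "bytes_out": total,                  # scale of the exposure
                "over_volume_limit": over_limit,
                "high_risk_flows": risky.get(host, []),
            }

alerts = list(correlate([
    Flow("finance-pc-17", "XX", 1_500_000_000),
    Flow("finance-pc-17", "US", 900_000_000),
    Flow("dev-laptop-03", "US", 50_000_000),
]))
for a in alerts:
    print(a["host"], "->", a["bytes_out"], "bytes,",
          len(a["high_risk_flows"]), "high-risk flows")
```

The design point is that each alert already names the machine, the volume and the risky destinations, so an analyst can act on it directly rather than reconstructing that context from thousands of raw log lines.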

It's easy to criticize the organizations that were attacked for having poor security and to believe we have superior solutions in place, but the evidence speaks for itself: even the largest enterprises, with the deepest pockets, are vulnerable. That's why strong inbound threat defenses have to be matched by tools that continuously monitor data movement, detect anomalies and deliver actionable intelligence. It's the only way to empower your IT staff and reduce the devastating data loss so prevalent in today's threat environment.

Learn about iboss FireSphere’s unique Network Anomaly Detection