Remember the good old days of floppy disks and macro viruses? Back then, we thought things were complex. How could enterprises possibly gain any semblance of control over these new-fangled security threats that were targeting their users?
As years went by, we finally got our arms around this malware thing - until now. Maybe it's just me, but malware is all we seem to be hearing about in the IT headlines, and it is only getting worse. Bots, advanced persistent threats and the like seem to be the hot-button issue in IT security right now.
Spam, denial of service attacks and information leakage (to name a few) can all be sourced with ease from widespread malware infections. For example, Symantec's MessageLabs Intelligence has found that infected computers in some botnets send on average more than 600 spam e-mails per second. This is big business!
Of course, I also realize that the marketing machine is at work here and we cannot believe everything we hear. Trend Micro claims that 3.5 new malware threats are released every second. So what does that work out to: tens, if not hundreds, of thousands of encounters with malware in any given enterprise on any given day? Wow, is the sky falling?
On the other hand, Cisco ScanSafe claims that in 2010, a representative 15,000-seat enterprise would experience about 5.5 encounters with malware on any given day. That's a relatively low number I suppose, but it is still a very big problem.
Remember that security is about control and visibility. Reality has shown us that many enterprises do not really have the necessary control and visibility into their networks to keep the bad guys at bay. This is especially true when it comes to malware. Suddenly (albeit shortsightedly), security issues like Web-based SQL injection and lost laptops are taking a backseat so enterprises can get their arms around this "new wave" of malware out there.
I can attest to the complexities and problems associated with both sides of the equation. On the proactive side, people are not being, well, proactive enough with information security. The assumption is that we have policies, we have technical controls in place and we are not getting hit with malware (as far as we know), therefore all is well. It's not that simple, but still it is the way that many enterprises operate.
On the reactive side of the equation - that is, once network administrators determine that something is awry and an infection is present - enterprises tend not to have a reasonable response plan in place. Even when a seemingly appropriate response is carried out, the cleanup is often incomplete and the malware comes back.
Case in point: I worked on a recent project where a large enterprise originally got hit with some nasty command-and-control malware. A few thousand computers were infected. They responded by cleaning up the affected systems, but they didn't look deeply and broadly enough throughout their network to see where else the malware was lurking. A few months later, the bot reared its ugly head again. This time they were hit much harder, with more than 10,000 systems infected. Ouch.
So what do we have to do if we are going to stand a chance against this (re)emerging malware threat? Big-government politicians like Joe Lieberman believe that more regulation is the answer. In reality, if you look at the details of the proposed Rockefeller-Snowe Cybersecurity Act of 2009 (Senate Bill 773) and the Lieberman-Collins-Carper Protecting Cyberspace as a National Asset Act of 2010 (Senate Bill 3480) and combine them with the Federal government's track record, regulation will likely cause more problems than it fixes. In fact, regulation and government interference in the free market are arguably among the greatest threats to information security today.
Sure, given the right scenarios and people, public-private partnerships could work well. In fact, many are saying that we need more cooperation between the Federal government and the private sector to help fend off cyber-threats. Isn't that called InfraGard?