In the time-space continuum, any hack attack, even a huge one, may be just another speed bump on the road to progress. But of course, it also might be a sign that there are more and even bigger speed bumps ahead, and we need to avoid them.
That’s the best way to consider a recent episode that hasn’t received nearly the attention it deserves. In February, there were reports of a massive DDoS (Distributed Denial of Service) attack—so big, in fact, that it was probably Europe’s largest ever, and perhaps even the largest anywhere. Reports indicate that the perpetrators exploited vulnerabilities in the Network Time Protocol (NTP), which is designed to synchronize the clocks of computers over a network.
We don’t know too much yet, including specific targets. For the record, a DDoS hack is built around directing mountains of data at a particular network, causing the servers in it—and potentially the whole system—to crash. In this case, the concerted assault involved nearly 400 gigabits per second (Gbps). The Spamhaus attack, previously thought to be the largest DDoS hack, clocked in at 300 Gbps.
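Some back-of-the-envelope arithmetic shows why NTP reflection makes floods of this size feasible. The byte counts below are commonly cited illustrative figures for a "monlist" query and its reply, not measurements from the February attack itself:

```python
# Rough sketch of NTP "monlist" amplification: a tiny spoofed request
# elicits a far larger response directed at the victim.
REQUEST_BYTES = 234      # size of a small spoofed monlist query (assumed)
RESPONSE_BYTES = 48_200  # roughly 100 reply packets of ~482 bytes (assumed)

amplification = RESPONSE_BYTES / REQUEST_BYTES  # response size per request byte
attacker_gbps = 400 / amplification             # traffic needed to produce 400 Gbps

print(f"amplification factor: {amplification:.0f}x")
print(f"attacker bandwidth needed for a 400 Gbps flood: {attacker_gbps:.1f} Gbps")
```

With a multiplier in the hundreds, an attacker commanding only a couple of gigabits per second of spoofed traffic can, in principle, generate a flood on the scale reported.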
Time has passed since February’s attack, as the clocks on our computers attest. But looking ahead, we know that unpatched NTP servers can still enable exactly the kind of assault the Internet is reeling from.
And in another example of the effects of the passage of time, there was the televised appearance of Edward Snowden at the South by Southwest festival in Austin. Just a few months ago, he was being both lionized and vilified as the poster boy for unauthorized access after he clandestinely downloaded reams of classified information from servers in the National Security Agency (NSA). Now, he seems to be turning the tables on his critics, accusing the government of “setting fire to the future of the Internet.”
Snowden said he was addressing tech geeks rather than policy types because they’re the people—“the makers, thinkers, and the dev community”—who can ensure data privacy and information security. In particular, he focused on encryption, which he referred to as not “an arcane dark art, but a protection against the dark arts.” He also warned the audience that theft of encryption keys could be an even bigger threat.
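Snowden’s point about key theft is worth making concrete. The toy cipher below (a one-time-pad-style XOR, purely for illustration) shows that even mathematically unbreakable encryption offers no protection once the key itself is stolen:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # One-time-pad-style XOR: the same operation encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"protect the keys, not just the data"
key = secrets.token_bytes(len(message))   # random key as long as the message

ciphertext = xor_cipher(message, key)     # unreadable without the key

# Anyone who steals the key recovers the plaintext instantly:
stolen = xor_cipher(ciphertext, key)
print(stolen == message)
```

The lesson generalizes to real systems: the strength of the algorithm is irrelevant if key storage and key handling are weak.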
But leaving aside the irony of Snowden as a data security advocate, what can organizations that really are security-conscious, rather than those that let time fly, do about it? Security requires pairing the right strategy with the right technology, then ensuring rigorous execution. Here are the five basic ingredients in that brew.
First, there are administrative control measures, which ensure a necessary level of security and operational readiness, including visibility over the entire infrastructure, as well as easily auditable compliance for software-defined and cloud environments. This should be an obvious first step toward eliminating errors, data breaches and other costly issues.
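A minimal sketch of what “auditable compliance” can look like in practice: every administrative operation passes through a wrapper that records who did what and when. The function and user names here are hypothetical, not drawn from any particular product:

```python
import datetime

audit_log = []  # in a real system this would be tamper-evident, append-only storage

def audited(action):
    """Record each administrative operation so compliance can be reviewed later."""
    def wrapper(user, *args, **kwargs):
        audit_log.append({
            "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "who": user,
            "what": action.__name__,
        })
        return action(user, *args, **kwargs)
    return wrapper

@audited
def reconfigure_network(user, setting):
    # Hypothetical high-impact operation.
    return f"{setting} applied by {user}"

print(reconfigure_network("admin-alice", "vlan-42"))
```

The operation still runs as before; the difference is that an auditor can later reconstruct exactly which privileged actions were taken.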
Then there’s two-factor authentication, the mechanism for verifying a user’s identity with two independent pieces of evidence. This was previously a costly proposition for the hypervisor, but there are now effective and more economical options available.
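The second factor is often a time-based one-time password (TOTP), standardized in RFC 6238 and implementable with nothing but a standard library. The sketch below generates the six-digit codes familiar from authenticator apps; the shared secret shown is the RFC’s published test secret, not a real credential:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time: float, step: int = 30, digits: int = 6) -> str:
    # HOTP/TOTP per RFC 4226/6238: HMAC-SHA1 over the time-step counter,
    # dynamic truncation, then the low-order decimal digits.
    counter = int(for_time // step)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

SECRET = b"12345678901234567890"  # RFC 6238 test secret; provisioned out of band in practice
print(totp(SECRET, time.time()))  # the code the user must supply alongside a password
```

Because the code changes every 30 seconds and is derived from a secret the server also holds, a stolen password alone is no longer enough to log in.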
Third is secondary approval, which enforces the “two-person rule” for high-impact administrative operations. As much as blocking rogue operators, it also helps prevent fraudulent access and costly accidental disruptions.
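The two-person rule can be sketched in a few lines: a high-impact operation is queued when requested and only executes after a second, distinct administrator signs off. The class and user names are illustrative, not taken from any specific product:

```python
class SecondaryApproval:
    """Toy two-person rule: the requester can never approve their own operation."""

    def __init__(self, requested_by, operation):
        self.requested_by = requested_by
        self.operation = operation     # deferred high-impact action
        self.approved_by = None

    def approve(self, approver):
        if approver == self.requested_by:
            raise PermissionError("requester cannot approve their own operation")
        self.approved_by = approver

    def execute(self):
        if self.approved_by is None:
            raise PermissionError("a second approver is required before execution")
        return self.operation()

request = SecondaryApproval("admin-alice", lambda: "datastore deleted")
request.approve("admin-bob")   # a different administrator must sign off
print(request.execute())
```

Note that both failure modes the text mentions are covered: a rogue (or fat-fingered) operator acting alone is blocked, whether the solo attempt is malicious or accidental.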
Next comes role-based monitoring, which provides a fast, automated way of identifying security issues by surfacing anomalies in real time. With this feature, an immediate alert is issued when actions or behavior patterns conflict with a user’s role.
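At its core, this is a comparison of observed actions against a per-role allowlist. The roles and action names below are hypothetical; a real deployment would also weigh behavioral patterns, not just single actions:

```python
# Toy role-based monitor: flag any action outside what a user's role permits.
ROLE_ACTIONS = {
    "auditor": {"read_logs"},
    "operator": {"read_logs", "restart_service"},
}

def monitor(user_role, action):
    """Return an alert string when the action conflicts with the role, else None."""
    if action not in ROLE_ACTIONS.get(user_role, set()):
        return f"ALERT: '{action}' conflicts with role '{user_role}'"
    return None  # action is consistent with the role

print(monitor("operator", "restart_service"))  # permitted, no alert
print(monitor("auditor", "delete_volume"))     # out-of-role, alert issued
```

The alert fires the moment the out-of-role action is observed, which is what makes this kind of monitoring useful against both compromised credentials and honest mistakes.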
And finally there’s the real question of data security. Organizations need policies, strategies and technologies to guard against broad access by privileged users or insiders (or those who acquire insider credentials with malicious intent) and against breaches or data center disasters caused by error or misconfiguration, all while maintaining the privacy of the data itself.