I purchased my first network security book in 1996. Since then, I have attended dozens of classes, seminars and conferences on the subject. I have written dozens of articles and co-authored a book on it — and I am not alone. Conduct a search at Amazon.com for “network security” and more than 3,500 books are associated with the topic. In addition, numerous colleges and universities offer degree programs in network security, and an untold number of security certifications have been created. One would think that with this increased focus on network security, digital data would be well protected and difficult, if not impossible, to steal or compromise.
Unfortunately, this is simply not the case. Technology does not stand still. New operating systems are created, along with new applications and newer versions of old ones. New computing devices appear as well, and each of these developments adds new vulnerabilities to digital data. This means the network security environment is constantly changing, and the “bad guys” are becoming more sophisticated. The lone teenage hacker is being replaced by highly skilled, well-funded criminal organizations that steal information to generate revenue, not simply for bragging rights among their peers.
When an organization is evaluating the risks to its systems, there are four approaches it can take:
• Ignore the risk. When you hear someone say, “I don’t believe that will ever happen to us,” or “No one would be interested in our systems because we are a small company,” they are ignoring the risks.
• Accept the risk. When a company understands the risk but does not apply any resources to protect from the risk, it is accepting the risk.
• Transfer the risk. For some organizations, it is more cost-effective to purchase insurance to protect from the repercussions of network security breaches as opposed to adding additional resources to security infrastructure.
• Mitigate the risk. When an organization dedicates resources to minimize the risks to the network, such as personnel, capital and equipment, it is mitigating the risk.
Of the four approaches, only “ignoring the risk” does not require some effort to determine the risks posed to systems. Before any determination can be made about how to approach risks, an organization must identify the systems that contain “mission critical” information and determine the impact of having that information compromised. This sounds like a simple process, but the biggest mistake made at this point is undervaluing the data on the systems.
It is fairly obvious (or should be) that losing healthcare or financial information would have a significant impact on a business, including lost reputation, loss of trust and numerous lawsuits. However, loss or corruption of business plans, marketing plans and client information can have just as significant an impact.
Preventing Unauthorized Access
Once a determination has been made as to the location and value of an organization’s digital assets, the next step is to determine how to prevent unauthorized access or modification of this information.
One common solution is the “product-based” solution. This involves the purchase and implementation of security products such as firewalls, intrusion detection systems, authentication mechanisms and encryption tools. While all of these tools (among others) are necessary for creating a secure network infrastructure, there are no products that are “set it and forget it” type tools. It is important to recognize that network security products are only as good as the individuals who configure, monitor and maintain them.
A much more effective solution is a “process-based” solution. The Federal Financial Institutions Examination Council provides an excellent definition of the security process in its Information Security booklet: “The security process is the method an organization uses to implement and achieve its security objectives. The process is designed to identify, measure, manage and control the risks to system and data availability, integrity, and confidentiality, and to ensure accountability for system actions.”
(http://www.ffiec.gov/ffiecinfobase/booklets/information_security/information_security.pdf) While this document is geared toward financial institutions, the material it contains provides an excellent approach to creating a secure infrastructure.
When organizations begin addressing network risk management issues, the tendency is to focus immediately on threats and vulnerabilities, directing resources at known concerns. While this approach is logical, some time should first be spent on preliminary issues: where the IT security role fits into the organization, benchmarking systems before they are compromised, allowing the IT staff enough time to address security issues, and providing frequent training opportunities to keep up with current trends.
Getting on the Same Page with IT
The first step is to include IT and IT security in the management process. In many organizations, the IT department is a separate entity that may report to only one member of the management team. IT issues should instead be integrated into the decision-making part of the business.
Monthly management meetings should include updates on the current status of the systems, any security issues that developed during the previous month, current trends and new threats that have appeared. These meetings can provide the opportunity for discussion and allow for prompt decision making regarding problems. In many organizations, the IT departments are completely isolated from the management team — usually on separate floors (sometimes separate buildings) — and the only communication between IT departments and members of management is an e-mail announcing a particular threat or problem.
IT and security professionals must learn to discuss risks from a financial perspective if at all possible. Business professionals often make decisions based on the financial impact to their organization. This can be a challenge, as technology experts have limited experience with business issues.
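One common way to put a risk in financial terms — not spelled out in this article, but standard in risk management practice — is annualized loss expectancy (ALE): the estimated cost of a single incident multiplied by how often it is expected to occur per year. The dollar figures in this sketch are purely hypothetical:

```python
# Illustrative sketch: expressing a security risk in financial terms using
# annualized loss expectancy (ALE), a standard risk-management metric.
# All dollar figures below are hypothetical.

def annualized_loss_expectancy(single_loss_expectancy, annual_rate_of_occurrence):
    """ALE = SLE x ARO: expected yearly loss from a given threat."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Hypothetical scenario: a breach costing $200,000, expected once every 4 years.
ale_unmitigated = annualized_loss_expectancy(200_000, 0.25)   # $50,000/year

# A control costing $15,000/year that cuts the expected occurrence
# to once every 20 years.
ale_mitigated = annualized_loss_expectancy(200_000, 0.05)     # $10,000/year
annual_savings = ale_unmitigated - (ale_mitigated + 15_000)   # $25,000/year

print(f"Unmitigated ALE: ${ale_unmitigated:,.0f}")
print(f"Net annual benefit of the control: ${annual_savings:,.0f}")
```

A calculation like this lets a technologist present a proposed control to management as a net financial benefit rather than a technical necessity.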
Another overlooked item as part of the network risk management process is benchmarking. Prior to placing a system onto the network, the standard “posture” of that system should be identified and documented.
Items that should be documented include: installed applications, open ports, running processes and services, applications scheduled to run at startup, etc. The system should be tested to ensure that the most recent patches and service packs have been installed. Tools such as the Microsoft Baseline Security Analyzer (www.microsoft.com/technet/security/tools/mbsahome.mspx) and the scoring tools provided by the Center for Internet Security (www.cisecurity.org) could prove helpful with this process.
The benefit of establishing a benchmark for systems is that it becomes much easier to identify when a system has been compromised. If a team knows how a system is configured, anomalies can be easily pinpointed when responding to an incident. Organizations should make every effort to avoid rushing systems into production so that the benchmarking process can be completed. Other useful tools include WinAudit (www.pxserver.com/WinAudit.htm) and PsTools (technet.microsoft.com/en-us/sysinternals/bb896649.aspx).
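As a rough illustration of how a benchmark makes anomalies easy to spot, the sketch below diffs a documented baseline against a system’s current state. The port numbers are hypothetical; in practice the snapshots would come from tools like those mentioned above:

```python
# Illustrative sketch: flagging deviations from a documented system benchmark.
# The port sets below are hypothetical stand-ins for real audit output
# (open ports, running services, installed applications, etc.).

def diff_baseline(baseline, current):
    """Return items that appeared or disappeared relative to the benchmark."""
    return {
        "new": sorted(current - baseline),        # e.g., an unexpected open port
        "missing": sorted(baseline - current),    # e.g., a service that stopped
    }

baseline_ports = {22, 80, 443}                 # documented before deployment
current_ports = {22, 80, 443, 4444}            # port 4444 was not in the benchmark

anomalies = diff_baseline(baseline_ports, current_ports)
print(anomalies)   # {'new': [4444], 'missing': []}
```

The same diff applies to any benchmarked attribute: an entry in “new” is a starting point for an incident responder rather than a needle in a haystack.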
Another important part of the benchmarking process is generating a network map. This can be done manually or can be accomplished by using an automated tool. Knowing what devices are supposed to be on the network can prove helpful when responding to problems. During a network vulnerability assessment of a client’s network we identified a Solaris system that the IT department had no idea was connected to the network. During the ensuing investigation it was determined that the system was purchased by an engineer at a garage sale.
Tools that can be used for network mapping include the open-source tool CartoReso (cartoreso.campus.ecp.fr/) and the commercial tool LanSurveyor by SolarWinds (www.neon.com/LSwin.shtml).
The IT staff should avoid the “hack and patch” approach of installing patches only when new vulnerabilities are announced. That approach leaves the staff unable to monitor systems or prevent intrusions that exploit new or undocumented vulnerabilities. In addition, many attacks are designed to be as unobtrusive as possible and require close attention to the network and systems to identify. Sophisticated attackers will not “pound at the door” trying to break in; they will take their time and find the appropriate attack vector so their efforts go undetected.
It is important to remember that the cost of preventing an intrusion is usually less than the cost of responding to an intrusion. Providing the IT department enough time to address security issues can reduce the risk of a security incident and reduce the cost of responding to one as well.
The proper training of the personnel tasked with addressing risks and securing the infrastructure is an important consideration. A common misconception among non-technical individuals is that “someone who is really good with computers” can address any computer-related issue. In reality, computer experts specialize in particular technology areas. Security is a highly specialized area that requires appropriate training and knowledge. Security professionals must be able both to identify vulnerabilities on a network and to recognize attack patterns against it.
Perhaps the most difficult concept for management to embrace is that security professionals are in a constant state of learning. Because new attack methodologies are created as soon as new defenses are developed, it is important to track new trends and issues. This cannot be accomplished by attending one class or conference per year. New threats emerge regularly — some would argue daily — and trying to play catch up once a year is simply not enough.
Those intent on addressing security issues and risks should be actively involved in networking with other security professionals. They should also monitor security-focused websites to track postings of new issues. Knowing the issues that affect currently installed software, as well as popular operating systems, is absolutely mandatory.
When I first started studying network security, the primary focus was on perimeter defenses, such as firewalls, access control lists, and intrusion detection systems. Monitoring incoming traffic was the critical issue. Now, with the creation of bots and rootkits, it is equally important to monitor outgoing traffic. End-users are now being targeted more frequently.
Younger employees are now much more technically sophisticated than in previous generations and are more likely to install unauthorized software or connect personal devices to the network or systems. In one organization, an employee actually used client credit card numbers to shop online during business hours.
Network risk management is a complex process requiring time, resources and skills; however, with the proper planning and research, organizations can address threats in a timely and cost-effective manner.
John Mallery is a managing consultant for BKD LLP, where he works in the Forensics and Dispute Consulting unit and specializes in computer forensics. He is also a co-author of “Hardening Network Security,” published by McGraw-Hill. He can be reached at firstname.lastname@example.org.