Network forensics is the capture, storage, and analysis of network traffic. You might also hear it called “packet mining,” “packet forensics,” or “digital forensics.” Whatever the name, the concept is the same: record every packet moving across the network, along with the data it contains, and store it for some period of time.
Simply put, this means having a network recorder that lets you see all emails, database queries, Web browsing activity, etc. In short, all the information traversing your corporate network is recorded. With access to a single repository that can be examined after the fact, any matter that requires retrospective attention can be addressed.
The typical use case for network forensics is security-related: capturing an attack’s fingerprint and performing post-attack analysis of security exploits. The recent distributed denial-of-service (DDoS) attacks on Bitbucket.org — a Web-based code hosting service that relies on Amazon’s Elastic Compute Cloud (EC2) — and this summer’s DDoS attacks on Facebook and Twitter are recent headline examples. Using network forensics, you can analyze how the attack occurred, who was involved, how long the exploit lasted, and the method used for the attack.
Back in Time
Using network forensics can be like using a network time machine, allowing you to go back to a particular point in time and reconstruct the sequence of events that occurred at the time of a breach, so you can see the complete picture. We have all heard about security attacks; they get the most attention because of the implications of data loss. These losses can manifest not only as a loss of intellectual property, but as a loss of customer trust. Either way, it equates to lost revenue for your company.
Various types of security forensic investigations can be carried out:
- Regulatory compliance / HR investigations to detect and analyze violations of HR policies or industry regulations and support compliance efforts for SOX, Gramm-Leach-Bliley, HIPAA, PCI, etc.
- Transactional analysis to provide the “ultimate audit trail” for any transaction, where server logs and other server-based evidence don’t provide a thorough picture of the transaction
- Security attack analysis to enable security officers and IT staff to characterize and mitigate an attack that slipped past network defenses (e.g., a zero-day attack)
Beyond security, there are other day-to-day use cases for network forensics. Access to recorded network traffic lets IT staff take a proactive approach to managing the network computing environment. Now that they can turn back the clock, they can analyze anything associated with the network, such as improving network performance, tuning intrusion detection solutions, and identifying rogue devices accessing the network. In addition, network forensics can be used to monitor user activity, analyze business transactions, and pinpoint the source of intermittent performance issues.
Some day-to-day examples:
- Network troubleshooting to handle any type of network problem, especially those that happen intermittently
- Network performance benchmarking to provide detailed reporting on network performance, bottlenecks, activities, etc.
- Optimization of the network, application, or server environment, looking at transaction times, network times, traffic to key servers, and how load balancing is really working
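As a small illustration of that last point, once packets are captured with timestamps, transaction times fall out of simple arithmetic on the capture records. The sketch below is hypothetical (the `(request_ts, response_ts)` pairs are assumed to come from whatever capture tool you use; this is not any product’s API):

```python
def transaction_times(pairs):
    """Summarize transaction times from captured packet timestamps.

    pairs: list of (request_ts, response_ts) capture timestamps in seconds,
    one pair per observed transaction.
    Returns (average, worst) transaction time in seconds.
    """
    deltas = [resp - req for req, resp in pairs]
    return sum(deltas) / len(deltas), max(deltas)
```

Feeding this the request/response pairs for one server over a day quickly shows whether an “intermittent” slowdown is real and how bad the worst case is.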
Different Approaches
Simson Garfinkel, author of several books on security, describes two approaches to network forensics systems:
- “Catch-it-as-you-can” systems, in which all packets passing through a certain traffic point are captured and written to storage, with analysis being done subsequently in batch mode.
- Pro: Comprehensive visibility into all network traffic looking backward (within a reasonable time period)
- Con: This approach requires large amounts of storage, usually involving a RAID system.
- “Stop, look and listen” systems, in which each packet is analyzed in a rudimentary way in memory and only certain information is saved for future analysis.
- Pro: This approach requires less storage than a “Catch-it-as-you-can” system.
- Con: May require a faster processor to keep up with incoming traffic.
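To make the “stop, look and listen” model concrete, here is a minimal Python sketch. The flow-accounting logic is hypothetical, plain Python; the commented-out live-capture lines assume the third-party scapy library and root privileges. Only per-flow counters are kept in memory, and each packet is discarded after inspection, which is exactly the storage/CPU trade-off described above:

```python
from collections import defaultdict

def new_flow_table():
    # flow key -> (packet count, total bytes)
    return defaultdict(lambda: (0, 0))

def record_packet(flows, src, dst, proto, length):
    """Update in-memory per-flow counters; the packet itself is not stored."""
    key = (src, dst, proto)
    pkts, total = flows[key]
    flows[key] = (pkts + 1, total + length)

# Live capture (requires root and scapy; store=False discards each packet
# after the callback runs, which is the point of this approach):
#
# from scapy.all import sniff, IP
# flows = new_flow_table()
# sniff(store=False, prn=lambda p: record_packet(
#     flows, p[IP].src, p[IP].dst, p[IP].proto, len(p)) if IP in p else None)
```

A “catch-it-as-you-can” system would instead write every packet to a rolling capture store and run this same kind of analysis later, in batch.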
Since a key component of network forensics is providing visibility into what is not easily seen, make sure the solution you choose gives you a real-time view into every part of the network, including Ethernet, Gigabit, 10GbE, 802.11a/b/g/n wireless, VoIP, and WAN links to remote offices.
As more and more networks are upgraded to 10Gig, the need for network forensics becomes imperative. Real-time, deep packet inspection on 10Gig links is not currently, and may never be, an option. With real-time analysis out of the picture, network forensics will become the primary means of replaying and troubleshooting network issues, whether performance-, compliance- or security-oriented. It will also become the primary means of collecting data for establishing network baselines and analyzing complex issues like poor application design.
Even with real-time analysis removed as an expectation, the demands of 10Gig will certainly challenge existing network forensics appliances. Only appliances built on the very latest technologies, including CPUs, I/O bus technology, and high-speed disk storage, and running software updated to take advantage of these advances, will meet network forensics needs at 10Gig.
Network forensics can be a powerful tool for unlocking mysteries found within your network — provided you’re capturing the digital evidence now, before any specific event actually happens. Make sure you have a network forensics tool best suited to your organization’s particular needs.
Joe Habib is director of global services at Wildpackets.
Joe,
Good article, and I agree with all of your points. Just one comment occurred to me. You say "Real-time, deep packet inspection on 10Gig links is not currently, and may never be, an option".
There is a way actually. In systems we build for our customers, we use capture cards from Napatech which can load-balance a single 10Gig stream (on a flow basis, for example) across up to 32 CPU cores. The task of real-time DPI or other analysis at each of those cores is therefore scaled down substantially to under 1Gig, which is well within the scope of even software-based tools. Combining a sophisticated adapter such as the Napatech unit with the ever more powerful and lower cost multi-core servers available now means that real-time, line-rate 10Gig analysis is in fact both achievable and affordable right now.
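Roughly, the per-flow distribution works like this hypothetical sketch (the real Napatech adapters do this in hardware; the hash and worker count here are illustrative, not their actual algorithm). Sorting the endpoints makes the hash symmetric, so both directions of a connection land on the same core:

```python
import zlib

def worker_for_flow(src_ip, dst_ip, src_port, dst_port, proto, n_workers=32):
    """Pick a worker core for a packet based on its flow 5-tuple.

    Sorting the two endpoints before hashing makes the choice symmetric:
    request and reply packets of the same connection map to the same core,
    so each core sees complete flows it can analyze independently.
    """
    a, b = sorted([(src_ip, src_port), (dst_ip, dst_port)])
    key = f"{a[0]}:{a[1]}|{b[0]}:{b[1]}|{proto}".encode()
    return zlib.crc32(key) % n_workers
```

Each core then runs ordinary software DPI on its sub-1Gig share of the traffic.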
Similar divide-and-conquer solution approaches are offered by other vendors in the market today also, as I’m sure you are aware.
Thanks for the article.
Peter