Meeting Compliance - “Old Tools? Tread Carefully or Revamp to a Higher Tier Security”

 Originally published on January 04, 2018 by Matt Conran
Last updated on March 03, 2022 • 17 minute read

The security space is filled with solutions that focus firmly on the detection side of the cybersecurity continuum. Almost the entire industry is talking about cutting-edge detection and automated DDoS protection. Yet when you examine the security landscape, it is not only the known bad things on your network that cause damage; it is the things you are not yet aware of.

Without complete end-to-end visibility of the network, you will never fully understand what is happening under the hood, which makes it increasingly hard to resist a cyber threat. If you can't see it, you can only "think" you are safe. This opens the door to an important requirement: complete visibility. Complete visibility, together with tools such as Scrutinizer, is the key requirement when identifying and classifying security threats.

The operator is the one who identifies and responds to an attack, so vigilance must be embedded into the culture of their work. He or she is the best anomaly detector. Aided by the right visibility tool, a good operator can figure out the rest of the puzzle. You cannot wear ordinary night-vision goggles in space; you need optics designed for that environment because they impart the enhanced vision the job demands. Let's look more closely at the criticality of the operator's role and how the operator can clip the wings of a threat.

Value Of The Operator

If you want to keep the network and its data safe, a complete understanding of what is normal and what is abnormal is necessary. That requires establishing a baseline of normal network behaviour. This is where the unsung hero, the operator, is sought after. You cannot fully understand what is happening without the operator's input.

Only the operator can recognize the macro patterns that should trigger the initial investigation. This is what is known as "situational awareness" of what is happening in and around the network. A machine alone cannot do this; an artificial brain has its limitations.

An automated detection system could flag a long-lived Secure Sockets Layer (SSL) connection to China as an anomaly. The long-lived session could simply be a remote student travelling and performing normal duties at the same time.

False positives cause threat-alert fatigue. We live in a world of false alerts, which drains the effectiveness of even the most advanced security tools. The operator, if equipped with the right visibility tool, can accurately determine what is normal, what is abnormal and what needs further investigation.

Cycle of the Operator - Triage Investigation

The right visibility tool not only gives the operator situational awareness, the ability to tell right from wrong and the means to eradicate false positives, but it also bestows the capacity to perform triage investigation, an area where the majority of security tools fall short. Triage investigation is the microscopic lens that helps identify micro threats that can snowball rapidly.

A sophisticated virus detector may detect a virus, but it doesn't prescribe the next sequence of fortifying steps. A virus detector will not tell you who else has the virus, or how and where the malware is moving through the network.

A tool that detects an anomaly but doesn't give the operator the ability to search, hunt and gather information to define a mitigation plan is of little value as a security device. The better option is a visibility tool that maps the complete compromise and reaches the bad actor across the network.

Every avenue must be read, along with the ability to quickly block all roads at once, bringing the operator into the loop for the full cycle. A valid approach to security centres on the cycle of the operator acting as the central figure who hunts and closes down the adversary. There is no point panicking like a fish in troubled waters; it is the operator who can pour oil on those waters and make them serene.

Case Study - Universities & Compliance

Universities are target-rich environments, and many of them, as a matter of policy, give students direct, unfiltered access to the Internet, yet they still need to meet certain compliance standards.

However, the average 'survival time' of unprotected computers connected to the Internet is very low. Such a dynamic environment, where students come and go with multiple IP-connected devices, is therefore a challenge.

Some incoming SSL and Remote Desktop Protocol (RDP) attempts are entirely valid; others may not be. With so much traffic movement and so many students and device types on a university network, false positives run high. Universities are therefore easy pickings for attackers. A university network also contains highly sensitive data (Personally Identifiable Information) that poses a serious risk if compromised. This drives the demand for security tooling that can handle the challenge of multiple IP-connected devices and safeguard the network.

Data Exfiltration

One of the key areas for any compliance effort is data exfiltration: being proactive about finding ongoing streams of data leaving protected systems for destinations outside the network. Leaked data is often used for blackmail and, more commonly, identity fraud.

A lot of compliance-related issues are focused on basic commonalities, data exfiltration in particular. Data exfiltration violates many standards and rules.

Visibility is fundamental to compliance of all kinds, because exfiltration is a violation of privacy, data protection and confidentiality. If data is being exfiltrated, you are not compliant, and unfortunately the game is over. Attackers are waiting in the wings to pirate the tricks of your trade.

NetFlow & Data Exfiltration

NetFlow is paramount as a data-leak detection and prevention tool. It tells the operator who initiated a data transfer, which IP addresses were involved in the transaction and how much data was transferred, to name a few of its fields.
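
To make those fields concrete, here is a minimal Python sketch of what a simplified flow record could look like. The field names and values are illustrative assumptions, not the exact schema of Scrutinizer or any particular exporter.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class FlowRecord:
    """A simplified, illustrative view of the fields a flow record exposes."""
    src_ip: str          # who initiated the transfer
    dst_ip: str          # where the data went
    src_port: int
    dst_port: int
    protocol: int        # e.g. 6 = TCP, 17 = UDP, 1 = ICMP
    bytes_sent: int      # volume of data carried by the flow
    start_time: datetime
    end_time: datetime

# Example record: an internal host pushing roughly 2 GB to an external address
flow = FlowRecord(
    src_ip="10.1.20.15", dst_ip="203.0.113.77",
    src_port=49512, dst_port=443, protocol=6,
    bytes_sent=2_147_483_648,
    start_time=datetime(2018, 1, 4, 2, 15),
    end_time=datetime(2018, 1, 4, 2, 58),
)
print(f"{flow.src_ip} -> {flow.dst_ip}: {flow.bytes_sent:,} bytes")
```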


NetFlow analysis informs the operator if critical information is being sent out of the network or laterally within it. It pinpoints exactly where the trail runs, something not easily noticeable with other mechanisms.

NetFlow can detect traffic leaving at unusual times, traffic touching sensitive systems that usually don't have such transactions, or even transactions to sites with no content or to illegal file-sharing websites. All of this data can be correlated by the operator to indicate a compromise.
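
As a rough illustration of that kind of correlation, the Python sketch below flags outbound flows from labeled sensitive hosts that are unusually large or occur outside business hours. The record layout, thresholds and address ranges are all assumptions made for the example; a real deployment would feed this from its NetFlow collector and its own baselines.

```python
from datetime import datetime

# Hypothetical flow records as plain dictionaries; in practice these would
# come from a NetFlow collector's export or query API.
flows = [
    {"src": "10.1.20.15", "dst": "203.0.113.77", "bytes": 2_500_000_000,
     "start": datetime(2018, 1, 4, 2, 15)},
    {"src": "10.1.30.9", "dst": "10.1.40.2", "bytes": 4_000,
     "start": datetime(2018, 1, 4, 11, 5)},
]

SENSITIVE_HOSTS = {"10.1.20.15"}   # systems labeled as holding PII
VOLUME_THRESHOLD = 500_000_000     # illustrative: ~500 MB outbound is unusual here
BUSINESS_HOURS = range(8, 18)      # 08:00-17:59 local time

def is_internal(ip: str) -> bool:
    """Crude private-address check; a real deployment would use its own address plan."""
    return ip.startswith("10.") or ip.startswith("192.168.")

def suspicious(flow: dict) -> bool:
    leaving_network = is_internal(flow["src"]) and not is_internal(flow["dst"])
    from_sensitive = flow["src"] in SENSITIVE_HOSTS
    off_hours = flow["start"].hour not in BUSINESS_HOURS
    big_transfer = flow["bytes"] > VOLUME_THRESHOLD
    return leaving_network and from_sensitive and (off_hours or big_transfer)

for f in flows:
    if suspicious(f):
        print(f"Investigate: {f['src']} -> {f['dst']} "
              f"({f['bytes']:,} bytes starting {f['start']})")
```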

Data Leak Example

A bad actor can send malicious traffic to compromise an unsecured host in a particular segment. From there the adversary creates additional footholds through skill and determination, eventually seeking out vulnerabilities on other systems where Personally Identifiable Information (PII) is stored.

The initial host can then be used to extract the information back out of the organization's network, or to move it laterally throughout the network. Different techniques can be used to exfiltrate the data, such as Domain Name System (DNS) tunneling, Internet Control Message Protocol (ICMP) or even social media accounts.

For example, a bad actor can encode the information into DNS packets to extract the data. Sensitive patient information can be bundled into DNS packets.

Protocols such as DNS and ICMP, and social media accounts like Twitter, are not viewed as data-transfer mechanisms and often go unmonitored by on-site security tools such as an Intrusion Prevention System (IPS) or firewall.
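
A hedged sketch of how such a blind spot could be narrowed: the snippet below scans hypothetical DNS query logs for tell-tale signs of tunneling, namely unusually long labels (typical of encoded payloads) and abnormally high query counts per host. The domain names, thresholds and log format are invented for illustration only.

```python
from collections import defaultdict

# Hypothetical DNS query log: (client IP, queried name).
dns_queries = [
    ("10.1.20.15", "aGVsbG8tdGhpcy1pcy1lbmNvZGVkLXBhdGllbnQtZGF0YQ.t1.exfil-example.net"),
    ("10.1.20.15", "c2Vjb25kLWNodW5rLW9mLWVuY29kZWQtZGF0YS1oZXJl.t2.exfil-example.net"),
    ("10.1.30.9", "www.example.edu"),
]

MAX_LABEL_LEN = 30      # very long labels often carry base64-style encoded payloads
QUERY_RATE_LIMIT = 50   # queries per host per window; tune against each host's baseline

stats = defaultdict(lambda: {"long_labels": 0, "total": 0})
for client, name in dns_queries:
    stats[client]["total"] += 1
    if max(len(label) for label in name.split(".")) > MAX_LABEL_LEN:
        stats[client]["long_labels"] += 1

for host, counts in stats.items():
    if counts["long_labels"] > 0 or counts["total"] > QUERY_RATE_LIMIT:
        print(f"Possible DNS tunneling from {host}: "
              f"{counts['long_labels']} long-label queries out of {counts['total']}")
```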

NetFlow Capabilities

NetFlow weeds out false positives and sharpens the level of certainty about whether a given connection is an anomaly. It empowers operators to drill down to the IP address and port, where in the network the sessions are coming from, how big the sessions are and other protocol properties.

It acts as the Google Maps of the network, showing the operator the spots of heavy traffic or the areas where there has been a potential accident, indicating a security breach. It equips the operator with the vision to track down any budding attack or a threat that is likely to transpire.

The filtering interface gives the ability to navigate in time, to narrow or broaden the view, or to go further into the past. This enables operators to rewind the network and analyze past events, detecting low-and-slow attacks that often go unnoticed for months.
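
The sort of "rewind" described here can be pictured with a small Python sketch that filters an archive of flow records to a past time window and totals bytes per source to surface low-and-slow outbound talkers. The data layout and time span are assumptions; a collector such as Scrutinizer would normally expose this through its own query and filtering interface.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical archive of flow records kept by a collector.
flows = [
    {"src": "10.1.20.15", "dst": "203.0.113.77", "bytes": 900_000_000,
     "start": datetime(2018, 1, 2, 3, 10)},
    {"src": "10.1.30.9", "dst": "198.51.100.4", "bytes": 12_000,
     "start": datetime(2018, 1, 3, 14, 0)},
]

def rewind(records, end, window):
    """Return only the flows that started inside a past time window."""
    start = end - window
    return [r for r in records if start <= r["start"] <= end]

# 'Rewind the network': look at the seven days leading up to a suspected incident
in_window = rewind(flows, end=datetime(2018, 1, 4), window=timedelta(days=7))

# Aggregate bytes per source to surface low-and-slow outbound talkers
by_source = Counter()
for r in in_window:
    by_source[r["src"]] += r["bytes"]

for src, total in by_source.most_common(5):
    print(f"{src} sent {total:,} bytes in the window")
```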

This type of accurate analysis is needed for compliance with various rules. Apart from signaling the attack, NetFlow offers a retrospective analysis of what took place on the network and how files were transferred through it. You do not have to shoot in the dark or depend on assumptions; you have a blueprint of all the chronological events in front of you.


NetFlow is a protocol for collecting, aggregating and recording traffic flow data in a network. NetFlow data provides a more granular view of how bandwidth and network traffic are being used than other monitoring solutions, such as SNMP. NetFlow was developed by Cisco, is embedded in Cisco's IOS software on the company's routers and switches, and has been supported on almost all Cisco devices since the 11.1 train of Cisco IOS Software.

An IP address (Internet Protocol address) is a numerical representation that uniquely identifies a specific interface on the network. IP addresses are binary numbers but are typically expressed in decimal form (IPv4) or hexadecimal form (IPv6) to make reading and using them easier for humans.

Everything touches the network, and every conversation touches the infrastructure equipment. NetFlow is the perfect tool to gather, cache and export information for threat analysis, and an essential aid for large organizations facing compliance requirements.

NetFlow Traffic Groups

NetFlow gets all the ducks in a row by organizing traffic into traffic groups. Traffic groups make up a logical view of the network, labeling and grouping sensitive systems so they can be examined against compliance rules.

It gives the utmost significance to the exfiltration of data, looking for large streams of data moving from inside to outside, specifically those originating from labeled sensitive systems. These systems are well characterized as to who connects to them and to which services.

The key for operator analysis is that these systems should not have data leaving the network. NetFlow profiles them against a baseline and sends alerts when there is an unusual request, for example, bilateral external communication. Allied with the operator's knowledge of what is and is not sensitive, this proficiency makes early detection possible.
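
To illustrate the idea of traffic groups and baselines, here is a minimal Python sketch that labels address space into groups, marks some groups as sensitive, and alerts when a sensitive group sends more data out of the known network than its baseline allows. The group names, prefixes and baseline figures are assumptions for the example, not a vendor's configuration.

```python
import ipaddress

# Illustrative traffic groups: logical labels laid over address space.
TRAFFIC_GROUPS = {
    "patient-records": ipaddress.ip_network("10.1.20.0/24"),
    "student-lan": ipaddress.ip_network("10.1.30.0/24"),
}
SENSITIVE_GROUPS = {"patient-records"}

# Baseline outbound bytes per group per hour; hard-coded here for the sketch.
BASELINE_OUT_BYTES = {"patient-records": 5_000_000, "student-lan": 2_000_000_000}

def group_of(ip: str):
    """Map an IP address to its traffic group, or None if it is outside the known network."""
    addr = ipaddress.ip_address(ip)
    for name, network in TRAFFIC_GROUPS.items():
        if addr in network:
            return name
    return None

def check_flow(src: str, dst: str, nbytes: int) -> None:
    """Alert when a sensitive group sends noticeably more data out than its baseline."""
    src_group, dst_group = group_of(src), group_of(dst)
    if src_group in SENSITIVE_GROUPS and dst_group is None:   # data leaving the known network
        if nbytes > BASELINE_OUT_BYTES[src_group]:
            print(f"ALERT: {src} ({src_group}) sent {nbytes:,} bytes to external host {dst}")

check_flow("10.1.20.15", "203.0.113.77", 750_000_000)
```

In practice the baseline would be learned from historical flow data rather than hard-coded, and the alert would feed the operator's triage workflow rather than a print statement.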

Packet Capture And NetFlow

Compliance regulations are grounded in data-privacy law. Full packet captures have significant privacy implications, and an organization can wind up getting blocked on a matter of principle. NetFlow-based data sources do not carry Patient Identity Management (PIM) information.

NetFlow simply does not have this payload information. That is what allows NetFlow to scale and adds a layer of privacy that makes it more acceptable to use, even though it introduces some complexity. As organizations grow, information sharing grows along with them, and every digital relationship presents a new set of vulnerabilities. It is therefore advisable to revamp the security system; if you use old tools for compliance, you need to tread carefully.

Summary

Indicators such as large data movements and strange connection patterns could point to exfiltration. These need to be investigated, and someone has to answer for them if you are liable under compliance rules.

NetFlow not only covers a substantial range of such indicators but, unlike other security tools, never elbows the operator out of the loop. Most security tools just detect and walk away, but that's only half the story. NetFlow not only unfolds the complete path of the attack but also supports remedial action.

When compliance rules are breached during an incident, everyone is rattled, and work is a huge part of people's lives. Compliance laws make you responsible; you cannot hide behind the story that "my tool did not detect it, therefore I'm not liable" when medical records go missing.

"Mean time to innocence" has no place to stand when compliance standards are not met. Conclusively, if you use old tools, you need to tread carefully.