A Basic Guide to Understanding Data Loss Prevention

By CIOReview | Friday, June 24, 2016

Prologue:

“He who beholds the data, beholds the power” is the mantra that businesses run on today. Sensitive data, whether regulated information or valuable intellectual property (IP), is the high-octane fuel of modern corporations, and its exfiltration is very likely to result in reputational loss, financial loss, or both. In bygone decades, when data existed on paper, enforcing security policies was easier. Interestingly, as data took the form of “0s” and “1s”, allowing us to extract far more value from it, digitization also made data loss and theft easier. It is now imperative for organizations to have safeguards in place that can prevent unintended data egress. Such solutions are called Data Loss Prevention (DLP) products.

Introduction to DLP

The first question a CIO may ask before adopting DLP is, “Why do I need a whole new solution when my organization already has security solutions in place?” Recent high-profile data breaches at powerhouses like Sony and JP Morgan Chase answer that question. Although security solutions such as firewalls, unified threat management, and intrusion detection/prevention systems can detect and block many threats to an organization, these breaches highlight the deficiencies of such tools when it comes to data-specific protection. This is where DLP’s dedicated data protection comes into the picture.

DLP products entered the market as tools to prevent accidental loss of sensitive data and gathered a lot of hype. However, serious challenges such as exorbitant costs and slow, complex deployments acted as potent inhibitors for the DLP space. With data-centric security concerns running high, DLP is now set to make a comeback. According to a 451 Research survey report, DLP ranks second among organizations’ planned information security projects, while data loss and data theft rank first among security challenges for the near future. The same report cited the following trends in the DLP space:

• Growing need for compliance with data privacy regulations such as HIPAA and SOX.
• An increasing need to protect valuable IP and sensitive financial data.
• Cloud computing, growing midmarket penetration, and DLP as a managed/hosted service are additional market drivers.

DLP products identify and secure sensitive data when it is stored in persistent files (data at rest), in transit within or across an organization’s network (data in motion), or being accessed by endpoint devices (data in use). In the wake of recent data breaches, the coverage of DLP has expanded from a check against insider violations to almost everything related to data exfiltration, with the added benefit of providing insight into how content is used within an enterprise.
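As a rough illustration of these three data states, the sketch below maps each one to the kind of control point a DLP deployment typically instruments. The control-point names are invented examples for clarity, not features of any particular product.

from enum import Enum

class DataState(Enum):
    """The three data states a DLP product typically covers."""
    AT_REST = "data at rest"      # persistent files on servers, databases, shares
    IN_MOTION = "data in motion"  # traffic crossing the internal or perimeter network
    IN_USE = "data in use"        # data being accessed or copied on endpoint devices

# Illustrative mapping of each state to typical control points (hypothetical names).
CONTROL_POINTS = {
    DataState.AT_REST: ["file-share scanner", "database crawler"],
    DataState.IN_MOTION: ["mail gateway", "web proxy", "network tap"],
    DataState.IN_USE: ["endpoint agent hooks on copy, print, and USB transfer"],
}

for state, points in CONTROL_POINTS.items():
    print(f"{state.value}: {', '.join(points)}")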

As stated in the white paper “Understanding and Selecting a Data Loss Prevention Solution” cited by the SANS Institute, DLP tools are available in the market either as a DLP feature or as a dedicated DLP solution. The difference is that DLP features provide the detection and enforcement capabilities of DLP solutions but lack the dedicated content- and data-protection functions such as centralized management, policy creation, and enforcement workflow.

The Approach of DLP products

Contextual analysis of content is important for determining which restrictions should be imposed. The defining characteristic of DLP solutions, however, is their ability to analyze the content of data in addition to their contextual awareness. While content awareness gives DLP the ability to inspect deep content using a variety of techniques, context analysis covers attributes such as source, destination, size, recipients, sender, header information, metadata, time, format, and anything else short of the content itself.
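To make the distinction concrete, here is a minimal sketch of a context-only rule: it looks at sender, destination, size, and format without ever opening the content. All domain names, thresholds, and field names are invented for illustration.

from dataclasses import dataclass

@dataclass
class MessageContext:
    """Metadata available without opening the content itself."""
    sender: str
    recipients: list
    destination_domain: str
    size_bytes: int
    attachment_format: str

def violates_context_rules(ctx: MessageContext) -> bool:
    """Toy context-only rule: flag large spreadsheet attachments
    leaving the (hypothetical) corporate domain."""
    external = ctx.destination_domain != "example-corp.com"
    large = ctx.size_bytes > 5 * 1024 * 1024
    spreadsheet = ctx.attachment_format in ("xlsx", "csv")
    return external and large and spreadsheet

msg = MessageContext(
    sender="analyst@example-corp.com",
    recipients=["partner@outside.org"],
    destination_domain="outside.org",
    size_bytes=12 * 1024 * 1024,
    attachment_format="csv",
)
print(violates_context_rules(msg))  # True: large CSV going to an external domain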

The previously mentioned SANS Institute white paper presents a clear picture of the mechanism behind content analysis. DLP solutions use file cracking to unpack a data package and read the information within, after which analysis techniques are applied to identify any policy violations. The major analysis techniques include rule-based/regular expressions, database fingerprinting, exact file matching, partial document matching, statistical analysis, conceptual/lexicon analysis, and categories.
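A minimal sketch of two of these techniques, rule-based regular expressions and exact file matching via hashing, might look like the following. The patterns and the hash registry are invented for illustration; real products use far more robust detection on the cracked content.

import hashlib
import re

# Rule-based / regular-expression analysis: toy patterns for illustration only.
PATTERNS = {
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def regex_findings(text: str) -> list:
    """Return the names of patterns that match the cracked content."""
    return [name for name, pattern in PATTERNS.items() if pattern.search(text)]

# Exact file matching: compare a hash of the file against a registry of
# known-sensitive documents (the registry here is a stand-in, not a real corpus).
SENSITIVE_HASHES = {
    hashlib.sha256(b"confidential product roadmap").hexdigest(),
}

def is_registered_file(content: bytes) -> bool:
    return hashlib.sha256(content).hexdigest() in SENSITIVE_HASHES

sample = "Employee record: SSN 123-45-6789, card 4111 1111 1111 1111"
print(regex_findings(sample))                               # ['us_ssn', 'credit_card']
print(is_registered_file(b"confidential product roadmap"))  # True
print(is_registered_file(b"weekly cafeteria menu"))         # False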

Network DLP products vs Endpoint DLP products

DLP products currently available in the market are either network-centric (nDLP) or endpoint-centric (eDLP). nDLP solutions, also called “data in motion” protection, reside within an organization’s network and monitor data during its transit. When an nDLP product detects a policy violation through its analysis techniques, it automatically takes a predefined action such as blocking, notifying, encrypting, or quarantining. Because it is embedded in the network, its integration and maintenance require less overhead than eDLP. However, it fails to enforce policies once an endpoint device leaves the corporate VPN.
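The enforcement step can be pictured as a simple mapping from a detected violation to a predefined action. The sketch below is a hypothetical illustration of that dispatch, with invented violation names; it is not how any particular product implements enforcement.

from enum import Enum, auto

class Action(Enum):
    BLOCK = auto()
    NOTIFY = auto()
    ENCRYPT = auto()
    QUARANTINE = auto()

# Hypothetical policy table: which action an in-line network DLP component
# takes for a given violation type.
POLICY_ACTIONS = {
    "pii_outbound_email": Action.BLOCK,
    "financials_to_cloud_storage": Action.ENCRYPT,
    "large_source_code_upload": Action.QUARANTINE,
}

def enforce(violation: str) -> Action:
    """Fall back to notifying a security analyst when no explicit rule exists."""
    action = POLICY_ACTIONS.get(violation, Action.NOTIFY)
    print(f"{violation}: {action.name}")
    return action

enforce("pii_outbound_email")     # BLOCK
enforce("unclassified_transfer")  # NOTIFY (default)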

On the other hand, eDLP solutions reside on the endpoint devices themselves and thus offer greater control. The endpoint agent stops the exfiltration of data from the device whenever the analysis mechanism detects a policy violation. The greatest challenge with eDLP is that, since it resides on endpoint devices, installing and maintaining it across all devices requires considerable overhead.
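Conceptually, the agent applies the same checks locally at the moment data tries to leave the device, which is why policies hold even when the laptop is off the corporate network. The sketch below is a hypothetical decision function reusing the toy SSN rule from earlier; real agents hook into the operating system and do far more than this.

import re

SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # toy rule for illustration

def allow_copy_to_removable_media(file_bytes: bytes) -> bool:
    """Hypothetical endpoint hook: inspect a file locally before it leaves
    the device. Only the decision is shown, not the OS-level interception."""
    text = file_bytes.decode("utf-8", errors="ignore")
    return not SSN_PATTERN.search(text)

print(allow_copy_to_removable_media(b"quarterly slides, no identifiers"))  # True
print(allow_copy_to_removable_media(b"payroll extract: 123-45-6789"))      # False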

To decide which DLP camp to opt for, CIOs need to balance the desired control over their data and the thoroughness of data inspection against the time, effort, and monetary investment. On control over data, eDLP wins over nDLP, but with much higher implementation and maintenance overhead. With greater thoroughness and the ability to provide deeper insights, eDLP looks the better option, but nDLP, despite its lower outlay, can be equally effective when supported by adequate data protection policies.

Selecting a DLP Product

Define your needs: After a CIO recognizes the need for a DLP solution, and well before hunting for vendors, the decision makers should define the data they want to protect as specifically as possible. Typically, an organization’s critical data falls into one of these categories: personally identifiable information (social security numbers, contact details), corporate financial data, and IP. Of these, IP poses the greatest challenge because of its less structured nature. Decision makers should be aware that DLP products offer monitoring and/or prevention capabilities, and vendors often use fancy names that obscure which of the two a product actually provides. If prevention is the goal, note that enforcing prevention requires additional hardware and software that increase the financial burden. An organization should question the vendor closely about such additional requirements, because some of these technologies might already be available in the organization’s environment.
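One way to make “define your needs” concrete before talking to vendors is to write the requirement down as a reviewable inventory of data categories and the capability (monitor versus prevent) expected for each. The sketch below is purely illustrative; the category names, examples, and settings are placeholders rather than vendor terminology.

# Hypothetical protection-requirements inventory, defined before vendor selection.
PROTECTION_REQUIREMENTS = {
    "personally_identifiable_information": {
        "examples": ["social security numbers", "contact details"],
        "capability": "prevent",   # block egress, not just report it
        "structured": True,
    },
    "corporate_financial_data": {
        "examples": ["quarterly results before disclosure"],
        "capability": "prevent",
        "structured": True,
    },
    "intellectual_property": {
        "examples": ["design documents", "source code"],
        "capability": "monitor",   # harder to fingerprint; start with visibility
        "structured": False,       # the least structured, hence hardest, category
    },
}

for category, spec in PROTECTION_REQUIREMENTS.items():
    print(f"{category}: {spec['capability']} ({', '.join(spec['examples'])})")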

Maintenance overhead: nDLP offers the flexibility of centralized management and hence lower maintenance overhead. But this comes at the expense of more limited control when compared to eDLP.

Don’t forget about integration: DLP solutions are vendor supplied and therefore require integration with an organization’s existing environment, and not all vendors have the tools to address this. Even after selecting a great DLP product, organizations often discover new issues once the integration process is underway, a point from which turning back is neither healthy nor feasible. Since the DLP product will scan for sensitive data across different operating platforms, those platforms and their compatibility with the DLP product should be studied beforehand.

The vendor is as important as the product: A vendor with ample market presence can be half the solution an organization is looking for. A vendor who has been in the market for some time has most probably experienced and dealt with the problems that arise when incorporating DLP into a new environment, such as implementation issues. Such experience can save considerable effort and resources. Chances are, vendors with sufficient market presence have already served other organizations in the same industry as the one approaching them, which can prove beneficial during policy creation, the core of this technology.

Additional staffing: DLP is still at an adolescent stage, and it is difficult to predict how much additional work it will create and how much dedicated staff will be needed to handle it. Organizations that cannot afford dedicated in-house staff may need to outsource the additional work, in which case the total cost of ownership can easily inflate beyond what was foreseen at the initial stages.

Internal testing: Before the DLP product becomes part of the organization, this is the last chance to detect and iron out problems in the selection process. There is no universal checklist of aspects to test. In-house testing should be as thorough as possible, and the product should be exposed to the exact environment and processes it will face once implemented.
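One simple way to structure such testing is to replay sample documents drawn from the organization’s real workflows through the detection path and check that known-sensitive samples are flagged while benign ones pass. The harness below is a hypothetical sketch; the toy detector and test samples are invented for illustration.

import re

SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # toy detector for illustration only

def toy_detector(text: str) -> bool:
    return bool(SSN.search(text))

TEST_CASES = [
    ("hr_record.txt", "SSN 123-45-6789 on file", True),    # should be flagged
    ("lunch_menu.txt", "Tuesday: soup and salad", False),  # should pass clean
]

def run_acceptance_tests(detector) -> bool:
    all_passed = True
    for name, text, expect_flagged in TEST_CASES:
        flagged = detector(text)
        status = "PASS" if flagged == expect_flagged else "FAIL"
        all_passed = all_passed and status == "PASS"
        print(f"{status} {name}: flagged={flagged}, expected={expect_flagged}")
    return all_passed

print("all tests passed:", run_acceptance_tests(toy_detector))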

DLP products can prove beneficial for organizations that understand the technology well enough to take full advantage of it. However, DLP is still some distance away from being the ultimate defense against data breaches, especially those executed by attackers who understand the technology. All considerations regarding the business processes and units that should fall under the DLP radar must be worked out before embarking on the journey. Post-implementation is a bad time to discover that a particular unit or process handling sensitive data was missed or lies beyond DLP’s reach.