DLP Should Be A Help Not A Hindrance

Data loss prevention doesn’t have to be an “all-up” approach. Sometimes, it’s best to start with the simple things, says PJ Connolly


Further stages of DLP implementation are where things can become complicated. For example, addressing DLP in email seems relatively straightforward, because of the nature of the medium. Many products that address other email security threats also offer some DLP functionality – in a fashion that Mogull refers to as “DLP Light” – and if one chooses instead to deploy a dedicated system, inserting another mail transfer agent that provides a DLP layer, it’s unlikely to be noticed by users. The downside to such a solution is that it may cover one’s external email traffic well, but leave internal traffic unprotected.
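The gap described above can be made concrete with a minimal sketch: a DLP filter attached to the outbound mail path only scans messages leaving the organisation, so internal-to-internal mail bypasses it entirely. The domain name, keyword list and function names here are hypothetical, for illustration only, and do not come from any particular product.

```python
# Hypothetical sketch: an MTA-layer DLP check that only sees outbound mail.
# INTERNAL_DOMAIN and KEYWORDS are illustrative placeholders.

INTERNAL_DOMAIN = "example.com"
KEYWORDS = ("confidential", "ssn")

def inspected(recipient: str) -> bool:
    """Only mail leaving the organisation traverses the DLP-enabled MTA."""
    return not recipient.endswith("@" + INTERNAL_DOMAIN)

def leaks(body: str) -> bool:
    """Naive keyword match standing in for a real content policy."""
    return any(k in body.lower() for k in KEYWORDS)

def deliver(recipient: str, body: str) -> str:
    if inspected(recipient) and leaks(body):
        return "quarantined"
    return "delivered"  # internal mail is never scanned at this layer

print(deliver("b@example.com", "Confidential report"))  # delivered (gap!)
print(deliver("c@partner.net", "Confidential report"))  # quarantined
```

The second call is caught, but the first – the same sensitive content sent to an internal colleague – sails through, which is exactly the blind spot the article notes.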

A similar situation can be seen with network-based DLP. DLP products will often work with the existing reverse proxy features of an Internet gateway to inspect SSL-encrypted traffic. In a recent report from Palo Alto Networks, SSL traffic – on port 443 or other ports – accounted for 20.7 percent of bandwidth at sampled organisations in the United States. The same traffic analysis showed that one or more implementations of the Tor onion router were running on 15 percent of surveyed networks worldwide. The most a DLP solution can do for such traffic is to flag it or block it altogether, without actually identifying what the traffic consists of.
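That coarse “flag or block” fallback might be sketched as follows, assuming a simple flow record carrying a destination port and a TLS-detection flag set by something upstream. The `Flow` type, the `is_tls` field and the port heuristic are all assumptions made for illustration, not features of any named product.

```python
# Illustrative sketch of the "flag or block" fallback for traffic a DLP
# system cannot decrypt. Flow and is_tls are hypothetical inputs.

from dataclasses import dataclass

@dataclass
class Flow:
    dest_port: int
    is_tls: bool  # e.g. set by a TLS-handshake sniffer upstream

def classify(flow: Flow) -> str:
    """Return a coarse action for traffic whose contents are opaque."""
    if flow.is_tls and flow.dest_port != 443:
        # TLS on a non-standard port (one Tor-like signature): block outright
        return "block"
    if flow.is_tls:
        # Ordinary HTTPS: flag it for the reverse proxy to decrypt and inspect
        return "flag-for-inspection"
    return "allow"

print(classify(Flow(dest_port=9001, is_tls=True)))  # block
print(classify(Flow(dest_port=443, is_tls=True)))   # flag-for-inspection
```

Note that the decision is made entirely from connection metadata: nothing here identifies what the traffic actually contains, which is precisely the limitation the article describes.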

In storage, Mogull and Securosis are observing less “DLP Light” but better integration, thanks in part to the ability to tap into databases and document management systems. Because of the nature of these systems, real-time DLP monitoring is often limited to filter-like techniques – categories, patterns and rules – because anything deeper can present an obstacle to achieving optimal system performance.
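Those filter-like checks – categories, patterns and rules – can be approximated with simple regular expressions, which is part of why they stay fast enough for real-time use. The two patterns below (a US Social Security number format and a 16-digit card number) are stock examples chosen for illustration, not an actual product’s rule set.

```python
# Sketch of filter-like, real-time content checks: named categories, each
# backed by a cheap regex. The patterns shown are illustrative examples.

import re

PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def scan(text: str) -> list[str]:
    """Return the categories whose patterns match the given text."""
    return [name for name, rx in PATTERNS.items() if rx.search(text)]

print(scan("Customer SSN 123-45-6789 on file"))  # ['ssn']
print(scan("Nothing sensitive here"))            # []
```

A single linear pass per pattern keeps the cost low; deeper techniques such as full document fingerprinting would add exactly the performance overhead the article warns about.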

Help or hindrance?

Whatever one chooses as part of a DLP strategy, it’s important to make sure that it offers a clean user interface and solid reporting tools. Although those may seem like obvious criteria, Mogull observed that DLP tools are sometimes so engineering-driven that their designers forget that the users of the tools – who may not work in IT at all – need a simple and efficient way to address potential problems. After all, there will be occasions when an immediate response is needed, and the interface should be a help rather than a hindrance.

An area of DLP that isn’t often discussed is what to do when data appears to have been leaked. All too often, these efforts, which are reactive by their very nature, take on the aspects of a witch hunt. Such investigations can do more damage than the actual data loss, by virtue of their effect on the morale of the organisation and its customers and partners. That’s why it’s important to keep Mogull’s point about intent in mind, or to paraphrase a common saying, “don’t assume malice when simple carelessness will suffice.”

One thing that should give DLP implementers hope, according to Mogull, is that the market is starting to mature, even as the technology remains ahead of adoption. Arguably, the hardest thing for IT and security managers to cope with today is making room in their budgets for tools that are appropriate for their organisations, whether that’s viewed from a threat perspective or from the available skill sets within the company. Of course, that’s one problem that almost never goes away. At least, until it’s too late.