The Dangerous Out-Of-Scope PCI Charade
Written by Evan Schuman

Dominating many discussions over the last few weeks in payment security circles has been speculation over what the PCI Council, Visa and others will decide about declaring some types of data out-of-scope for PCI purposes. Getting much less attention is what IT execs should do with data that is declared out-of-scope and how dangerous a game out-of-scope is.
At its simplest, out-of-scope means beyond jurisdiction; it means that whatever is being discussed no longer falls under the rules and requirements of PCI. One critical problem is that the brands and the PCI Council giveth and they can taketh away. In other words, if you’ve started sharing some, for example, tokenized data with marketing because a temporary out-of-scope ruling makes you comfortable doing so, you may find it almost impossible to undo should that ruling be reversed. Put more philosophically, you won’t likely be able to get the clear-text toothpaste back into the “they’re going to fine me from here to Shanghai, aren’t they?” tube.
The safest route is to somehow identify things that are declared temporarily out-of-scope from those that are permanently out-of-scope. But nothing would likely ever be declared temporary, so that’s rather useless advice. The only wise route is to simply assume that everything declared out-of-scope could later be declared back in-scope.
Standards change, and nothing changes faster than security standards. “We all thought WEP was cool until the data security standard changed,” said Walter Conway, a QSA with 403 Labs.
What’s just about as dangerous as being cavalier with data that may be only temporarily out-of-scope is reading too much into the vague comments coming from various card brands and the PCI Council. Statements have been made hinting that some technologies may be considered out-of-scope.
“If it’s potentially out of scope, I think you’re mad to consider it out-of-scope,” Conway said. “I think you’re juggling razor blades if you treat it as out-of-scope data for PCI purposes.”
Speaking of juggling razor blades, why would IT want to treat data differently if it’s out-of-scope? Just because PCI may not—for the moment—care about it doesn’t mean that various kinds of bad guys might not care quite a bit.
Let’s say you now share those PANs with someone in marketing. And they copy them onto a thumb drive, or print them out and stick them in a bag to take home.
If out-of-scope doesn’t mean it’s OK to shed security rules, what good is it? The only practical advantage is a cost savings on PCI assessments, to the extent that the newly out-of-scope data represents a material portion of the overall data you need protected.
Tokens are a common candidate for out-of-scope treatment, but Conway argues that assuming they are out-of-scope could prove reckless. (Tokenization itself is fine; it’s the assumption that’s dangerous.) Why? Because anything that can be made unreadable can, in various ways, be made readable again. “Key management is only one way to make something that’s out-of-scope in-scope again. What if someone breaks into the secure vault and steals the key and then comes into your network?”
Another Conway scenario: “The IT staff just picked up a rumor that they’re being laid off and they call and say, ‘Give me these 100,000 tokens for testing.’ You have to ask yourself: My controls, how effective are they?”
Out-of-scope declarations are a great concept, and if they happen and can be used to lower your assessment costs, that’s wonderful. But if you start treating data as out-of-scope, you may find that out-of-scope could drive you out-of-mind.
November 12th, 2009 at 12:59 pm
The out-of-scope argument is very valid, but in reference to tokens, the premise of temporarily out-of-scope data being abruptly deemed in-scope is flawed. Conway was quoted as saying “anything that could be made unreadable can, in various ways, be made readable again.” This statement is true when talking about encryption technologies (all encryption technologies) but not so with true tokens. True tokens are in no way related to the original data other than as a reference key. The token, or reference key, can simply be a sequential number stamped when the PAN is secured, can simply be a timestamp to 16 or more digits of precision, or it can be based on a random number or any other non-PAN-related factor – the key here is that the PAN is not used to create the token. If anyone ever cracks how to “decrypt” a true token, then there will be more money to be made elsewhere, because they will have determined how to predict past and future events. I know I would crack a couple of lotteries and you would never hear from me again. ;-)
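Steve’s distinction can be sketched in a few lines of Python. This is purely an illustration, not any vendor’s implementation: a “true” token is drawn from a cryptographically secure random source, so it has no mathematical relationship to the PAN it stands in for.

```python
import secrets

def true_token(length: int = 16) -> str:
    """Generate a 'true' token: a random digit string with no
    mathematical relationship to the PAN it will represent.

    Because the token is not derived from the card number, there is
    nothing to "decrypt" -- knowing the token reveals nothing about
    the PAN.
    """
    return "".join(secrets.choice("0123456789") for _ in range(length))
```

Contrast this with format-preserving encryption, where the output *is* a function of the PAN and a key, and so can in principle be reversed.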
November 12th, 2009 at 1:48 pm
Good general point, Steve, but for the record, not all tokenization is done the same way. Many tokens are associated with lookup lists that allow them to be re-matched to the card data if it’s needed, such as for a chargeback. A token doesn’t have to be decryptable (is that a word?) for there to be a way to access the original data. At least that’s the case with some tokens.
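The lookup-list style described here can be sketched as a toy vault (names like `TokenVault` are invented for this illustration): the token itself is random, but the vault keeps a table mapping it back to the PAN, so the original data remains reachable by anyone with access to the vault.

```python
import secrets

class TokenVault:
    """Toy token vault. Tokens are random, so they can't be
    'decrypted' -- but the vault's lookup table maps each token
    back to its PAN, so the original data is still recoverable
    by anyone (or anything) with vault access."""

    def __init__(self):
        self._vault = {}  # token -> PAN

    def tokenize(self, pan: str) -> str:
        token = secrets.token_hex(8)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # e.g. for a chargeback, the PAN is looked back up here
        return self._vault[token]
```

This is exactly why the vault itself, and access to it, stays squarely in scope even if the tokens floating around the rest of the network do not.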
November 12th, 2009 at 2:48 pm
True, that someone may be storing a token-to-PAN cross reference. But that would be the bank, not the retailer. If the bank is not sure they can keep their data secure, then there are bigger problems to be addressed than bringing tokens into scope.
November 12th, 2009 at 2:54 pm
But the consumer walks into a particular retail chain, gives their payment card to someone wearing that chain’s uniform and the card is swiped. If, six months later, there’s a breach and that card was misused, it’s the retailer who will be in the spotlight. They’re the deep pocket and, therefore, the target.
If the consumer is angry and wants to cut off business, it will hit the retailer. Therefore, if the retailer is going to end up being blamed no matter what, they have to stay involved.
November 12th, 2009 at 8:27 pm
Tokenization and End to End Encryption each have their benefits and use cases in PCI DSS scope reduction.
However, having the ability to do both Tokenization and End to End Encryption (not mere point to point) can bring tremendous scope and risk reduction, along with the agility to adapt to change in this fast-moving compliance landscape. Being able to have both on tap from a single platform is a solid approach to avoiding the pitfalls Evan highlights in this article, which sheds fresh light on a hot topic: de-scoping.
In terms of the temporarily out-of-scope question: for tokens to be useful, they must be “de-identified” back to the live data at some point to permit functions like charge-backs and other in-house PAN-dependent processes. For smaller Level 4 merchants there may be less need to de-identify the token back to a PAN in house. For Level 1 and 2 merchants, however, there is a myriad of interdependent in-house systems that will continue to need the PAN, even if only temporarily. It’s also not uncommon to find large merchants with established relationships with multiple processors – one for credit, another for pre-paid, another for co-branded gift cards, and so on. So, sadly, it’s not simply a case of “removing all PAN data” or moving to tokenization services; charge-backs are a good example where the PAN may be needed, but in-store credit, fraud and velocity checking, and other Tier 1 everyday processes are often PAN-dependent and won’t go away any time soon. Indeed, many of these functions are fraud countermeasures that save merchants millions a year; they are PAN-dependent and cannot easily change.
As the article points out, one cannot be reckless in trying to avoid scope, because the PCI landscape is changing very quickly. End to End Encryption from the POS is emerging as a leading way to reduce scope for merchants, as noted in PwC’s summation of their findings – and End to End Encryption implementations like ours come with formal cryptographic security proofs: assurance that data is protected to the strong-cryptography requirements of PCI DSS and beyond.
On the other hand, Tokenization implementations vary substantially, per Evan’s comments above; as a result there has to be additional consideration: where are the formal proofs of security, as opposed to mere assertions from vendors? Where is the evidence that a given scheme is strong and proven? For Tokenization that’s a difficult challenge, since there is no standard against which to evaluate it to a level of assurance. I’ve yet to see this kind of transparency, which is a critical aspect of security best practice – security by obscurity doesn’t fly. That’s in direct contrast to Encryption and Key Management, which have 30+ years of rigor and best practice behind them.
Besides, any Tokenization system must use encryption on the back end to protect the PANs in that mapping system, and data-capture or transport encryption on the front end for any offline cases and to communicate with the token service or system – so why not simply encrypt end to end from the first moment?
Best Regards,
Mark Bower
VP Product Management
Voltage Security
November 12th, 2009 at 10:43 pm
Most references to scoping have to do with PCI compliance, not legal liability. Legal liability, for the most part, has one scope: the person with the deepest pockets is liable for everything.
I fully agree with your statement “not all tokenization is done the same way,” and that’s why I have been active lately defending “true” tokenization. With the company I represent, there is no way to look up the token to get the card number – tokens can be used by the merchant for processing transactions, chargeback defense, posting refunds or future payments, but not downloaded.
I would argue that if tokens were somehow used in a breach, it’s not a failure of the token (or of any application that used the token) but rather a failure of the tokenization solution. Basically, I’m saying that I can’t see any reason or legal basis for a “true” token to be within PCI scope. If tokens are ever deemed in-scope, then where does the line stop? I ask because it would mean that all timestamps, sequential numbers, random numbers, or any other piece of information that may or may not be used to generate a token would be within scope – all data a POS uses and stores, not just payment data.
Sorry for any typos, it’s late and I’m on my way out the door for a long weekend.