New PCI Edict: Tokens Can Be Out-Of-Scope
Written by Evan Schuman

The PCI Council on Friday (Aug. 12) will, for the first time, offer guidance on tokenization, telling retailers that most of their systems can, indeed, be considered out of PCI scope if they properly use tokens. But the Council stressed that if the token is ever reversed into card data on the retailer’s systems, everything is fully back in scope.
“There has to be recognition by the merchant that reversing that [token] and being able to again see the primary account number (PAN) and be able to use and execute against account data brings those systems back into scope,” said PCI Council Chief Technology Officer Troy Leach.
Leach added that the biggest token-related mistake he’s seen retailers make is enabling the data to be de-tokenized during chargebacks and refunds, or for loyalty tracking. “That’s probably the oversight we see most often: not recognizing back channels where the merchant still may receive that account information.”
“To be considered out of scope for PCI DSS, both the tokens and the systems they reside on would need to have no value to an attacker attempting to retrieve PAN, nor should they in any way be able to influence the security of cardholder data or the cardholder data environment,” the guidelines say.
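To make that distinction concrete, here is a minimal, hypothetical sketch of vault-based tokenization. The class and method names are invented for this illustration and are not taken from the Council’s guidance or any vendor’s product. The token is a random value with no mathematical relationship to the PAN, so on its own it is worthless to an attacker; the PAN only reappears through the vault’s de-tokenization path.

```python
# Hypothetical sketch of vault-based tokenization (illustrative only).
# The token is random, so a stolen token reveals nothing about the PAN;
# recovering the PAN requires the separately secured token-to-PAN vault.
import secrets

class TokenVault:
    """Stand-in for a token service; in practice the vault and its
    cryptographic key storage remain fully in PCI DSS scope."""

    def __init__(self):
        self._token_to_pan = {}  # the "data vault": token-to-PAN mapping

    def tokenize(self, pan: str) -> str:
        token = secrets.token_hex(16)  # random value, not derived from the PAN
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Any merchant system permitted to call this sees the PAN again,
        # which is exactly the "back channel" that pulls it back into scope.
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                    # safe to pass to downstream merchant systems
print(vault.detokenize(token))  # this path re-exposes cardholder data
```

In this sketch, the merchant systems that only ever see the token are the ones the guidance says may fall out of scope; anything able to call the de-tokenization path, whether for chargebacks, refunds or loyalty tracking, is handling the PAN again.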
The new details also reinforce that sensitive authentication data (such as magnetic stripe data or its equivalent on a chip, CAV2/CVC2/CVV2/CID values, and PINs/PIN blocks), unlike cardholder data, cannot be tokenized, because storing it would violate PCI DSS Requirement 3.2.
The Council is pushing retailers to combine token efforts with point-to-point encryption, a combination that could temporarily enable retailers to safely experiment with mobile-payment systems and still be PCI compliant.
Many of the new guidelines should not surprise veterans of tokenization. One example is the emphasis on fully securing all of the systems associated with tokenization, including the data vault, segmentation, monitoring systems, cryptographic key storage and token mapping. “PCI DSS requirements are not completely eliminated,” Leach said. “There are elements that still would need to be validated.”
Council officials also pointed to the radically different way various vendors handle tokenization and to the fact that, logically enough, the rules should be different for each. “There is really no standard for any of these solutions at this point,” said Bob Russo, the Council’s general manager. “For the token to be considered out-of-scope in PCI DSS, the token’s got to be unusable if the system it resides on is compromised.” Added Leach: “Not all algorithms are created equal.”
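As a rough illustration of why “not all algorithms are created equal,” the sketch below contrasts a token derived by reversible encryption with the random, vault-mapped token shown earlier. It is not any vendor’s actual scheme and uses the third-party cryptography package purely for brevity. If the key is reachable from the compromised system, the encryption-derived token can be turned back into the PAN, which is why such a token would not be considered out of scope.

```python
# Illustrative contrast between two ways a "token" might be generated.
# A sketch under assumed simplifications, not a vendor implementation.
import secrets
from cryptography.fernet import Fernet  # third-party package: cryptography

pan = "4111111111111111"

# Approach 1: token derived by reversible encryption. Wherever the key is
# available, the token can be decrypted back to the PAN, so a compromise of
# a system holding both token and key exposes cardholder data.
key = Fernet.generate_key()
reversible_token = Fernet(key).encrypt(pan.encode())
recovered_pan = Fernet(key).decrypt(reversible_token).decode()  # PAN recovered

# Approach 2: purely random token. On its own it is just noise; the PAN can
# only be recovered through the separately controlled token-to-PAN vault.
random_token = secrets.token_hex(16)
```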
August 12th, 2011 at 8:39 pm
We have several issues with the Tokenization Guideline as published by PCI SSC. Basically they took a simple concept that helped merchants with security and compliance, added some lard, and now the “simple concept” allows for “valuable tokens”, opening security holes and complicating compliance. Not good. On page 20, we find this little gem: “Additionally, tokens that can be used to initiate a transaction might be in scope for PCI DSS.” Might? Wasn’t the whole purpose of this document to take what “might” be true and determine what really is true? What was released today was not an industry standard, and it was not a guideline. It was an eloquently worded, poorly veiled passing of the buck from the PCI SSC to individual acquirers and QSAs.
August 15th, 2011 at 1:41 pm
Well, I don’t think I would speak as harshly about the guidelines as Steve. I think they are a good first step. However, I do have an issue with the last section, which, as it is currently written, will introduce a lot of fear, uncertainty and doubt into many merchants’ minds about how to keep systems that store only tokens out of scope. For solutions that support these types of tokens (tokens that can be used to initiate a transaction), the guidelines state that there must be additional controls in place to detect and prevent fraudulent transactions. This is where I feel the Council’s document fell short: it introduced the concept that tokens may potentially be back in scope without providing guidance on how to keep them out of scope.