Fighting Words: Why PCI’s Token Group Blew Up
Written by Evan Schuman

The PCI group working on the Council’s tokenization policy got so embroiled in infighting that the Council had to restructure it, which largely explains the year-long delay in the tokenization guidance. But this is hardly surprising, given that the group is made up mostly of employees of competing security vendors, almost all of whom were trying to skew the Council’s policy to benefit their own vendor’s approach.
The token guidance that the group and the Council staff ultimately released on August 12 was a classic compromise, in that it pleased few in the group but was a huge step forward for retail security. The purpose of such guidelines is not to get so specific that they benefit any one vendor, but to instruct QSAs and retailers on what is permissible and what isn’t.
Reaction to the document has fallen into two camps: one saying that the document is not nearly specific enough and passes the buck to QSAs, and the other saying that the guidance represents major progress and should be applauded. Unlike some political camps, both sides here seem to readily acknowledge the other’s position, with the “not nearly far enough” crowd generally agreeing that it is indeed a lot better than what was there before (which was essentially nothing), and the “it’s a big step forward” crowd agreeing that more specifics would have been helpful.
It’s that last point where corporate politics comes into play. Almost all agree that more specifics would have been better. But when it comes to deciding which specifics, well, that’s where the technical arguments devolved into impasses.
The key takeaway of the PCI guidance was that tokens—done properly—could be considered out of scope. On the concerns side, some participants were unhappy that the PCI Council opted to say that tokens that can be used directly as payment devices, which it dubbed high-value tokens, might still be in scope.
Token vendor Shift4 quickly posted on its Web site a very pointed attack on what the Council published, and it seemed particularly unhappy with the high-value token suggestions.
“On page 20, we find this little gem: ‘Additionally, tokens that can be used to initiate a transaction might be in scope for PCI DSS.’ Might? Wasn’t the whole purpose of this document to take what ‘might’ be true and determine what really is true?” Shift4 asked.
Another participant in the talks, Merchant Link Product Group Manager Sue Zloth, also published concerns about the high-value token reference, writing that she does “have an issue” with it. “As it is currently written, [it] will introduce a lot of fear, uncertainty and doubt into many merchants’ minds regarding how to keep the systems they have, which are storing only tokens, out of scope. For solutions that support these types of tokens, the guidelines state that there must be additional controls in place to detect and prevent fraudulent transactions. This is where I feel the Council’s document fell short. They introduced this concept that tokens may potentially be back in scope without providing guidance as to how to keep them out of scope.”
StorefrontBacktalk’s own PCI columnist, Walter Conway (whose day job is being a QSA), also zeroed in on the high-value token reference and the Council’s conclusion that such tokens may have to stay in scope.
“This conclusion likely comes as a surprise to many retailers and other merchants (along with their tokenization providers). For example, E-Commerce merchants who use tokens for one-click ordering and repeat purchases (leaving the underlying PANs with their processor or another third party) just learned their tokens will still be in scope for PCI,” Conway wrote in this week’s column. “I wonder if the hotel room keycard (or resort account) I use to charge meals is a high-value token because it generates a payment-card transaction? I even wonder if tokens used for exception-item-processing such as chargebacks and refunds are high-value tokens because they impact (even if it is to reverse) transactions?”
August 18th, 2011 at 4:04 pm
Tokenization of data can be accomplished in many ways, and not all security vendors approach the tokenization use case in the same way. The disagreements sometimes focused on the exclusion of one method of tokenization in favor of another, while most of the true security vendors wanted to make sure that all valid and secure use cases were allowed. The Council’s goal is not to endorse one vendor or technology over another, and while the tokenization guidance document isn’t perfect, it does provide the right amount of guidance for QSAs and acquirers to move forward with scoping decisions as they relate to PCI DSS and tokenization.
August 19th, 2011 at 8:09 pm
I agree with Marc – it’s important to recognize all the hard work that went into the creation of the document, which is a fair and reasonable representation of how a particular technology can be applied to help manage part of an overall risk problem, including the inherent risks of tokenization itself in all its forms.
There’s no silver bullet for compliance and risk reduction – and the truth is that many players had been claiming near-utopian scope reduction in the absence of actual security proofs, hard evidence or any statements from the PCI SSC – creating a widening gap between expectation and practice. Now that a very reasonable and fair document has been published, there should be no surprise that some prior claims don’t match reality.
All things considered, the working groups and task forces were cordial engagements with a common goal – and despite competitive interests the outcome is a positive and useful one that the industry needed. I say thanks to all who contributed many hours of effort, and look forward to future guidance in a similar vein.
August 22nd, 2011 at 11:31 am
The original definition of tokenization was that a token was not mathematically related to the data it was protecting — the PAN. When the definition was changed in the SIG to accommodate various vendor solutions that didn’t follow this definition, that is where the security holes were introduced. Some vendors were marketing “tokenization” even though their solutions were not using true tokens and were instead using encryption and hash techniques to generate “tokens” — which in turn changed the scoping aspects of the concept. To plug the security holes introduced by this definition change, the document simply offloads scoping concerns to the acquirers and Card Brands, which will in turn refer merchants to QSAs. In my opinion, they took a simple concept that was secure, changed the definition, and then added a “use at your own risk” clause.
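[Editor’s note: For readers unfamiliar with the distinction this comment draws, here is a minimal, purely illustrative sketch — not taken from the Council’s document or any vendor’s product — contrasting a vault-based random token, which has no mathematical relationship to the PAN, with a hash-derived “token” computed directly from the PAN. Names and keys below are hypothetical.]

```python
import hashlib
import secrets

# Illustrative sketch only; not any vendor's implementation or the Council's guidance.

# Approach 1: a "true" token in the original sense. The token is a random value
# with no mathematical relationship to the PAN; the mapping exists only in a
# protected vault, so the token by itself reveals nothing about the card number.
class TokenVault:
    def __init__(self):
        self._vault = {}  # token -> PAN, held inside the secured environment

    def tokenize(self, pan: str) -> str:
        token = secrets.token_hex(8)  # random value, unrelated to the PAN
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]     # recovery requires access to the vault


# Approach 2: a hash-derived "token." The value is computed directly from the
# PAN, so it is mathematically related to it; given the small space of valid
# PANs, such values can be attacked offline if the key is weak or exposed,
# which is why the definitional change described above altered the scoping picture.
def hash_token(pan: str, secret_key: bytes) -> str:
    return hashlib.sha256(secret_key + pan.encode()).hexdigest()


if __name__ == "__main__":
    vault = TokenVault()
    t = vault.tokenize("4111111111111111")
    print("vault token:", t, "->", vault.detokenize(t))
    print("hash-derived token:", hash_token("4111111111111111", b"demo-key"))
```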