
Fighting Words: Why PCI’s Token Group Blew Up

Written by Evan Schuman
August 17th, 2011

The PCI group working on the Council’s tokenization policy got so embroiled in infighting that the Council had to restructure it, which largely explains the year-long delay in the tokenization guidance. But this is hardly surprising, given that the group consists mostly of employees of competing security vendors, almost all of them trying to skew the Council’s policy to favor their own employer’s approach.

The token guidance that the group and the Council staff ultimately released on August 12 was a classic compromise: it pleased few in the group, but it was a huge step forward for retail security. The purpose of such guidelines is not to get so specific that they benefit any one vendor, but to tell QSAs and retailers what is permissible and what isn’t.

Reaction to the document has fallen into two camps: one saying that the document was not nearly specific enough and passes the buck to QSAs, the other saying that this guidance represents major progress and should be applauded. Unlike some political camps, both sides here seem to readily acknowledge the other’s position, with the “not nearly far enough” crowd generally agreeing that it is indeed a lot better than what was there before (which was essentially nothing) and the “it’s a big step forward” crowd agreeing that more specifics would have been helpful.

It’s that last point where corporate politics comes into play. Almost all agree that more specifics would have been better. But when it comes to deciding which specifics, well, that’s where the technical arguments devolved into impasses.

The key takeaway of the PCI guidance was that tokens—done properly—could be considered out of scope. On the concerns side, some participants were unhappy that the PCI Council opted to say that tokens that can be used directly as payment devices, which it dubbed high-value tokens, might still be in scope.
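To see why a properly built token can drop out of scope, here is a minimal sketch of vault-style tokenization (our own illustration, with invented names; nothing below comes from the Council’s document). The token is random, so a system that stores only the token holds nothing that can be reversed into a PAN:

    import secrets

    # Illustrative sketch only. A vault-style tokenizer: the token is a
    # random value with no mathematical relationship to the PAN.
    class TokenVault:
        def __init__(self):
            # token -> PAN; in practice this mapping lives inside the
            # cardholder data environment, under full PCI DSS controls.
            self._vault = {}

        def tokenize(self, pan: str) -> str:
            token = secrets.token_hex(16)  # random; reveals nothing about the PAN
            self._vault[token] = pan
            return token

        def detokenize(self, token: str) -> str:
            # Only systems inside the cardholder data environment call this.
            return self._vault[token]

    vault = TokenVault()
    token = vault.tokenize("4111111111111111")
    # A system that stores only `token` holds no cardholder data, which is
    # the basis for taking it out of scope. A high-value token that could
    # itself initiate a payment would not get that benefit.

A high-value token, by contrast, is worth stealing even without access to the vault, which is why the Council says it might stay in scope.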

Token vendor Shift4 quickly posted on its Web site a very pointed attack on what the Council published, and it seemed particularly unhappy with the high-value token suggestions.

“On page 20, we find this little gem: ‘Additionally, tokens that can be used to initiate a transaction might be in scope for PCI DSS.’ Might? Wasn’t the whole purpose of this document to take what ‘might’ be true and determine what really is true?” Shift4 asked.

Another participant in the talks, Merchant Link Product Group Manager Sue Zloth, also published concerns about the high-value token reference, saying, “I do have an issue” with it. “As it is currently written, [it] will introduce a lot of fear, uncertainty and doubt into many merchants’ minds regarding how to keep the systems they have, which are storing only tokens, out of scope. For solutions that support these types of tokens, the guidelines state that there must be additional controls in place to detect and prevent fraudulent transactions. This is where I feel the Council’s document fell short. They introduced this concept that tokens may potentially be back in scope without providing guidance as to how to keep them out of scope.”

StorefrontBacktalk’s own PCI columnist, Walter Conway (whose day job is being a QSA), also zeroed in on the high-value token reference and the Council’s conclusion that such tokens may have to stay in scope.

“This conclusion likely comes as a surprise to many retailers and other merchants (along with their tokenization providers). For example, E-Commerce merchants who use tokens for one-click ordering and repeat purchases (leaving the underlying PANs with their processor or another third party) just learned their tokens will still be in scope for PCI,” Conway wrote in this week’s column. “I wonder if the hotel room keycard (or resort account) I use to charge meals is a high-value token because it generates a payment-card transaction? I even wonder if tokens used for exception-item-processing such as chargebacks and refunds are high-value tokens because they impact (even if it is to reverse) transactions?”



3 Comments

  1. Marc Massar Says:

    Tokenization of data can be accomplished in many ways, and not all security vendors approach the tokenization use case in the same way. The disagreements sometimes focused on the exclusion of one method of tokenization in favor of another, while most of the true security vendors wanted to make sure that all valid and secure use cases were allowed. The Council’s goal is not to endorse one vendor or technology over another, and while the tokenization guidance document isn’t perfect, it does provide the right amount of guidance for QSAs and acquirers to move forward with scoping decisions as they relate to PCI DSS and tokenization.

  2. Mark Bower Says:

    I agree with Marc – it’s important to recognize all the hard work that went into the creation of the document, which is a fair and reasonable representation of how a particular technology can be applied to help manage part of an overall risk problem – including the inherent risks in tokenization itself, in all its forms.

    There’s no silver bullet for compliance and risk reduction – and the truth is that many players had been claiming near-utopian scope reduction in the absence of actual security proofs, hard evidence, or any statements from the PCI SSC, creating a widening gap between expectation and practice. Now that a very reasonable and fair document has been published, it should be no surprise that some prior claims don’t match reality.

    All things considered, engagement in the working groups and task forces was cordial, with a common goal – and despite competitive interests, the outcome is a positive and useful one that the industry needed. I say thanks to all who contributed many hours of effort, and I look forward to future guidance in a similar vein.

  3. Steve Sommers Says:

    The original definition of tokenization was that a token was not mathematically related to the data it was protecting — the PAN. When the definition was changed in the SIG to accommodate various vendor solutions that didn’t follow it, that is where the security holes were introduced. Some vendors were marketing “tokenization” even though their solutions were not using true tokens, instead using encryption and hash techniques to generate “tokens” — which in turn changed the scoping aspects of the concept. To plug the security holes introduced by this definition change, the document simply offloads scoping concerns to the acquirers and Card Brands, which will in turn refer merchants to QSAs. In my opinion, they took a simple concept that was secure, changed the definition, and then added a “use at your own risk” clause.
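A rough sketch of the distinction Sommers is drawing (our illustration, not his or any vendor’s code): a hash-derived “token” is computed from the PAN and therefore mathematically related to it, while a token in the original sense is random and tied to the PAN only through a vault lookup. Because PANs have little entropy (a known BIN prefix plus a Luhn check digit), a hash-derived token can be brute-forced offline; here we assume, for the demo, that the first ten digits are known:

    import hashlib
    import itertools
    import secrets

    def hash_token(pan: str) -> str:
        # "Token" derived from the PAN itself: mathematically related to it.
        return hashlib.sha256(pan.encode()).hexdigest()

    def random_token(pan: str, vault: dict) -> str:
        # Token in the original sense: random, tied to the PAN only by the vault.
        token = secrets.token_hex(16)
        vault[token] = pan
        return token

    # The hole the definition change opened: with a known 10-digit prefix,
    # only the remaining six digits need to be guessed, so the hash-derived
    # token can be reversed by exhaustive search.
    target = hash_token("4111111111111111")
    for tail in itertools.product("0123456789", repeat=6):
        candidate = "4111111111" + "".join(tail)
        if hash_token(candidate) == target:
            print("PAN recovered from hash-derived token:", candidate)
            break

    # No comparable attack exists against a random token: without access to
    # the vault, it carries no information about the PAN at all.
    vault: dict = {}
    safe = random_token("4111111111111111", vault)

This is the scoping question in miniature: the random token is worthless outside the vault, while the hash-derived one leaks the PAN to anyone willing to run the search above.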
