
Fighting Words: Why PCI’s Token Group Blew Up

August 17th, 2011

Ulf Mattsson, the CTO at Protegrity and another participant in the PCI process, said the issue of taking something out of scope and then pulling it back into scope is critical, and that retailers need to know concretely what will happen if they take specific actions. “When you push out the tokens in the dataflow, there is no way back,” he said. “It’s in the bloodstream of your enterprise.”
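
Mattsson’s “bloodstream” point is easy to see in miniature. What follows is a minimal sketch in Python, with entirely hypothetical names: the PAN is swapped for a random token at the point of capture, every downstream record carries only the token, and the sole way back to the PAN is a lookup against the vault. That is why pulling tokens back out of the dataflow later is so hard.

    import secrets

    class TokenVault:
        """Stand-in for a secured token vault (illustration only)."""
        def __init__(self):
            self._token_to_pan = {}

        def tokenize(self, pan):
            token = secrets.token_hex(8)  # random, so unrelated to the PAN
            self._token_to_pan[token] = pan
            return token

        def detokenize(self, token):
            # The vault lookup is the only way back to the PAN.
            return self._token_to_pan[token]

    vault = TokenVault()
    card_ref = vault.tokenize("4111111111111111")

    # Downstream systems (orders, loyalty, analytics) persist the token only,
    # so the PAN never enters their storage.
    order = {"order_id": 1001, "card_ref": card_ref}
    print(order)                       # no PAN anywhere in the record
    print(vault.detokenize(card_ref))  # recoverable only through the vault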

Mattsson, like many others, expressed disappointment more for what was not ultimately covered in the guidelines than for what was.

“It took too long to get this supplement out. I have seen so much material that could have been part of this supplement,” he said, citing as one example the need to revise the self-assessment form. “We need to add questions that assess the tokenization area. It’s a gaping hole.”

Another area of concern for Mattsson is the token training that QSAs will have to go through. “If we had a more detailed guidance document, then less training might be needed. And I would like to see a roadmap for future validation, an independent technology validation.”

Mattsson also described the token group’s reorganization and how that delayed the guidance. “I think it escalated when we tried to finalize the document. One vendor pushed its encryption and we spent a lot of time fighting about that,” he said.

Merchant Link’s Zloth said PCI Council management opted to reorganize the group—officially a PCI Special Interest Group (SIG)—by adding more Council staff. “The Council decided they were going to put resources on it. Definitely there was a shift from the participating organization-led group to the PCI Council-led group,” she said. “It was a shift of people and a shift in what the document ended up looking like.”

What was the nature of that shift? “The earlier draft was a combination of a guidance document and a validation document,” Zloth said, adding that the final version was solely a guidance document. A validation document would typically trigger a validation program, which would involve training, lab work and other items.

The PCI Council was “not prepared to create a whole validation program for tokenization,” she said, especially given that the Council is just about to release the full validation treatment for point-to-point encryption.

Although the compromises were needed to get the document published, a more specific report could have been helpful, Protegrity’s Mattsson said.

“My view is that the Council and some card brands are very nervous about new and unknown risks that tokenization may introduce. They point to the QSA, but there is no training or checklist available or announced to help the QSA,” he said.

The CEO of another tokenization vendor, Shift4, touched on the inherent conflicts in any association such as PCI.

“The greatest balancing act for the PCI Council is not a product of balancing different merchant types or even balancing giant retail chains with mom-and-pop shops. Rather, their biggest roadblock is their apparent desire to satisfy (or balance) all of the needs of the various third parties that have a product to sell,” said J. David Oder. “There is a lot of money in security solutions and the pressure on a standards body is immense.

“A weak standard is one thing, but the caveat that merchants and their banks should consult their QSA is indicative of the weakness of PCI. What good is a standard if it has been delegated to 100 different organizations with 100 different opinions, especially when some of the opinions are so obviously swayed by the fact that they sell their own tokenization or end-to-end encryption products?” Oder said. “The QSA community was created by the PCI Council. Yet, unlike many other standards bodies, the Council has not yet created a prohibition on conflict of interest for QSAs. They likewise have no mechanism to arbitrate conflicts between merchants and QSAs, between QSAs, or between suppliers and QSAs.”



3 Comments

  1. Marc Massar Says:

    Tokenization of data can be accomplished in many ways, and not all security vendors approach the tokenization use case in the same way. The disagreements sometimes focused on the exclusion of one method of tokenization in favor of another, while most of the true security vendors wanted to make sure that all valid and secure use cases were allowed. The Council’s goal is not to endorse one vendor or technology over another, and while the tokenization guidance document isn’t perfect, it does provide the right amount of guidance for QSAs and acquirers to move forward with scoping decisions as they relate to PCI DSS and tokenization.

  2. Mark Bower Says:

    I agree with Marc – it’s important to recognize all the hard work that was put into the creation of the document, which is a fair and reasonable representation of how a particular technology can be applied to help manage part of an overall risk problem – including the inherent risks of tokenization itself, in all its forms.

    There’s no silver bullet for compliance and risk reduction – and the truth is that many players had been claiming near-utopian scope reduction in the absence of actual security proofs, hard evidence or any statement from the PCI SSC – creating a widening gap between expectation and practice. Now that a very reasonable and fair document has been published, there should be no surprise that some prior claims don’t match reality.

    All things considered, the working groups and task forces were cordial engagements with a common goal – and despite competitive interests the outcome is a positive and useful one that the industry needed. I say thanks to all who contributed many hours of effort, and look forward to future guidance in a similar vein.

  3. Steve Sommers Says:

    The original definition of tokenization was that a token was not mathematically related to the data it was protecting — the PAN. When the definition was changed in the SIG to accommodate various vendor solutions that didn’t follow it, that is where the security holes were introduced. Some vendors were marketing “tokenization” even though their solutions were not using true tokens and were instead using encryption and hash techniques to generate “tokens” — which in turn changed the scoping aspects of the concept. To plug the security holes introduced by this definition change, the document simply offloads scoping concerns to the acquirers and card brands, which will in turn refer merchants to QSAs. In my opinion, they took a simple concept that was secure, changed the definition, and then added a “use at your own risk” clause.
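
As an illustration of the definitional split Sommers describes, here is a minimal, hypothetical Python sketch contrasting a true token (pure randomness, no mathematical relationship to the PAN) with a hash-derived “token” that is a deterministic function of the PAN and can therefore be tested offline once a known BIN narrows the search space.

    import hashlib
    import secrets

    PAN = "4111111111111111"

    # True token: random, meaningless without a vault's token-to-PAN mapping.
    true_token = secrets.token_hex(8)

    # Hash-derived "token": deterministically computable from the PAN.
    hash_token = hashlib.sha256(PAN.encode()).hexdigest()

    def matches(candidate_pan, token):
        # An attacker holding a hash-derived token can test candidate PANs
        # offline; with a known 6-digit BIN the remaining space is small.
        return hashlib.sha256(candidate_pan.encode()).hexdigest() == token

    print(matches(PAN, hash_token))  # True: the token is linked to the PAN
    print(matches(PAN, true_token))  # False: nothing to brute-force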
