Fighting Words: Why PCI’s Token Group Blew Up
Ulf Mattsson, the CTO at Protegrity and another participant in the PCI process, said the issue of what takes data out of scope, and what puts it back in, is critical, and that retailers need to know concretely what will happen if they take specific actions. “When you push out the tokens in the dataflow, there is no way back,” he said. “It’s in the bloodstream of your enterprise.”
Mattsson, like many others, expressed disappointment more for what was not ultimately covered in the guidelines than for what was.
“It took too long to get this supplement out. I have seen so much material that could have been part of this supplement,” he said, citing as one example the need to revise the self-assessment form. “We need to add questions that assess the tokenization area. It’s a gaping hole.”
Another area of concern for Mattsson is the token training that QSAs will have to go through. “If we had a more detailed guidance document, then less training might be needed. And I would like to see a roadmap for future validation, an independent technology validation.”
Mattsson also described the token group’s reorganization and how that delayed the guidance. “I think it escalated when we tried to finalize the document. One vendor pushed its encryption and we spent a lot of time fighting about that,” he said.
Merchant Link’s Zloth said PCI Council management opted to reorganize the group, officially a PCI SIG, by adding more PCI staff. “The Council decided they were going to put resources on it. Definitely there was a shift from the participating organization-led group to the PCI Council-led group,” she said. “It was a shift of people and a shift in what the document ended up looking like.”
What was the nature of that shift? “The earlier draft was a combination of a guidance document and a validation document,” Zloth said, adding that the final version was solely a guidance document. A validation document would typically trigger a validation program, which would involve training, lab work and other items.
The PCI Council was “not prepared to create a whole validation program for tokenization,” she said, especially given that PCI is just about to release the full validation treatment for point-to-point encryption.
Although the compromises were needed to get the document published, a more specific report could have been helpful, Protegrity’s Mattsson said.
“My view is that the Council and some card brands are very nervous about new and unknown risks that tokenization may introduce. They point to the QSA, but there is no training or checklist available or announced to help the QSA,” he said.
The CEO of another tokenization vendor, Shift4, touched on the inherent conflicts in any association such as PCI.
“The greatest balancing act for the PCI Council is not a product of balancing different merchant types or even balancing giant retail chains with mom-and-pop shops. Rather, their biggest roadblock is their apparent desire to satisfy (or balance) all of the needs of the various third parties that have a product to sell,” said J. David Oder. “There is a lot of money in security solutions and the pressure on a standards body is immense.
“A weak standard is one thing, but the caveat that merchants and their banks should consult their QSA is indicative of the weakness of PCI. What good is a standard if it has been delegated to 100 different organizations with 100 different opinions, especially when some of the opinions are so obviously swayed by the fact that they sell their own tokenization or end-to-end encryption products?” Oder said. “The QSA community was created by the PCI Council. Yet, unlike many other standards bodies, the Council has not yet created a prohibition on conflict of interest for QSAs. They likewise have no mechanism to arbitrate conflicts between merchants and QSAs, between QSAs, or between suppliers and QSAs.”
August 18th, 2011 at 4:04 pm
Tokenization of data can be accomplished in many ways. Not all security vendors approach the tokenization use case in the same way. The disagreements sometimes focused on the exclusion of one method of tokenization in favor of another, while most of the true security vendors wanted to make sure that all valid and secure use cases were allowed. The Council’s goal is not to endorse one vendor or technology over another, and while the tokenization guidance document isn’t perfect, it does provide the right amount of guidance for QSAs and acquirers to move forward with scoping decisions as they relate to PCI DSS and tokenization.
August 19th, 2011 at 8:09 pm
I agree with Marc: it’s important to recognize all the hard work that went into the creation of the document, which is a fair and reasonable representation of how a particular technology can be applied to help manage part of an overall risk problem, including the inherent risks of tokenization itself in all its forms.
There’s no silver bullet for compliance and risk reduction, and the truth is that many players had been claiming near-utopian scope reduction in the absence of actual security proofs or hard evidence, and without any statements from the PCI SSC, creating a widening gap between expectation and practice. Now that a very reasonable and fair document has been published, there should be no surprise that some prior claims don’t match reality.
All things considered, the working groups and task forces were cordial engagements with a common goal, and despite competitive interests the outcome is a positive and useful one that the industry needed. I say thanks to all who contributed many hours of effort, and look forward to future guidance in a similar vein.
August 22nd, 2011 at 11:31 am
The original definition of tokenization was that a token was not mathematically related to the data it was protecting, the PAN. When the definition was changed in the SIG to accommodate various vendor solutions that didn’t follow this definition, that is where the security holes were introduced. Some vendors were marketing “tokenization” even though their solutions were not using true tokens and were instead using encryption and hash techniques to generate “tokens,” which in turn changed the scoping aspects of the concept. To plug the security holes introduced by this definition change, the document simply offloads scoping concerns to the acquirers and card brands, which will in turn refer merchants to QSAs. In my opinion, they took a simple concept that was secure, changed the definition, and then added a “use at your own risk” clause.
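To make the distinction the commenter draws concrete, here is a minimal Python sketch (hypothetical names, illustration only) contrasting a vault-based random token, which has no mathematical relationship to the PAN, with a hash-derived value of the kind described above, which can be recomputed directly from the PAN and therefore raises different scoping questions.

```python
import hashlib
import secrets

# Hypothetical in-memory "vault" for illustration only; a real token vault
# would be a hardened, access-controlled data store.
VAULT = {}

def vault_token(pan: str) -> str:
    """'True' token: a random value with no mathematical relationship to the PAN.
    Recovering the PAN requires a lookup in the protected vault."""
    token = secrets.token_hex(8)
    VAULT[token] = pan
    return token

def derived_token(pan: str) -> str:
    """Hash-derived 'token': computed directly from the PAN, so anyone who can
    reproduce the computation (or brute-force the limited PAN space) can link
    the value back to the card number."""
    return hashlib.sha256(pan.encode()).hexdigest()[:16]

if __name__ == "__main__":
    pan = "4111111111111111"  # test PAN, not a real card number
    print("vault token:  ", vault_token(pan))    # random; changes every run
    print("derived token:", derived_token(pan))  # deterministic; same every run
```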