Tokens Are Not The Same As Encryption. Honest
Tokens, however, are not (or at least should not be) reversible. Tokens should be randomly generated, and the token vault should be properly segmented from other systems and devices. The PCI Council (and Visa before it) stipulated that there should be no mathematical way to reverse the process and derive the PAN given only a token. This means that even if someone knew the PAN associated with one or more tokens, they would have no information to help de-tokenize the next one.
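To make the point concrete, here is a minimal sketch of what a random tokenization scheme looks like. The class and method names are my own illustration, not any particular vendor's API, and a real vault would encrypt the stored PANs and enforce strict access control:

```python
import secrets

class TokenVault:
    """Toy token vault: the ONLY link between token and PAN is this lookup table."""

    def __init__(self):
        self._token_to_pan = {}  # a real vault encrypts the stored PANs
        self._pan_to_token = {}  # reuse the same token for a repeated PAN

    def tokenize(self, pan: str) -> str:
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        # Random token: knowing one PAN/token pair reveals nothing about the next.
        token = secrets.token_hex(8)
        self._token_to_pan[token] = pan
        self._pan_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only privileged, in-scope systems should ever be able to call this.
        return self._token_to_pan[token]

vault = TokenVault()
t = vault.tokenize("4111111111111111")
assert vault.detokenize(t) == "4111111111111111"
assert t != vault.tokenize("4012888888881881")  # distinct PANs, distinct tokens
```

Note that `detokenize` is just a table lookup, not a computation on the token itself; delete the vault entry and no amount of mathematics recovers the PAN.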
A security expert I respect greatly recently challenged me as to whether any tokenization system, particularly an internally hosted tokenization solution, could really reduce PCI scope. He compared tokenization with encryption and concluded that since encrypted data is considered in PCI scope when the organization can decrypt it, tokens, too, should be in scope if anyone in the organization can de-tokenize them.
I failed to see then (and I still fail to see) the parallel between encryption and tokenization, and I do not agree with applying the rules for encryption to tokenization. The sole fact that some people in the organization with appropriate privileges can access the token vault and retrieve a PAN does not make the tokens “reversible.” That ability does not bring tokens into PCI scope. Certainly all the persons and processes that retrieve PANs and use them for a transaction are in scope. But the tokens themselves should be considered out of scope.
The PCI Council’s guidance supports this position when it describes the special case of a “high-value” token, which may require additional safeguards.
High-value tokens are those that can be used to initiate a new card transaction. According to the PCI Council’s guidance document, such tokens “might be in scope for PCI DSS, even if they cannot directly be used to retrieve PAN or other cardholder data.” The use of the word “might” is hardly definitive, but the fact that the Council called out only these tokens for special treatment reinforces my argument that comparing tokenization and encryption is a false analogy.
Tokenization and encryption have a complex relationship. The two technologies are fundamentally different: encryption is reversible, whereas random tokens are not. At the same time, tokenization solutions require strong encryption to protect the card data stored in the token vault.
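The difference is easy to demonstrate. In the sketch below, a toy one-time-pad XOR stands in for a real cipher (purely for illustration): the ciphertext is mathematically derived from the PAN, so key plus ciphertext always recover it. The random token, by contrast, is generated independently of the PAN, so there is no key that reverses it:

```python
import secrets

pan = b"4111111111111111"

# Encryption: ciphertext is a function of the PAN, so anyone holding the
# key can invert it (toy XOR one-time pad stands in for a real cipher).
key = secrets.token_bytes(len(pan))
ciphertext = bytes(p ^ k for p, k in zip(pan, key))
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
assert recovered == pan  # reversible: key + ciphertext suffice

# Random token: drawn independently of the PAN. Nothing to "decrypt";
# only a vault lookup connects the two values.
token = secrets.token_hex(8)
```

This is why the scoping rule for encryption (in scope if you hold the key) does not map cleanly onto random tokens: there is no key to hold.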
The fact that tokenization uses encryption does not justify applying the same PCI scoping considerations to both technologies. For that to be the case, one of two situations would have to hold. The first is a token solution that violates the rule that there be no mathematical way to reverse a token; that would validate the comparison to encryption, but I would consider such a solution to be encryption in the first place, negating the desired scope reduction. The only other possibility is to treat all tokens as high-value tokens, which seems inconsistent with the PCI Council's guidance. Therefore, I have difficulty accepting a parallel between tokenization and encryption.
What do you think? Have you implemented tokenization? Did you receive the scope reduction benefits you expected? I’d like to hear your thoughts. Either leave a comment or E-mail me at firstname.lastname@example.org.