If Your Token Vendor Goes Bankrupt, What Happens To Your Data?
Your goal is to determine how tokenization will reduce your PCI scope and, thereby, the total cost to achieve and maintain PCI compliance. As I pointed out last week, I won’t pretend to have all the answers. However, this set of 10 questions should get you headed in the right direction—or at least help you avoid an expensive disappointment.
Before you commit to any approach—whether it is software, a hardware appliance or a service hosted by a third party—you should know whether it requires any upgrades or application changes.
That is, will you need to upgrade your POS terminals or payment application to work with the tokenization solution (my third question)? Does the tokenization application or appliance have compatible interfaces (APIs)? On the other side of the process, will your downstream applications be able to use the tokens as generated (my second question), or will they need to be rewritten? For example, if a marketing application requires a token that passes the Luhn algorithm (“mod 10”) checksum test, make sure your solution generates that type of token or plan to update the application.
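To make that last point concrete, here is a minimal sketch in Python of the mod 10 check such a downstream application might apply to a format-preserving token. The function name and test values are illustrative only, not part of any particular tokenization product:

```python
def passes_luhn(number: str) -> bool:
    """Return True if the digit string passes the Luhn (mod 10) checksum."""
    digits = [int(d) for d in number if d.isdigit()]
    if not digits or len(digits) != len(number):
        return False  # reject empty input or non-digit characters
    total = 0
    # Walk right to left, doubling every second digit; subtract 9 if it exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(passes_luhn("4111111111111111"))  # True: a well-known test PAN
print(passes_luhn("4111111111111112"))  # False: fails the mod 10 check
```

If your tokens are random strings rather than Luhn-valid, digit-only values, an application that runs this kind of check will reject them until either the token format or the application changes.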
Hopefully, the answer to this question is “no.” But you should check to be sure. Many payment applications store PAN data, if only briefly. For PCI purposes, whether you store cardholder data for a millisecond or a year, you are storing cardholder data, and even with tokenization your payment application may remain in scope.
Retailers should work with their application vendor (or internal developers) to reconfigure or upgrade the payment application so it processes or transmits the cardholder data but does not store the data. It would be a shame to spend the time and effort to implement tokenization and find out that your payment systems are still in scope.
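As a rough illustration of that “process and transmit, but do not store” pattern—not any particular vendor’s API; the endpoint URL, JSON fields and table layout below are all hypothetical—the payment code exchanges the PAN for a token before anything reaches persistent storage, so the database only ever sees the token:

```python
import requests  # assumption: the tokenization service exposes a simple HTTPS API

TOKENIZE_URL = "https://tokenizer.example.internal/tokenize"  # hypothetical endpoint

def tokenize_pan(pan: str) -> str:
    """Exchange a PAN for a token; the PAN exists only in memory here."""
    resp = requests.post(TOKENIZE_URL, json={"pan": pan}, timeout=5)
    resp.raise_for_status()
    return resp.json()["token"]

def record_sale(pan: str, amount_cents: int, db) -> None:
    """Persist the transaction using only the token, never the PAN."""
    token = tokenize_pan(pan)
    db.execute(
        "INSERT INTO transactions (token, amount_cents) VALUES (?, ?)",
        (token, amount_cents),
    )
    db.commit()
```

The design choice that matters is that the PAN lives only in memory for the duration of the call; if it is ever written anywhere, even to a temporary table or a debug log, the application stays in scope.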
That is, will the provider accept responsibility for the security of your cardholder data in their control? I have written (some say ranted) about this issue before (see here and here), so I won’t repeat myself. The point is, knowing your tokenization provider is PCI compliant is a nice start. But by itself that is not enough to meet this requirement.