The Unexpected Benefits of Tokenization
Another benefit of implementing tokenization is that, because you have located all your PAN data, you are a step ahead in complying with the scoping requirement in PCI DSS 2.0. One of the changes in version 2.0 is: “At least annually and prior to the annual assessment, the assessed entity should confirm the accuracy of their PCI DSS scope by identifying all locations and flows of cardholder data and ensuring they are included in the PCI DSS scope.”
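To give a sense of what locating PAN data can look like in practice, here is a minimal sketch of a discovery check in Python: a candidate card-number pattern plus a Luhn checksum test. The regex, sample text, and function names are mine for illustration only; real discovery tools handle many more formats, encodings, and data stores.

```python
import re

# Candidate PAN pattern: 13-19 digits, optionally separated by spaces or dashes.
# Simplified for illustration; commercial discovery tools are far more thorough.
PAN_CANDIDATE = re.compile(r"\b\d(?:[ -]?\d){12,18}\b")

def luhn_valid(digits: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:  # double every second digit, counting from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_pan_candidates(text: str) -> list[str]:
    """Scan a block of text and return substrings that look like valid PANs."""
    hits = []
    for match in PAN_CANDIDATE.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if luhn_valid(digits):
            hits.append(match.group())
    return hits

# Example: scan a log export for stray card numbers.
sample = "order=1234 card=4111 1111 1111 1111 status=approved"
print(find_pan_candidates(sample))  # ['4111 1111 1111 1111']
```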
A third benefit of the tokenization process is that it forces the enterprise to reexamine which people can access PAN data. PCI Requirement 7 tells you to restrict access to cardholder data on a strict need-to-know basis based on job function. This principle is also known as “least privilege,” because it grants only the minimum access required for a person to do his or her job.
Note that PCI (and good security practice) limits access to sensitive data based on a person’s job requirements, not his or her place on the organization chart. A particular fraud analyst or chargeback clerk, therefore, may need access to PAN data to do his or her job, but another person in the department, or even the department head, should not automatically have those same privileges.
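As a rough illustration of need-to-know access, here is a minimal sketch in Python. The role names and the permission model are hypothetical; the point is simply that PAN access is granted by job function, not by department or seniority.

```python
# Need-to-know access sketch: only job functions that require PAN data get it.
# Role names and the permission model are hypothetical, for illustration only.
PAN_ACCESS_ROLES = {"fraud_analyst", "chargeback_clerk"}

def can_view_pan(user_roles: set[str]) -> bool:
    """Grant PAN access only when one of the user's roles requires it."""
    return bool(user_roles & PAN_ACCESS_ROLES)

print(can_view_pan({"fraud_analyst"}))             # True: the job needs PAN data
print(can_view_pan({"department_head", "admin"}))  # False: position alone is not enough
```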
Implementing tokenization forces the organization to start from scratch, restricting access privileges to the token vault (and, therefore, to PAN data) to those with a job-related need. Speaking as a QSA, I regularly see organizations where I have to challenge the number of people with administrative privileges that let them see (and print and copy) PAN data. In most cases, the individuals report they don’t need and never use the data. They were given access based on their department or position, not their job requirements.
A final benefit of the tokenization process is that it should stop PCI scope creep in future years. The token vault contains all the tokens and their associated PANs (encrypted, of course). This single location requires strong access controls and physical security, possibly stronger than those currently in place. The process of putting all the organization’s PCI “eggs” in one basket triggers a rethinking of the logical and physical controls and restrictions on the data. These controls should stop data from leaking to unauthorized people and systems, thereby helping to limit future expansion of the organization’s PCI scope.
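To make the single-basket idea concrete, here is a minimal sketch of a token vault in Python: random tokens mapped to encrypted PANs in one store. The token format and the use of the third-party cryptography package (Fernet) are my assumptions for illustration; a production vault would add key management, auditing, logging, and far stronger physical and logical controls.

```python
import secrets
from cryptography.fernet import Fernet  # third-party "cryptography" package

class TokenVault:
    """Minimal token vault sketch: one store mapping tokens to encrypted PANs."""

    def __init__(self) -> None:
        self._fernet = Fernet(Fernet.generate_key())  # key management omitted
        self._store: dict[str, bytes] = {}            # token -> encrypted PAN

    def tokenize(self, pan: str) -> str:
        """Encrypt and store the PAN, returning a random surrogate token."""
        token = secrets.token_urlsafe(16)
        self._store[token] = self._fernet.encrypt(pan.encode())
        return token

    def detokenize(self, token: str) -> str:
        """Recover the PAN; only vault-authorized callers should reach this."""
        return self._fernet.decrypt(self._store[token]).decode()

vault = TokenVault()
t = vault.tokenize("4111111111111111")
print(t)                    # the token is what downstream systems see and store
print(vault.detokenize(t))  # the PAN comes back only from the vault
```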
I do not know that tokenization is the best approach for all merchants. Point-to-point encryption is also very promising, if it is implemented properly. Each technology has costs and benefits, and each addresses different sets of needs. The good news on the tokenization front is that the Law of Unintended Consequences (which holds that we should be ready for unexpected outcomes from any action) may, in this one case, actually work in merchants’ favor.
What do you think? Are you implementing tokenization or considering it? How did you implement it, and did you see any of the benefits I’m describing? Did you use the process to reexamine your data flows and identify process changes that could reduce your PCI scope? I’d like to hear your thoughts. Either leave a comment or e-mail me at wconway@403labs.com.