This is page 2 of:
Data Security Slugfest: Tokenization Vs End-to-End Encryption
The value proposition behind end-to-end encryption is that a company encrypts the data immediately at the entry point (the POS, the E-Commerce payment software and the call center software) and then the data remains encrypted throughout the process of passing it to the acquirer, and PAN data is never stored unencrypted by the merchant. Several products are available for the POS channel and numerous products are available for the E-commerce channel, but any of these products can be rendered ineffective if a company insists on storing and using the data for other applications.
The other key point with end-to-end is that some companies are focused on an “enterprise view” of end-to-end, rather than defining one of the end points as the acquirer. In addition, the policies for and processing of chargebacks in some companies tend to mess up the end-to-end scenario. The main thing to remember regarding encryption is that it is but one of 12 major PCI-mandated controls, even when it is end-to-end. I think the PCI study will likely conclude that other controls are still needed even if encryption is end-to-end.
Speaking of things that are no longer needed, there is a lot of discussion in the industry that all a company has to do is implement tokenization and they will no longer need to encrypt data. In an ideal world, that may well be true. However, in “beyond PCI” land, we haven’t found any Level 1 or Level 2 merchant that was able to actually get rid of all their card data even though they implemented tokenization. Some of the reasons include business requirements, the cost to change their production applications, and the difficulty of actually finding and purging all their card data.
On the other hand, smaller organizations that are single-channel and have a highly centralized data architecture have been most successful at handing off the data and compliance headaches to tokenization companies. Our point is simple: end-to-end encryption and tokenization will likely exist side-by-side in nearly all large and most midsize businesses for the next 2-3 years, so suggesting that one can take the place of the other for such organizations does not take into account the reality of the large, multi-channel merchant or service provider.
Although I like to see a fight to the death as much as the next guy, I think the winning strategy for 2009 and 2010 will require demonstrating how tokenization and end-to-end encryption can co-exist within large and midsize enterprises, rather than purveyors of one control suggesting the other is irrelevant or unnecessary. The whole point of PCI DSS is “defense in depth,” and in that spirit we counsel “peaceful coexistence,” at least for now. If you would care to comment, and are willing to forego weapons of mass destruction in favor of constructive dialog, please visit us at the PCI Knowledge Base or send an E-mail to David.Taylor@KnowPCI.com.
April 16th, 2009 at 8:10 am
What are your thoughts about efforts that some retailers are taking to get their POS systems out of PCI scope altogether by moving card processing off to a separate device (one that communicates only the most basic transaction information – not card data – back to the POS)? Regis Salon has implemented this. We are looking at it very seriously. Seems like McDonald’s is doing the same. For smaller specialty stores, this appears to be a long-term cost-saving option and would make maintaining compliance much easier. I’m interested in your thoughts.
April 16th, 2009 at 5:40 pm
With respect to “The result is the dreaded ‘encrypt, decrypt, re-encrypt’ scenario which opens up holes to external hackers and unauthorized insiders”
Any legacy approach will suffer this exact problem – but that is now a solved problem. As I’ve mentioned here before, an entirely practical solution is FFSEM-mode AES, aka FPE, or Format-Preserving Encryption (see the NIST AES modes site if you want to learn more about the method). FPE can do this:
• Encrypt a 12-, 15-, or 16-digit credit card number to another 12-, 15-, or 16-digit number that is useless to anyone except the trusted entities but still works in POS systems, or,
• Encrypt, say, only the first 12 digits, or only the internal account-number digits, leaving the routing/BIN digits in the clear and maintaining the Luhn check digit: whatever combination suits the business purpose. Separate sub-fields can be independently encrypted so different systems can use them independently, separating access rights down at the sub-field level – think firewalls around the data itself.
• FPE can be applied not only to cards but also to SSNs and other data: names, addresses, customer field notes, whatever.
• FPE can encrypt columns that are indexed or that have primary and foreign key relationships; it does not matter – referential integrity is preserved.
• Any data encrypted with FPE can also be used immediately in a QA system as masked QA/Test Data – no need to regenerate DB test data.
• Use IBE (Identity-Based Encryption, RFC 5091) with FPE for stateless key management, reducing the complexity of key management to almost no overhead, with one-way encryption at capture.
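The core idea behind format-preserving modes like FFSEM – digits in, same number of digits out, reversibly – can be sketched with a balanced Feistel network. This is an illustrative toy, not the NIST-specified construction: the key, round count, and HMAC-based round function are placeholders chosen for a self-contained example.

```python
import hashlib
import hmac


def _round_value(key: bytes, half: str, rnd: int, width: int) -> int:
    """Pseudorandom round function: HMAC-SHA256 over the round index and
    one half of the digit string, reduced to `width` decimal digits."""
    mac = hmac.new(key, f"{rnd}:{half}".encode(), hashlib.sha256).digest()
    return int.from_bytes(mac[:8], "big") % (10 ** width)


def fpe_encrypt(key: bytes, digits: str, rounds: int = 10) -> str:
    """Encrypt an even-length digit string to a digit string of the same
    length via a balanced Feistel network (toy sketch, NOT FFSEM/FF1)."""
    n = len(digits)
    assert n % 2 == 0 and digits.isdigit()
    w = n // 2
    l, r = digits[:w], digits[w:]
    for i in range(rounds):
        # new right half = (left + F(right)) mod 10^w, then swap halves
        l, r = r, str((int(l) + _round_value(key, r, i, w)) % 10 ** w).zfill(w)
    return l + r


def fpe_decrypt(key: bytes, digits: str, rounds: int = 10) -> str:
    """Invert fpe_encrypt by running the Feistel rounds in reverse."""
    w = len(digits) // 2
    l, r = digits[:w], digits[w:]
    for i in reversed(range(rounds)):
        # undo one round: previous left = (right - F(left)) mod 10^w
        l, r = str((int(r) - _round_value(key, l, i, w)) % 10 ** w).zfill(w), l
    return l + r
```

The ciphertext is all digits and the same length as the input, so it can flow through systems that validate card-number format; only a holder of the key can invert it.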
Therefore, with respect to the lifecycle of the data from capture to storage, there is no need to open up the data: it can be used in its encrypted form, as is. No decrypt, no risk, no keys present, no PCI scope.
Tokenization systems, on the other hand, simply shift the problem, putting the burden of managing a massive amount of additional state information (the token/card mapping data) on the operator of the token scheme – hence the often-quoted transaction or service subscription costs. Tokenization approaches essentially create a parallel universe of cardholder data to manage. This has obvious scale limits, which is why tokens must expire – destroying any business intelligence processes a business may rely on when operating, e.g., a data warehouse or data mining. Tokenization systems also require a call to the token generator per transaction, and thus must be online at all times. This adds incremental overhead per transaction, doesn’t solve offline processing, and creates additional network hops and attack vectors, as the clear data now traverses a potentially long path to the token provider. The tokenization vendors seem to forget to mention that encryption is required – “end-to-end encryption” ideally – from the capture point to the token generator, and vice versa at the point where the data is needed in clear form for eventual clearing.
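The token/card mapping state described above can be sketched as a minimal in-memory vault. All names here are hypothetical, and a real tokenization service would add encryption at rest, access control, auditing, and token expiry; the point is only to show the extra bookkeeping a token scheme carries.

```python
import secrets


class TokenVault:
    """Toy token vault: maps PANs to random surrogate tokens and back.

    Every tokenized PAN adds an entry to both maps -- the "parallel
    universe" of state that the token operator must manage and protect.
    """

    def __init__(self):
        self._token_to_pan = {}
        self._pan_to_token = {}

    def tokenize(self, pan: str) -> str:
        """Return a stable token for this PAN, creating one if needed."""
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        # Token is random (not derived from the PAN); last four digits
        # are kept visible, as many receipt formats expect.
        token = "tok_" + secrets.token_hex(8) + pan[-4:]
        self._token_to_pan[token] = pan
        self._pan_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        """Look up the original PAN; only the vault operator can do this."""
        return self._token_to_pan[token]
```

Note that `detokenize` is a lookup, not a decryption: the token carries no key material, which is why tokens fall outside cardholder-data scope but also why every transaction needs a round trip to the vault.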
FPE solves that problem in situ and stops those who deploy it from having to worry about that parallel universe of card data.
Regards,
Mark
VP Product Management
Voltage Security
Full disclosure: I work for a firm who creates products for privacy and payments systems using FPE technology.
April 20th, 2009 at 6:11 pm
Mark, you are incorrect in your blanket statement. Shift4 is a tokenization vendor and we do detail this. We always mention that end-to-end encryption is required, and we provide the means as well, offloading the key management in a way that in most cases is transparent to the POS – state of the art or legacy. The biggest difference between FPE and tokenization is that encrypted cardholder data is still cardholder data, whereas a token is not. You’ll get different opinions among QSAs as to whether or not FPE removes applications from PCI scope, whereas most QSAs agree that our end-to-end encryption/tokenization model does (just for clarification: it can remove applications from scope, not merchants).
Both solutions have their pros and cons. I give more pros to a properly implemented tokenization solution that includes end-to-end encryption (bypassing the POS). I’m guessing that you give more pros to FPE. To me, either solution is much more secure than the POS receiving unencrypted cardholder data and trusting the POS provider to properly secure that data within the application and the merchant’s network, with adequate key management controls.
In the not-so-distant future, the ultimate solution might be a combination of the two technologies.
Steve Sommers
Sr. VP Applications Development
Shift4 Corporation
April 23rd, 2009 at 11:59 pm
A healthy debate is going on – but both scenarios require attention to the configuration and management of the encryption or tokenization solution. Adopting any one of these tactics still requires additional PCI controls to ensure compliance; yet many are positioning these investments as the silver bullet. As David notes in his last point, these are another means to defend this data and mitigate risk.

But my concern is today’s and tomorrow’s systems (here and now): how fast, and at what cost, can the individual merchant of any size and with limited spend go beyond PCI compliance? Runtime control with dynamic whitelisting and memory protection is another means to secure systems today without code or database changes. The technology whitelists known-good systems, taking inventory of all executable code (.exe files, scripts, Java, and .dlls) and of the system and application configuration; it identifies authorized processes that are pre-trusted to provide automatic updates to the system, then denies all other types of changes. In addition, currently running applications and local data, registry entries, and configuration files can be read/write protected so they are only usable and visible to their native applications or to those that require them for processing. The result is easy implementation, with no application changes or disruption of current business, delivered in a single solution that goes beyond PCI compliance while also fulfilling many current PCI requirements, such as protection from today’s and future malware and file integrity monitoring. This method is accessible to all merchants today, and many may even find their POS vendor reselling or embedding the capability into their systems, providing PCI-ready devices.
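The inventory-and-verify step of such whitelisting can be illustrated with a simple hash-based file integrity check. This is a sketch only: commercial runtime control products also intercept execution and protect memory, which a hash comparison alone does not do.

```python
import hashlib
import os


def sha256_of(path: str) -> str:
    """Hash a file's contents in chunks so large binaries aren't loaded at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def build_whitelist(paths):
    """Take inventory of known-good executables: path -> expected hash."""
    return {p: sha256_of(p) for p in paths}


def check(whitelist):
    """Return the paths whose contents changed or that no longer exist."""
    return [p for p, expected in whitelist.items()
            if not os.path.exists(p) or sha256_of(p) != expected]
```

Running `check` periodically against the stored inventory flags any executable that was modified or removed – the file-integrity-monitoring half of the approach described above.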
Kim Singletary
Director, OEM & Compliance Solutions
Solidcore.com
November 5th, 2009 at 11:17 am
The solution to the encryption conundrum is simple: define and stand by a standard. Unfortunately, there are too many hands in the encryption cookie jar for this to ever happen. Some fear that a single standard would be a bad thing, because if it were ever hacked, the losses could be staggering. Still, with a single standard come increased strength and progress in methods, along with more streamlined updates and implementation. The result would be a much smaller chance of a successful attack, not a larger one.