Security Versus Scope: Choose One
The only way to get the data out of scope is if you never, and I mean never, have the ability to convert the tokens back to clear text data. At the very least, any mechanism that can reverse the token would have to reside, like the encryption keys in the case of encrypted data, with a separate “entity.” Segmenting your network is not enough to get around this problem. If the bad guys can break into one part of your network to get the tokens, they will just as easily break into another segment to get the means to convert those tokens back to clear text data, whether you call that process decryption, a look-up table or some other form of prestidigitation.
This separate entity is ideally an outside organization physically separated from your operations and systems. A hardware security module (HSM) that does the encryption or tokenization and that is totally controlled by a third party might just qualify. But if you take that route, be prepared for your QSA to examine it pretty closely and ask some detailed, technical questions.
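To make the separation concrete, here is a minimal, hypothetical sketch in Python. The class names and the “processor” role are mine, not any vendor’s API: the token vault and its detokenize capability live entirely with the separate entity, while the merchant side stores only tokens and has no code path, key or table that reverses them.

```python
import secrets

class TokenVault:
    """Runs at the separate entity (third party or HSM owner).

    Holds the only token-to-PAN mapping; nothing here is deployed
    in the merchant environment.
    """
    def __init__(self):
        self._vault = {}  # token -> PAN, never leaves this entity

    def tokenize(self, pan: str) -> str:
        token = secrets.token_hex(16)  # random surrogate, not derived from the PAN
        self._vault[token] = pan
        return token

    def detokenize(self, token: str, caller_role: str) -> str:
        # Only a processor-side role may reverse a token (role name is illustrative).
        if caller_role != "processor":
            raise PermissionError("merchant systems cannot detokenize")
        return self._vault[token]


class MerchantSystem:
    """Runs at the merchant: stores tokens only, holds no keys and no lookup table."""
    def __init__(self, vault: TokenVault):
        self._vault = vault
        self.orders = []

    def record_sale(self, pan: str, amount: float) -> None:
        token = self._vault.tokenize(pan)  # the PAN is handed off, never stored here
        self.orders.append({"token": token, "amount": amount})


if __name__ == "__main__":
    vault = TokenVault()
    merchant = MerchantSystem(vault)
    merchant.record_sale("4111111111111111", 19.99)
    print(merchant.orders)  # tokens and amounts only
    # Nothing in MerchantSystem ever calls detokenize with the "processor" role.
```

The point, again, is the second half of that sketch: if any code path in the merchant environment can reach detokenize, whatever you call that operation, the downstream systems stay in scope no matter how good the tokens are.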
Once the cardholder data is encrypted or tokenized, and so long as the merchant never has the ability to retrieve clear text data, all the downstream systems could be out of scope. Otherwise, all bets are off and all your cardholder data is in scope. You may have improved your security, but you have not reduced your PCI scope.
It all comes down to two separate but related objectives: security and PCI scope. Both tokenization and end-to-end encryption can increase security and lower risk when properly implemented. That is really good. But if you are going to spend the dollars, time and effort to implement either solution, why not also reduce your PCI scope while you are at it? After all, that is a lot of what you are paying for. To do that means you have to implement either technology (or maybe both?) such that you cannot ever, ever, ever get from the ciphertext back to the clear text data. It also means that you better get out your checkbook, because a purely internal solution is pretty unlikely to reduce your PCI scope.
What do you think? I’d like to hear your thoughts. Either leave a comment or e-mail me at wconway@403labs.com.
February 11th, 2010 at 1:45 pm
I agree with all your arguments and this is one of the reasons I advocate third party gateways — provided that the gateway does not allow the merchant to de-tokenize the data.
I disagree with the PCI SSC when it comes to its decree that encrypted data is out of scope if “…it has been validated that the entity that possesses encrypted cardholder data does not have the means to decrypt it.” Encrypted data is still data and can always be decrypted. What will the ruling be if and when someone gets hacked and it is determined that “encrypted data” was being siphoned off for months or years, and then, all of a sudden, some end-to-end encryption key or base derivation key (in the case of DUKPT) is cracked? Now all this out-of-scope encrypted data is as open to the hacker as plain text.
February 11th, 2010 at 6:42 pm
“Encrypted data is still data and can always be decrypted” is a meaningless statement. It’s like saying “the sun will eventually explode”.
Please explain how “encrypted data can always be decrypted” can be performed without a key?
Propagating the myth that somehow it’s easy to brute force strong, proven cryptography is false and misleading. Attacking AES, RSA, ECC, IBE, FPE, etc. when strong keys are used is completely infeasible.
After all, surely even tokenization services like Steve’s third party gateway rely on encryption and key management every nanosecond for their own security: from the connection to the banks that protects the live data in transit, to the back-end PAN database, to the front-end token API to the device, to the application user interfaces over SSL used to manage things, etc.
So, why not encrypt from the moment of swipe and be done with it, taking everything downstream out of scope all the way to the processor? Using Format Preserving Encryption, as mentioned in the article, means the in-between applications still work. Easy.
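To make the format-preserving point concrete, here’s a toy sketch in Python. It is a digit-level Feistel construction, not NIST FF1/FF3 and not fit for real card data; the key and round count are arbitrary placeholders. The only point is that the ciphertext of a 16-digit PAN is another 16-digit string, so fields, validations and databases built around PAN-shaped values keep working.

```python
import hmac
import hashlib

# Toy format-preserving cipher: a balanced Feistel network over decimal digits.
# Illustrative only -- NOT a standard FPE mode and NOT for protecting real data.
KEY = b"demo-key-not-for-production"
ROUNDS = 8

def _round(half: int, round_no: int, width: int) -> int:
    # Keyed round function: HMAC-SHA256 of the round number and one half, reduced mod 10^width.
    msg = f"{round_no}:{half:0{width}d}".encode()
    digest = hmac.new(KEY, msg, hashlib.sha256).digest()
    return int.from_bytes(digest[:8], "big") % (10 ** width)

def fpe_encrypt_digits(digits: str) -> str:
    assert len(digits) % 2 == 0, "toy cipher handles even-length digit strings only"
    w = len(digits) // 2
    mod = 10 ** w
    left, right = int(digits[:w]), int(digits[w:])
    for r in range(ROUNDS):
        left, right = right, (left + _round(right, r, w)) % mod
    return f"{left:0{w}d}{right:0{w}d}"

def fpe_decrypt_digits(digits: str) -> str:
    w = len(digits) // 2
    mod = 10 ** w
    left, right = int(digits[:w]), int(digits[w:])
    for r in reversed(range(ROUNDS)):
        left, right = (right - _round(left, r, w)) % mod, left
    return f"{left:0{w}d}{right:0{w}d}"

if __name__ == "__main__":
    pan = "4111111111111111"
    ct = fpe_encrypt_digits(pan)
    print(ct)  # still 16 decimal digits, nothing like the original PAN
    assert fpe_decrypt_digits(ct) == pan
```

A real deployment would use a vetted FPE mode with proper key management, ideally inside the payment terminal or an HSM, so the clear PAN never touches the applications in between.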
Cheers,
Mark
February 11th, 2010 at 11:04 pm
I never meant to imply that a brute force attack on an encryption key is easy (even though, technically, it is; it just takes a long time). All encrypted data can be cracked; it’s just a matter of at what cost and how long. What is considered strong today may be insignificant tomorrow. DES (single DES) was thought to be virtually uncrackable not so many years back. There are still DES devices being used, and it has not been until recently that the rules were changed to phase it out. SSL1 was strong at one time; now it’s virtually useless. MD5 hashing was state of the art at one time; now PCI outlaws it.
I do see one flaw in my original statement. I should not have used the word “cracked,” because that does imply a brute force attack or an encryption vulnerability. While these are possible attack vectors, I probably should have used a more generic term like “compromised,” via whatever means. Many times it’s easier to attack the key management components than it is to mount a brute force attack against the key. My point is that if the key is ever compromised, all the freely available encrypted data is now vulnerable.
Yes, Steve’s data center does use encryption, but our data center is in full scope of PCI and we have various layers to keep the data in our data center, whereas PCI is saying a vendor can freely store, process, transmit or distribute encrypted CHD as long as the vendor does not have the key. Do you really think this is secure?
February 12th, 2010 at 4:14 pm
Even though the claim that any encryption can eventually be cracked is technically true, it’s also somewhat misleading. And absolutely nobody ever thought that DES was uncrackable. The only people who ever argued that it was were people from the US government who wanted to control the export and use of encryption outside the US, and nobody really believed them.
Let’s get a rough idea of exactly how secure 112- and 128-bit keys are by estimating the level of effort needed to crack one. Let’s base this on the EFF’s DES Cracker.
The DES Cracker can test roughly 92 billion keys per second on 1,536 special-purpose chips. Given a plaintext-ciphertext pair, it can test all possible DES keys in a bit over 9 days, and you’d expect it to find the key that decrypted the ciphertext in about half that time, or about 4 and a half days.
Let’s assume that we can make a special-purpose computer that can test keys one billion times faster than the DES Cracker, or roughly 92 quintillion keys per second. This isn’t even close to realistic with today’s technology, but maybe a combination of Moore’s law, faster clock speeds and adding lots of additional chips to the computer will let us do this one day.
Even with this huge increase in power, such a hypothetical machine will still take roughly one million years to recover a single Triple-DES key, or about 100 billion years to recover a single 128-bit AES key. So unless someone finds an incredibly severe weakness in the encryption algorithms that we use today, assuming that they’re essentially unbreakable is fairly reasonable. No other component of a security architecture even comes close to being this strong.
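The arithmetic is easy to check. Here are a few lines of Python using only the figures above (92 billion keys per second, and a machine a billion times faster). The numbers printed are the times to exhaust each keyspace; the expected time to find a key is about half that, and either way the orders of magnitude match the estimates above.

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600

des_cracker_rate = 92e9                      # keys/second, the EFF DES Cracker figure above
hypothetical_rate = des_cracker_rate * 1e9   # the "billion times faster" machine

def years_to_exhaust(key_bits: int, rate: float) -> float:
    """Time to try every key; on average the right key turns up in about half this time."""
    return (2 ** key_bits) / rate / SECONDS_PER_YEAR

des_days = (2 ** 56) / des_cracker_rate / 86400
print(f"DES, 56-bit, DES Cracker:            {des_days:.1f} days to exhaust")
print(f"3DES, 112-bit, hypothetical machine: {years_to_exhaust(112, hypothetical_rate):.2e} years")
print(f"AES, 128-bit, hypothetical machine:  {years_to_exhaust(128, hypothetical_rate):.2e} years")
```

That works out to roughly 9 days for single DES, on the order of a million years for two-key Triple-DES, and on the order of 100 billion years for AES-128.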
So for all intents and purposes, a hacker is never going to attack the cryptography. He’s always going to bypass it. That’s essentially Shamir’s Third Law (“cryptography is typically bypassed, not penetrated”), which he discussed in his Turing Award lecture in 2002.
But if 112- or 128-bit encryption is essentially unbeatable, key management certainly isn’t. You need to authenticate a user or application that’s requesting a key, for example, and that process will never be as strong as what you get from the actual encryption. Note that this also applies to tokenization: you need to authenticate a user or application that requests a detokenization operation, and so on. So if you worry about key management (as you should), you should also worry about the corresponding issues around the way that tokenization is implemented.
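A hypothetical sketch makes the parallel plain: whether the back end decrypts a ciphertext or looks a token up in a vault, the clear PAN comes back only after an authorization check, and that check, not the cipher strength or the randomness of the token, is what an attacker goes after. The class, role names and mapping below are illustrative only, not any real product’s interface.

```python
class RecoveryBackend:
    """Stands in for either a key-management/decryption service or a token vault."""
    def __init__(self, mapping):
        self._mapping = mapping  # ciphertext-or-token -> PAN

    def recover(self, reference: str) -> str:
        return self._mapping[reference]


AUTHORIZED_ROLES = {"settlement-service"}  # hypothetical role name

def reveal_pan(requester_role: str, reference: str, backend: RecoveryBackend) -> str:
    # The whole system rests on this check, not on the 128-bit cipher
    # or the unguessable token sitting behind it.
    if requester_role not in AUTHORIZED_ROLES:
        raise PermissionError("caller may not recover cardholder data")
    return backend.recover(reference)


if __name__ == "__main__":
    backend = RecoveryBackend({"tok_7f3a": "4111111111111111"})
    print(reveal_pan("settlement-service", "tok_7f3a", backend))  # allowed
    # reveal_pan("web-frontend", "tok_7f3a", backend)             # raises PermissionError
```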
So the security provided by encryption and by tokenization is essentially the same: both are limited by system issues that don’t really relate to the encryption or tokenization technology itself. People seem to either not understand that or cheerfully ignore it.