
Opposition To Tokenization A Lot More Than Token

Written by Evan Schuman
May 9th, 2008

Guest Columnist David Taylor is Research Director of the PCI Alliance, Founder of the PCI Knowledge Base, and a former E-Commerce and Security analyst with Gartner.

There’s more than token opposition to tokenization.

That’s arguably one of the top conclusions to come out of 100 hours of interviews with merchants, banks, PCI assessors and card processors for the PCI Knowledge Base.

One of the concerns is that companies have already spent money on encryption. The most popular reason for not implementing tokenization is that companies have already implemented data encryption and key management systems costing hundreds of thousands of dollars, and either they did not feel they needed tokenization or they were unwilling to be perceived by upper management as "changing course" by recommending the removal of the data they just spent all this money to protect.

Applications managers won’t give up the data. A near rival for the top reason for resisting tokenization is that business managers and application owners use card numbers in many different places in their business processes and applications. In addition, the security managers, who typically prefer tokenization (as it reduces their own risks), do not believe they can successfully argue that the applications could be rewritten to work with the token numbers instead. They feel that the costs for changing the application code cannot be justified by the level of risk reduction.

Merchants are waiting for their bank or database vendor. Some of the merchants said they would be willing to consider tokenization, but not from the current crop of smaller, independent vendors. Some said they expected such solutions would soon be offered by their own bank or card processor; others (typically in IT) said they wanted to wait until tokenization is an option built into their database management software.

Tokenization is too new or unproven. Some of the merchants who resist using token numbers as substitutes for card data are simply objecting to the fact that there are not enough reference accounts willing to talk about their experiences. Very few companies want to be the first to take what they perceive as an additional risk to their credit-card data, so they want to be assured that their peers are involved. That this becomes a self-fulfilling prophecy is clearly lost on these merchants.

The tokenization vendor is a "single point of failure." Some of the merchants and PCI assessors we interviewed expressed concern that concentrating the card data of hundreds, even thousands of companies in "one place" (a tokenization vendor’s systems) would make the vendor such an attractive target (much like the Department of Defense or the National Security Agency) that, they reason, enough talented crackers would be pointed at the repository that "someone" would eventually break down the defenses. This treasure trove of data would be equally attractive to privileged insiders, making a detailed review of any tokenization vendor’s security absolutely mandatory.

Tokenization pricing models are immature and too variable. We spoke with a few merchants who had done head-to-head comparisons among the major tokenization vendors, and they encountered highly "flexible" pricing models. A larger concern was that the merchants had no idea how to tell if they were getting a good deal, as the pricing models were difficult to compare across vendors.

Bottom Line: However attractive tokenization sounds as a concept, resistance to the products and services as they exist in the marketplace today is substantial enough to limit the growth of this market over the next one to two years.

If you want to discuss this column or any other security or compliance issues, please send me an E-mail at David.Taylor@KnowPCI.com or visit www.KnowPCI.com and click "Add Your Knowledge" to join the PCI Knowledge Base.



8 Comments

  1. Steve Sommers Says:

    I read with interest your article “Opposition to Tokenization a Lot More Than Token.” You made some interesting points, all valid but all can also be countered. I’ll address them in order:

    1. Too much money already spent on encryption.

    This is very true; too much money has already been spent on encrypting cardholder data (CHD). There is a common saying here in Vegas: "don’t throw good money after bad." While this advice is for gamblers, I think it also applies here. Yes, there are many managers and directors who fear their reputations and jobs are on the line if they now switch to endorse a less expensive, and many times a significantly less expensive, alternative. But there are other factors to consider, risk being a big one. Under PCI and many of the various privacy laws, loss of an encrypted file that contains CHD, even without the decryption keys, is still considered a breach that must be reported, because the data has the potential to be cracked and exposed. Tokens are not considered CHD, so the same scenario would not be considered a breach and would not have to be reported.

    The weakest point with encryption is key management, and this is an ongoing cost. Keeping keys safe turns into a catch-22 situation, and many companies do not dedicate sufficient resources to the problem. Encrypted data needs a key. That key must be protected. The obvious solution is to encrypt the key. But now the new key needs to be protected, and so on. Key management solutions can help, but in the end the same weakness arises somewhere, and these solutions can be expensive. Throw in PCI 3.6.4, which requires annual key changes and rekeying (re-encryption) of the data, and the costs go higher.
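    The catch-22 can be sketched in a few lines of toy Python. The XOR "wrapping" here is a stand-in for real AES key wrap or an HSM, purely to illustrate the chain of keys protecting keys:

```python
import hashlib
import secrets

def toy_wrap(key: bytes, wrapping_key: bytes) -> bytes:
    """Toy key wrap via a SHA-256 keystream. NOT real cryptography --
    a stand-in for AES key wrap, used only to show the structure."""
    stream = hashlib.sha256(wrapping_key).digest()
    return bytes(a ^ b for a, b in zip(key, stream))

data_key = secrets.token_bytes(32)   # encrypts the cardholder data
kek = secrets.token_bytes(32)        # wraps the data key...
wrapped = toy_wrap(data_key, kek)
# ...but now the KEK itself needs protecting, and so on. The chain has
# to terminate somewhere (typically an HSM or key-management service),
# which is exactly the ongoing cost described above.
assert toy_wrap(wrapped, kek) == data_key  # XOR wrap round-trips
```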

    2. Applications managers won’t give up the data…use card numbers in many different places in their processes and applications.

    Again, very true; but at what cost? If you look back at some of the latest breaches, CHD was not exposed from the POS. Ironically, it was exposed from risk management applications or the data in transit to these risk management applications. Tokenization can be used in these other processes and applications and would address both the data in transit problem as well as the data storage problem. I would argue that tokenization, or at a minimum, proper hashing should be required for these ancillary processes and applications and real CHD should not be allowed.

    3. Waiting for bank or db vendor…lack of confidence with current tokenization vendors.

    Waiting on your bank to offer this may be a long wait, and if and when it does happen, the downside is that you would forever be locked into that bank. Having the database vendor do tokenization for you would not help ease your security burden — PCI-wise, the database would still be housing the CHD. To get the real benefit of tokenization, it needs to be hosted by a third party or at least a remote application.

    There are larger vendors offering tokenization. To me, this argument is more of a smoke screen hiding the fact that the merchant simply does not want to change. Depending on the application, tokenization may impact the merchant’s processes and this fear of change translates to lack of confidence.

    Whether or not my opinion on this is correct, this issue must be addressed and overcome. When we are speaking directly to a merchant, this issue can be overcome by showing all the advantages these changes will bring (cost and risk benefits) and giving references (in case our fear of change assumption is incorrect, and it really is a confidence issue).

    4. Tokenization is too new or unproven.

    This is a hard one to argue against, because it’s hard to prove a negative. Have any breaches occurred where tokenization was used? Actually, let me rephrase that: has any token ever been used to expose data? I rephrased the question because technically, with most tokenization implementations, there is an exposure point prior to the token being issued. Even with the narrower focus, to date no card data exposure has occurred with tokenized solutions under either question.

    Since negatives are hard to prove, I get around this argument by simply detailing tokenization and how it works to secure data. It usually does not take much to show someone how a completely random sequence of data is more secure than card numbers, encrypted or not. Add details on the difficulties of encryption key management, and this argument usually dissolves.

    5. The tokenization vendor is a “single point of failure”…single point of attack.

    As to the single point of failure, I can only speak for Shift4 and simply point to our track record. This is a valid concern; evaluating vendors’ uptime, resiliency and redundancy, as well as other factors, must be part of a merchant’s criteria.

    As to the single point of attack, true, but I would argue which is safer: using a single vendor that specializes in payments and security and is scanned and audited regularly, or having the data housed by thousands of merchant locations of varying degrees of security — from none to relatively strong? For the most part, the latter describes the current environment, and breaches are making headlines almost daily — not to mention all the smaller breaches that don’t make the headlines.

    6. Tokenization pricing models are immature and too variable.

    Again, speaking only for Shift4: our tokenization comes free with our services — no upsells, no extras, no increase when we added tokenization. In fact, we released tokenization to the public domain back in 2005 because we felt that this technology, if properly implemented, is such a benefit to the security of CHD that it should be freely available to everyone.

    In conclusion, you make very good points and all points that we (Shift4) have encountered ourselves. I just wanted to point out that all these obstacles can be overcome with the proper information and education.

    Steven M. Sommers
    Vice President Applications Development
    Shift4 Corporation — http://www.shift4.com

  2. David Taylor Says:

    Steven,
    Thanks a lot for the comments — I believe they are longer than the original column, which appears to be a new record.
    The tone of the comments is exactly right. When I was doing the interviews for http://www.PCIKnowledgeBase.com, these are the types of resistance I encountered even as I was saying positive things about tokenization, along the lines of my Feb 8th column.

    In order to move the market forward, we have to be able to address each of these objections factually.
    The PCI Knowledge Base was founded on the principle of the free exchange of knowledge and experience.

    StorefrontBacktalk promotes these same ideas, and I’m encouraged to take a position, even a controversial one, in order to promote discussion. I only wish more folks would take the time to present the issues and the facts as you have done.
    Thanks, Dave Taylor
    Founder, PCI Knowledge Base
    David.Taylor@KnowPCI.com

  3. Steve Sommers Says:

    Dave,

    I knew that was what you were doing, and your points are issues and concerns that we have already encountered. All the points are valid concerns for merchants, as they should be. I was just conveying how we counter the opposition. Keep doing what you do. I would rather have someone like you point out the various issues that can arise and address them than convey a grass-is-always-green message. I equate the latter to a sales brochure.

  4. A Reader Says:

    As I’ve said before, tokens are much less secure than properly implemented public key based encryption.

    Assuming the tokens are generated based on the account number, such as with a cryptographic hash, then the tokens are subject to a simple dictionary attack. If an attacker can freely access the tokenization routine, all the attacker has to do is feed every possible account number into the tokenizer until a match pops out. The attacker does not have to know the technical details of the hash routine (SHA-1 vs SHA-256 or MD5), all they need is access to it.

    [ I’ve personally written such an attack against a tokenizer (that was SHA-1 based) and run it from my own desktop PC, and I recover whole account numbers in an average of four seconds. (I fixed a bug that previously kept it spinning for up to 40 seconds.) So I know first-hand that account numbers can be recovered from tokens. ]
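    A sketch of such an attack, assuming a naive SHA-1-based tokenizer and a known BIN and last four digits (the function names are hypothetical, not taken from any real product, and the Luhn check is ignored for brevity):

```python
import hashlib

def naive_token(pan: str) -> str:
    # Hypothetical hash-based "tokenizer" of the kind described above.
    return hashlib.sha1(pan.encode()).hexdigest()

def dictionary_attack(target_token: str, bin_prefix: str, last4: str):
    # With the 6-digit BIN and last four digits known, only the middle
    # six digits vary, so at most a million candidates need testing.
    for middle in range(10**6):
        candidate = f"{bin_prefix}{middle:06d}{last4}"
        if naive_token(candidate) == target_token:
            return candidate
    return None

# Recover a (test) account number from its token.
pan = "4222220001232222"
assert dictionary_attack(naive_token(pan), "422222", "2222") == pan
```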

    So instead of protecting secret keys, you now have to protect this secret tokenization algorithm, in every single place it exists — registers, PIN pads, kiosks, web servers, etc. If you were unaware of this attack, you might not even know that you should be protecting it.

    On the other hand, public key encryption is extremely secure, and key secrecy in the field is a non-issue. Public keys are intended to be distributed publicly — that’s the point. It’s the private keys that must be carefully held, but since they are used only at the decryption point they can be securely stored in a single, hardened, dedicated hardware decryption appliance (such as an Atalla box or an IBM mainframe with a cryptographic coprocessor.)

    Sure, there are still points to harden in the stores, and a PCI audit is still a useful tool. File Integrity Monitoring can help ensure that encryption routines are not tampered with, and that rogue software isn’t poking about where it shouldn’t. Constant patching of the operating system is needed to ensure that the encryption routines remain secure (the cryptographic PRNG is a critical component for security in a public-key protocol.)

    That said, encryption is very easy to get wrong, and neither encryption algorithm design nor encryption protocol design should be left in the hands of amateur cryptographers. Any security system like this should be subjected to a rigorous review by several unbiased professional cryptanalysts, and should be based on sound designs with long track records in the cryptographic field.

    I don’t mean to say that tokenization is not “better than nothing”. Obviously, recovery of the account numbers from tokens requires technical knowledge of the systems, and certain levels of access. But an insider could certainly figure this out, and without adequate monitoring and other mitigating controls they could be recovering accounts from tokens today.

  5. Steve Sommers Says:

    I agree that properly implemented public key encryption, or PKI, can be one of the best forms of encryption for cardholder data (CHD), but there are two issues with it. To get the most benefit out of public key encryption in a point-of-sale (POS) environment, the POS cannot have access to the private half of the key pair. The problem is that many payment transactions require two steps: an authorization step and a later settlement step. Both steps require the CHD, so the POS would need access to the private key to decrypt the CHD for the second step. This negates the biggest advantage of PKI.

    The second issue is that PKI is probably the most expensive form of encryption to properly implement. By expensive, I’m not referring to licensing costs, because free code and libraries are available. I’m referring to the infrastructure changes required to support the physical and logical separation of roles between the systems that house the public and private keys, the database changes required to support storing the encrypted data, all the access controls required for the keys, the annual rekeying of archived data, and the fact that PKI is much more CPU intensive than single-key encryption techniques (Blowfish, AES, 3DES, etc.). Many of these costs are not PKI specific; they are costs associated with housing encrypted CHD in any form.

    Now, your whole argument assumes that "token" equals "hash" or "encrypted CHD," and that a hacker, if provided the tokenization algorithm, could de-hash (if that is a word) or decrypt the data. As the inventor of the term "tokenization," I can tell you the definition you are using is incorrect. Shift4 invented the term tokenization, not the concept. The concept has been around long before PCI, CHD or even computers. A token is simply an object (in this case, a piece of data) that symbolizes or is used to reference another piece of data: the CHD. A properly implemented token is not related in any way to the original data other than by reference. In law enforcement, case numbers are assigned to cases; most of the time these are simply sequential numbers. The case number itself is a token. There is no way to decrypt the case number to determine the contents of the case. This is why I say we invented the term, not the concept.
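    In code, reference-style tokenization is little more than a lookup table. A minimal sketch for illustration (assumed, not Shift4’s implementation):

```python
import secrets

class TokenVault:
    """A token is just a reference: a random lookup key with no
    mathematical relationship to the card data it points to."""
    def __init__(self):
        self._vault = {}  # token -> cardholder data, held by the vault only

    def tokenize(self, pan: str) -> str:
        # Last four digits retained for display, plus 12 random characters.
        token = pan[-4:] + secrets.token_hex(6)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault holder can ever get back to the real number.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("5400000000000005")
assert len(token) == 16 and token.startswith("0005")
assert vault.detokenize(token) == "5400000000000005"
```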

    Now there are vendors out there that are applying the same definition you did to the term tokenization. Against these implementations, your argument is very valid. But in our minds, these are not tokenization, or at least not properly implemented tokenization solutions — these we classify as hashing or encryption solutions.

    Lastly, I’ll leave you with a challenge. Below are test card numbers and their associated tokens in my testing database. I urge you or anyone to devise an attack on the tokens that reveals the corresponding card numbers:

    AX 373400000000001 0001d7byvlgqmdzf
    AX 373400000000001 0001713nfgjb20yt
    AX 373400000000001 0001j4nr6pjb2js3
    MC 5400000000000005 0005sf9fmmjb2yr3
    MC 5400000000000005 0005hw29x2jb2j9z
    MC 5400000000000005 0005ylmsg9jb2xnt
    VS 4222222222222 2222dxdh61lvq68l
    VS 4222222222222 2222f9nk2llvq92x
    VS 4222222222222 2222y9wswygqm278

    If you can crack this, I’ll buy your algorithm because you would have figured out a way to predict both future and past random events — the implications of which would be mind blowing (not to mention a great money maker ;-).

    –Steve

  6. A Reader Says:

    First, I’d like to thank you for taking the time to respond. Without any knowledge of how a system or protocol works I always assume the worst (it doesn’t pay to be kind in security), and it’s good to see that yours appears to be carefully designed, and is not merely a simple hash.

    I’d like to rebut a few of your arguments against public key encryption, and then do a little analysis based on the data in your challenge.

    In a PKI environment, the POS store systems absolutely do not need the private decryption key to reside in the terminal or store. Decryption for authorization and settlement purposes occurs only at the headquarters location, in a single secured environment.

    Archived data should never need to be rekeyed (a dangerously unnecessary operation) unless there is a true breach. As a matter of fact, archived data is quite effectively end-of-lifed by destroying the private key held in the decryption engine, and in all backups of the decryption engine.

    Absolutely there are costs associated with securing the private key. But I also assume there are comparable costs associated with securing access to the tokenizer, and especially to validating and authorizing the retrieval of the account numbers when it comes time to authorize and settle the account. I assume there would be little functional difference from the way it would be accomplished in a private key decryption operation.

    Now, on to your challenge (the fun part! :-)

    Assuming that what I see is a list of unique tokens that each represent the same account number, then your tokenizer(TM) is doing a good thing — you are issuing unique tokens every time data comes in. You are preventing the attack I described, because you’ve not based your solution on a hash. I am gratified to see that.

    My attack assumed a store-local token generator, one that repeatably returns the same token for the same account number (i.e., a hash). So, you pass that security test.

    A little analysis of your tokens’ structure reveals you are fitting them into a 16-character field. Assuming the first four characters represent the last four digits (a good idea), you are filling the remaining 12 characters from a character set of 36 unique values (0-9 and a-z), which means you have space for unique numbers up to 36^12, or 18 numeric digits. Not much room in there for a salted hash, but there is room for encryption of up to a 19-digit PAN (omitting the Luhn check digit). By the way, I applaud your choice to exclude case, as case-sensitivity requirements make humans much more error prone.
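    The capacity arithmetic above is easy to verify:

```python
import math

charset_size = 36   # digits 0-9 plus lowercase a-z
free_chars = 12     # 16-character token minus the 4 retained PAN digits
capacity = charset_size ** free_chars

# 36^12 is roughly 4.7 * 10^18: enough to enumerate any 18-digit number,
# which covers a 19-digit PAN once the derivable Luhn check digit is dropped.
assert capacity == 4738381338321616896
assert 10**18 < capacity < 10**19
assert math.floor(math.log10(capacity)) == 18
```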

    I will note that without a locally available token generator, you cannot generate tokens when offline to your tokenizer, and therefore you must take some other action to protect the data in those circumstances. I do not have evidence to suggest how that might work in this environment. (As a plug, public key cryptography solves that problem nicely.)

    But how does the client obtain the token, and still have a recoverable account number for settlement? You must pass the real account number into the tokenizer at some point. Therefore I assume your tokenizer is a service operated at a central location, fronting a token generating database, and that the local tokenizing routine run at the terminals is a simple proxy to this service.

    And therefore, you must be transmitting the actual account number over the wire to the tokenizer — just as you would transmit the account number to perform an authorization. You’ve reduced the transmission of the account number from two times (auth and settlement) to one time (tokenization). But it’s still transmitted in a recoverable state to the central tokenizer.

    Therefore, your security is equal to that of your line security. I assume that you transmit over an encrypted line. (If you’re transmitting the account number in clear text to the tokenizer, that would be an Epic Failure to protect the data.)

    I’m basing the rest of this analysis on the assumption that you do not send the account number in the clear to the tokenizer.

    Your security now rests on the strength of the transmission encryption OR (not and) the strength of the security surrounding the token database. If either point is successfully attacked, account numbers can be recovered.

    Regarding transmission, there are three general ways you can protect the data: you can use TLS (by which I mean SSL or any flavor of public key encryption) with certificates, TLS without certificates, or secret key encryption.

    Since you’ve expressed concern over both certificate handling and the performance of public key cryptography, it makes me suspect you’re using secret key encryption to perform the transmission of the account number. And if you are using secret key encryption, the secret key must be available to (or embedded in) the token-sending-proxy. That can be recovered by an attacker who gains access to the machine or disk image.

    If you are using TLS without certificates, your tokenization is subject to a man-in-the-middle interception attack. It’s not particularly easy for a layman to do, but it’s certainly not difficult. And it can be done without access to the POS terminal at all.

    Lastly, if you’re using TLS with certificates, congratulations, you’re using PKI.

    So in the final analysis, it appears to me that your security rests on encryption technology. Yes, it’s transient; yes, it’s a one-time shot; and yes, the data storage is as secure as your token database. But your solution cannot escape the reality of securing the data in flight any more than any other encryption solution can. Plus, it introduces the still unknown (unpublished?) security of your token database.

    Now, if I’ve made any incorrect assumptions, please let me know. Without seeing the architecture and the interfaces to each system, I can only make educated guesses, but I could certainly be wrong at many, many points in the above analysis.

    Lest you leave thinking I’m entirely negative, there are other factors working in your favor, and I’d like to reiterate those for anybody else still reading at this point. The most important is that your solution appears to be technically "available" to a typical retailer looking to purchase a solution. As this discussion illustrates, engineering a secure solution is not easy, and therefore it is expensive.

    I also assume that you’ve done the “hard work” of securing access to your centralized tokenizing equipment, providing retailers with not only the hardware but a simple to follow script of “type this here, put this smart card here, type this there, and now you’re secure”. Simple instructions that lead to 99% protection are much more effective in the real world at securing most data than a theoretically secure solution that requires an army of professional cryptographers to implement.

  7. Steve Sommers Says:

    Wow, I rarely come across posters that can keep up with me in text length and put as much thought as you have in what they type — want a job? I’ll try my best to summarize your points prior to my response so other readers don’t have to jump back and forth (too much).

    WARNING: Much of the detail that follows describes the Shift4 implementation of tokenization. Because of this, it may come across as a sales pitch. This is not my intent; it just happens to be the implementation that I am most familiar with and the only one I can speak on with any authority.

    PKI decryption/centrally located secure environment at headquarters — I was not assuming that you were talking about this level of merchant. I usually gear my arguments toward all merchants, but especially Level 4 merchants, because they are the ones feeling the most pain. Many merchants that we target do not have the resources or the expertise to create a secure, centralized CHD environment, so I was assuming that all the work was being done at the POS. We released the tokenization concept to the public domain so merchants and others could build their own HQ-type payment environment, but for many merchants, outsourcing this central system is much less expensive.

    Archive/rekeying — Destroying the key is a compliant and effective way to force an end-of-life on the data. But PCI 3.6.4 requires annual rekeying of the data as long as it still lives or is accessible. In our system that is two years; I’ve heard some merchants retain up to seven years. IMHO, this rekeying requirement in PCI adds more security problems than it solves. I understand what they are trying to address, but there are much better ways to accomplish the same goal — that’s another topic. Since the requirement exists, it must be addressed, either by shortening the life of the data to under 12 months, rekeying the data, or some other compensating control. Tokens (at least outsourced tokens) do not have this issue.

    Tokenization costs vs. PKI costs — When I refer to costs, I am talking about merchants using our outsourced gateway solution. Assuming there is a PKI outsourced gateway equivalent (which I’m not aware of), the cost difference is in the modifications required for the POS to handle the token or the PKI-encrypted data. As you noted further down, our token is alphanumeric and fits within 16 bytes, which just happens to be the length of most existing card numbers and the length most POS applications have reserved to store CHD. With PKI, on the other hand, I’m not sure it is possible to fit the encrypted data within the same POS database size limitations.

    Token format? — Our token is composed of the last four digits of the card number followed by twelve bytes of random alpha data. In reality, the "random" data boils down to nothing more than a sequential number run through a big-prime calculation, simply to produce pseudo-random-looking results. Security-wise, 2222idkdhjgeesqm followed by 2222japposidmdss is no more or less secure than if we returned 2222aaaaaaaaaaaa followed by 2222aaaaaaaaaaab. As in my analogy, we think of a token as a case number that references CHD. The pseudo-random-looking token is used to impress and to make it look challenging to decipher. What I described here is our implementation of token assignment; our public domain release of tokenization did not describe this level, so other merchant and vendor implementations will differ.
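    The scheme described above (a sequential counter multiplied by a large prime, reduced modulo the token space) can be sketched as follows. The specific prime and helper names are illustrative, not Shift4’s actual values:

```python
CAPACITY = 36 ** 12              # 12 base-36 characters of token body
PRIME = 2305843009213693951      # 2^61 - 1, a Mersenne prime
ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyz"

def to_base36(n: int) -> str:
    # Fixed-width 12-character base-36 rendering of the token body.
    chars = []
    for _ in range(12):
        n, r = divmod(n, 36)
        chars.append(ALPHABET[r])
    return "".join(reversed(chars))

def make_token(pan: str, sequence: int) -> str:
    # Multiplication by a prime coprime to 36^12 is a bijection mod 36^12,
    # so distinct sequence numbers can never collide -- the output merely
    # *looks* random, exactly as described above.
    body = (sequence * PRIME) % CAPACITY
    return pan[-4:] + to_base36(body)

t1 = make_token("4222222222222", 1)
t2 = make_token("4222222222222", 2)
assert t1 != t2 and t1.startswith("2222") and len(t1) == 16
```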

    Local token generation — Very good catch; many don’t catch this! During normal online conditions, our data center assigns the tokens and these are passed back to the POS. During offline conditions, our UTG (which I describe as a proprietary VPN end-point on steroids) has the ability to generate local tokens and store these tokens with the associated CHD in an offline database using PKI. Once online conditions are restored, the offline file is transmitted to our data center and the local file is deleted. The local offline storage file is using PKI but it’s transparent to the merchant and the POS (provided the POS already supports our API and tokenization).

    Obtain token for settlement — With our API, the CHD is never returned to the POS. Instead, the POS should use the token as the CHD. When the POS performs the settlement, it sends us the token. Behind the scenes, we access the original CHD using the token and send it off to the bank or processor. When you design or compare tokenization solutions, this is a key point to consider. IMHO, CHD should never be returned to the POS or any application you are trying to protect using tokenization. Doing so simply opens up a hole that can negate all the benefits of tokenization.
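    The settlement flow described above amounts to a server-side swap; a sketch, with a hypothetical mapping and function name:

```python
# Gateway-side mapping from token to cardholder data (illustrative values).
gateway_vault = {"0005sf9fmmjb2yr3": "5400000000000005"}

def settle(token: str) -> str:
    # Runs at the gateway, never at the POS: the token is exchanged for
    # the real card number only when the request goes to the processor.
    chd = gateway_vault[token]
    return "forwarding settlement for card ending " + chd[-4:]

# The POS submits only the token it stored at authorization time.
assert settle("0005sf9fmmjb2yr3") == "forwarding settlement for card ending 0005"
```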

    Encryption and data pipe security (TLS) — As a rule of thumb, almost all our encryption pipes use a hybrid model: PKI for the initial handshake, then a shared secret using a DUKPT-like model (derived unique key per transaction). The PKI handshake is used to pass a dynamically assigned random key page for the DUKPT session.

    Man-in-the-middle attack — We use various levels of authentication to prevent this, including our own certificate signing, locking down access to specific IP addresses, locking down access to specific MAC addresses, and more stuff that is beyond my expertise.

    To everyone, again, I’m sorry if this sounds like a sales pitch; it’s not the intent. I’m arguing the advantages of tokenization vs. a PKI only solution for handling CHD. At this level the only way I can think to convey the advantages is to give details and for this I have to stick to what I know most intimately. Hopefully you can equate how Shift4 accomplishes tokenization to how to properly implement tokenization with or without Shift4 in the mix (obviously I’m hoping in the mix ;-).

  8. A Reader Says:

    Again, I appreciate your reply, and I’d like to say I’m impressed by your solution.

    Regarding 3.6.4, it certainly does NOT say that you must re-encrypt the data. (We all know that’s a security loophole a mile wide.) It says “Periodic changing of keys — at least annually”, which means you must change *encryption* keys at least annually. By cycling a different public key through on a periodic basis, a PKI system is compliant. The encrypted data can remain safely encrypted with the old key for its lifetime. Merchants are merely prohibited from adding new data with the old key after one year. Of course, if you were encrypting the data with a secret key algorithm such as AES and used only one key for all encryptions and decryptions without benefit of a key identifier, you would be unable to change the encryption key without re-encrypting the data itself. That way lies madness.
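    The key-identifier approach described above can be sketched like this (field names are assumed for illustration):

```python
# Each stored record names the key that encrypted it, so rotating to a new
# key for NEW data leaves old ciphertext readable without re-encryption.
records = []

def store(ciphertext: bytes, key_id: str) -> None:
    records.append({"key_id": key_id, "data": ciphertext})

store(b"<ciphertext under 2007 key>", "key-2007")
# Annual key change per PCI DSS 3.6.4: new data uses the new key...
store(b"<ciphertext under 2008 key>", "key-2008")

# ...while old records still identify the key that can decrypt them.
assert [r["key_id"] for r in records] == ["key-2007", "key-2008"]
```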

    And regarding encrypted CHD size, you’re correct that a PKI solution produces a larger data block — much larger, in fact. A two-phased encryption, such as PGP uses, requires the generation of a random session key used in a traditional symmetric key encryption of the data like 3DES or AES, plus the encryption of the session key via DH, RSA or other public key algorithm. The encrypted data can be encoded to fit in its original size, but the encrypted session key will be at least 1024 bits (128 bytes) long, and you’d be wise to use a 2048 bit (256 byte) key. In order to decrypt it, you’ll want to add a public key identifier so you can find the right decryption key. It easily amounts to over 300 bytes per account number for a very simple solution. And you’re absolutely correct in assuming that does not fit into existing retailer databases very well.
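    The storage figures above check out with simple arithmetic; the key-identifier size is an assumption, the rest are the figures stated in the comment:

```python
encrypted_pan = 16         # symmetric-encrypted PAN, roughly original size
wrapped_session_key = 256  # session key encrypted under a 2048-bit RSA key
key_identifier = 32        # assumed size of the public-key identifier field
total = encrypted_pan + wrapped_session_key + key_identifier

# Comfortably over 300 bytes per account number, as stated above --
# far larger than the 16-19 characters retailer databases reserve for a PAN.
assert total == 304 and total > 300
```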

    I’m glad you took the time to describe Shift4 in greater detail. As I mentioned before, without disclosure of the mechanisms there is no way to trust a simplistic description, and in the vacuum of facts I had made some incorrect assumptions. I feel much more comfortable about the Shift4 algorithm now.

    As you’ve already figured out, I’m associated with a firm larger than a level 4 merchant, so I’m not looking for a job right now :-) We have already implemented a compliant PKI-based solution, so we’re not shopping for a different tokenizing system at this time. But I’ll keep this in the back of my mind. Thanks again!
