Shell Game: Who Should Handle Payment Duplicate File Checks?
Written by Evan Schuman

As new details emerge from Shell Oil's late January IT glitch, which caused some $12 million in duplicate payment-card charges at thousands of Shell retail stores, the incident is looking less like a store-and-forward situation and more like a glitch in the oldest technique in the retail book: a file of charges handled through batch processing. This is something retail IT should have gotten down cold decades ago, a textbook process issue.
Shell initially said the double charges were the result of an outage caused by an AT&T telco problem. Then, when AT&T declared that it hadn't suffered any outage, Shell said the outage was internal. Then it appeared that there hadn't been any outage at all, merely a file of card charges that was submitted twice.
After promising interviews with IT managers on Monday and then on Tuesday, Shell changed gears on Wednesday. “The cause of the system issue is considered confidential,” E-mailed Shell spokesman Theodore Rolfvondenbaumen, only a few minutes after he said a different Shell manager was handling the interview arrangements and that he had no update on the status.
But the file-based approach explains much. Shell had said the incident happened on January 29, while customers reported seeing the duplicate charges on both January 29 and January 28. Had it been a file of charges submitted on the 29th, it would almost certainly have included transactions from the 28th, because a settlement batch typically sweeps up the prior day's purchases. Asked if that was the case, Rolfvondenbaumen E-mailed: "That is an assumption. The system issue occurred on the 29th." If the system issue in question was the double submission, that fits.
Andy Orrock, a payment consultant for Online Strategies who tracks these types of payment issues, said his reading of the Shell details—including a review of the First Data confidential memo describing the incident that StorefrontBacktalk shared with Orrock—is that it appears to be “a file-based screw-up, meaning that we’re probably talking about credit and offline [PIN-less] debit.”
He added: “Somehow, somebody [at Shell] injected a file into the system twice.”
The problem with that, though, is the same issue raised in the store-and-forward concerns. “Somewhere along the line, these systems are supposed to have duplicate file-checking,” Orrock said. “How robust of a duplicate file check do they have in place?”
The first question, though, is to define "they." Should these types of checks have been at the Shell level or the First Data level? Or both?
Checks like this can also be done at the retailer level, where each file submission is compared against earlier submissions before it goes out; if that comparison isn't handled rigorously, a duplicate can slip through. Theoretically, the best place for such duplicate checks is at the processor. But that's expensive, and who should cover that cost?
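Neither Shell nor First Data has described its actual mechanism, but the kind of duplicate file check Orrock is talking about can be sketched simply. The following is a minimal illustration in Python, assuming settlement batches arrive as flat files and that a fingerprint of every accepted file is kept for comparison; the file names and the index file are hypothetical, not anything from Shell's or First Data's systems.

```python
# A minimal sketch of a duplicate-file check of the kind Orrock describes.
# Assumptions (not from the article): batches arrive as files, and a
# fingerprint of each accepted file is retained for later comparison.
import hashlib
import json
from pathlib import Path

SEEN_INDEX = Path("accepted_batches.json")  # hypothetical index of past submissions


def file_fingerprint(batch_path: Path) -> str:
    """Return a SHA-256 digest of the raw batch file contents."""
    return hashlib.sha256(batch_path.read_bytes()).hexdigest()


def load_seen() -> dict:
    """Load the digest-to-filename index of previously accepted batches."""
    if SEEN_INDEX.exists():
        return json.loads(SEEN_INDEX.read_text())
    return {}


def accept_batch(batch_path: Path) -> bool:
    """Accept a batch only if an identical file has not already been processed."""
    seen = load_seen()
    digest = file_fingerprint(batch_path)
    if digest in seen:
        # Same bytes already submitted: reject instead of double-charging.
        print(f"REJECTED duplicate batch {batch_path.name} (first seen as {seen[digest]})")
        return False
    seen[digest] = batch_path.name
    SEEN_INDEX.write_text(json.dumps(seen, indent=2))
    print(f"Accepted batch {batch_path.name}")
    return True
```

In practice, a processor-level check would presumably also compare batch header fields such as merchant ID, batch date, transaction count and dollar totals, so a regenerated batch with the same contents gets flagged even when the raw bytes differ.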
One answer is to automate more of the process at the retail level, so there are fewer steps for a human to screw up. That, too, costs money.
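Again, this is not a description of Shell's systems. It is a minimal sketch, assuming the retailer assigns each settlement batch an ID and records it before transmission, of how automation can make a second, manual injection of the same batch fail harmlessly; the names BatchTracker and submit_batch are illustrative.

```python
# A sketch of a retailer-side guard: record each batch ID once, so a
# repeat submission of the same batch is refused automatically.
import sqlite3
from datetime import datetime, timezone


class BatchTracker:
    """Records which settlement batches have already been sent."""

    def __init__(self, db_path: str = "batch_state.db"):
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS batches ("
            " batch_id TEXT PRIMARY KEY,"
            " submitted_at TEXT NOT NULL)"
        )
        self.conn.commit()

    def submit_batch(self, batch_id: str, send) -> bool:
        """Send a batch exactly once; refuse any repeat of the same batch_id."""
        try:
            # The PRIMARY KEY constraint makes this insert fail on a repeat
            # batch_id, so the duplicate is blocked before anything is sent.
            self.conn.execute(
                "INSERT INTO batches (batch_id, submitted_at) VALUES (?, ?)",
                (batch_id, datetime.now(timezone.utc).isoformat()),
            )
            self.conn.commit()
        except sqlite3.IntegrityError:
            print(f"Batch {batch_id} was already submitted; refusing to resend")
            return False
        send(batch_id)  # hand off to whatever actually transmits the file
        return True
```

Recording the batch ID before the file is actually sent errs on the side of under-submitting; a production version would also need a way to recover when a send fails after the record is written.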
As Shell knows, the cost is not merely the extra processing or the hard cost of reimbursing customers who were hit with bank fees because of the duplicate debit charges. It's also the hit to the chain's reputation.