1-800-Flowers’ Web Tweak Accidentally Grows Thorns
Written by Evan Schuman

The unintended consequence of a programming change has become as much a part of E-Commerce life as Page Not Found. But 1-800-Flowers discovered the latest iteration last month, when a well-intentioned attempt to speed up customers' Web experience on Valentine's Day made the site appear, to some monitoring firms, to have crashed, causing no end of headaches.
The move itself certainly made sense. The retailer tried moving non-customer traffic, such as search engine bots and agents that track various Web performance metrics, to slightly slower servers, with the intent of making the experience much faster for paying customers. And if a bot for Bing, Yahoo or Google gets its data a little more slowly, no harm done. That was the theory.
But slowing down responses to a Google spider, for instance, leads Google to conclude the site itself is slow, which can hurt a retailer's ranking on the search giant. It also caused some companies that track, and publicly report on, Web performance to see the retailer's Web presence as slow or, in this case, down completely.
The IT team at 1-800-Flowers initially thought the reports of the site being down involved a carrier hiccup, but it soon turned its attention to internal server changes. The following explanation comes from an unidentified member of the 1-800-Flowers IT team, commenting on an outage report issued by uptime-tracking firm Pingdom, relayed in an E-mail from Joe Pititto, the investor relations VP for 1-800-Flowers.
“On [February] 11th, we moved our traffic to run out of our primary datacenter. If [Pingdom’s] bot traffic was pointing to our backup datacenter or any of its associated cells, they would have seen an issue,” said the IT person. “Pingdom monitors our site using a special user agent called the Pingdom.com_bot. It is possible that some of our cells have differential treatment for this user agent, as we have a number of redirect rules for different user agents for mobile users, search engines, etc. If Pingdom were to monitor our site using a standard user agent like Internet Explorer or Firefox, they would not have encountered this problem. To prove this theory though, we will have to get into an interactive troubleshooting exercise with Pingdom. This might also explain why sometimes Gomez sees us as down, when in fact we are alive and well.”
Pingdom responded that its research into the problem came up with the same conclusion. “Clearly, the Pingdom user agent is routed differently by your systems. Responses/load times are much slower (several seconds versus less than one second) than if we use the Firefox user agent, and we see this consistently from all 25 locations,” wrote Pingdom’s Peter Alguacil. “So, I’m guessing you’re redirecting them to different servers, and the one for ‘bots’ (depending on your criteria) is performing very poorly. So in this (very rare) case, Pingdom doing the ‘polite’ thing and clearly identifying itself to your servers actually made a performance difference.”
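To picture how such user-agent rules can send a monitoring bot down a different path than a shopper, consider the minimal sketch below. It is purely illustrative, not 1-800-Flowers' actual configuration: the patterns, pool names and function are assumptions, but they show why a monitor that politely identifies itself (the Pingdom.com_bot) would land on different, slower servers than the same check made with a Firefox user agent.

```python
# Hypothetical sketch of user-agent-based routing, similar in spirit to the
# redirect rules 1-800-Flowers describes. Patterns and pool labels are
# illustrative assumptions, not the retailer's real setup.
import re

# Known non-customer agents: search-engine crawlers and monitoring bots.
BOT_PATTERNS = [
    re.compile(p, re.IGNORECASE)
    for p in (r"googlebot", r"bingbot", r"slurp", r"pingdom", r"gomez")
]

def choose_server_pool(user_agent: str) -> str:
    """Return the upstream pool a request should be sent to.

    Requests that identify themselves as bots are shunted to a slower,
    lower-priority pool so paying customers get the fast servers.
    """
    if any(p.search(user_agent or "") for p in BOT_PATTERNS):
        return "bot-pool"        # slower backup servers
    return "customer-pool"       # primary, fast servers

if __name__ == "__main__":
    # A monitor that identifies itself gets routed to the slow pool...
    print(choose_server_pool("Pingdom.com_bot version 1.4"))
    # ...while the same check posing as a browser gets the fast pool.
    print(choose_server_pool("Mozilla/5.0 (Windows NT 6.1; rv:5.0) Firefox/5.0"))
```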
The “polite” reference is a point made by multiple Web performance firms, and it may be at the heart of this issue.