The E-Commerce SEO Game May Soon Have To Deal With Page Load Speed
Written by Evan Schuman

If an E-Commerce director had to choose a single marketing weapon, there’s little doubt that an effective search engine optimization (SEO) campaign would be among the first choices. After all, the retailer that comes up first when a Google search seeks HDTVs, a Yahoo search wants Wii purchase points or a Bing inquiry asks for refrigerators stands to make a generous extra helping of cash.
It’s for that reason that the word trickling out of Google, saying page load speed will likely become a criterion in how sites rank, is noteworthy. Google Software Engineer Matt Cutts, speaking at PubCon last month, said that a speed-influenced ranking won’t happen in time for this holiday season but is seriously being considered for early next year.
“We’re starting to think more and more about ‘Should speed be a factor in Google’s rankings?'” Cutts said. “Historically, we haven’t used it in our search rankings, but a lot of people within Google think that the Web should be fast. It should be a good experience, so it’s sort of fair to say ‘If you’re a faster site, maybe you should get a little bit of a bonus.'”
Before we get into whether this change will be a good or bad thing (time saver: it’s a good thing), let’s get the pragmatic points out of the way. Few details have emerged about what kind of weighting load time would be given, but it’s likely to be minimal, at least in the beginning.
Google seems to want to say that speed is in its weightings more than it wants to actually punish sites, hoping that those words alone will boost the speed of enough sites to make a collective improvement. Under this line of thinking, relevance and size would still get a huge weight, so the Wal-Marts and Walgreens of the world will still get the considerable clicks that their current SEO efforts are delivering.
That said, is the overall effort a good one? Much of that depends on how sophisticated Google—and Yahoo and Bing, which are certainly likely to follow Google’s lead in this area—gets. Will it acknowledge that sites selling games, electronics and video will invariably have more multimedia content, which would slow them down? How will Google deal with huge sites, where averaging all page load times may not make sense?
The stated goal of this effort—to encourage sites to be as fast as possible—is hard to argue with, up until the point that it starts costing companies sales because rivals push ahead of their homepages by a few milliseconds. Will Google weigh these load times based on speed differentials or by sequence? In other words, would there be a rule that says something like “If the speed difference between four competitive sites is not more than XX seconds apart, no site gets an advantage. It defaults to our old criteria”?
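To make that hypothetical concrete, here’s a minimal sketch of how such a threshold rule might work. To be clear, this is pure speculation: Google has published no such algorithm, and the tolerance value and function name here are invented for illustration.

```python
# Hypothetical sketch of the threshold rule floated above: if competing
# sites' load times fall within some tolerance of one another, speed
# confers no advantage and ranking defaults to the old criteria.
# The tolerance value is a placeholder, not anything Google has stated.

SPEED_TOLERANCE_SECONDS = 0.5  # the "XX seconds" in the rule above


def speed_breaks_tie(load_times: list[float]) -> bool:
    """Return True only when the spread among competing sites' load
    times exceeds the tolerance, i.e., speed is allowed to matter."""
    return max(load_times) - min(load_times) > SPEED_TOLERANCE_SECONDS


# Four competitive sites within half a second of each other: speed is
# ignored and the old relevance criteria decide the ranking.
print(speed_breaks_tie([1.2, 1.3, 1.4, 1.6]))  # False -> old criteria
# One site lags well behind the pack: speed becomes a factor.
print(speed_breaks_tie([1.2, 1.3, 1.4, 2.5]))  # True  -> speed counts
```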
Brad Canham, the vice president of business development at Web traffic monitoring site Dotcom-Monitor, argued that this Google change may have a profound impact on the largest E-tailers.
“If a large retailer drops a single ranking in its keyword starting in 2010 because of a slow server response time, it could cost them hundreds of thousands of dollars in a year” and possibly much more than that, he said. “For the first time, this is being brought front and center as a factor for organic search. Therefore, the pressure will be increased for IT directors to monitor and respond to Internet infrastructure elements like server response time even faster to maintain or improve Google page ranking. The bottom line is that, next year at this time, the online retailers with optimized Web pages that are monitoring their server response times are likely to have a leg up on slower competitors.”
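The kind of monitoring Canham describes doesn’t have to start with an expensive service. Here’s a minimal sketch, using only the Python standard library, of timing a storefront’s response; the URL and alert threshold are placeholders, and a real deployment would sample from multiple locations on a schedule and feed a proper alerting pipeline.

```python
# A bare-bones response-time check of the sort Canham describes.
# The URL and threshold are placeholders for illustration only.
import time
import urllib.request

URL = "https://www.example.com/"  # placeholder storefront URL
SLOW_THRESHOLD_SECONDS = 2.0      # illustrative alert threshold

start = time.monotonic()
with urllib.request.urlopen(URL, timeout=10) as response:
    response.read()  # fetch the full page body, not just the headers
elapsed = time.monotonic() - start

if elapsed > SLOW_THRESHOLD_SECONDS:
    print(f"ALERT: {URL} took {elapsed:.2f}s to respond")
else:
    print(f"OK: {URL} responded in {elapsed:.2f}s")
```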
Granted, any company that earns its living selling monitoring services has a strong incentive to make such a claim. But if Google acts on its comments, it’s a legitimate consideration. All performance issues have to be layered atop the Web’s—and even the Internet’s—overall traffic patterns. Will Google universally give every retailer a 10-glitch forgiveness pass? It’s sort of like letting all of the students in a class drop their lowest test score before final grades are calculated. If doled out uniformly and consistently, it couldn’t hurt.
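If such a pass existed, the mechanics could be as simple as discarding a site’s worst measurements before averaging. The sketch below is, again, pure conjecture; the sample data and the forgiveness count are invented to show the arithmetic.

```python
# Conjectural illustration of a "forgiveness pass": drop a site's N
# slowest load-time samples before averaging, so transient Internet
# glitches don't drag down its score. All numbers here are invented.

def forgiving_average(samples: list[float], forgiven: int = 10) -> float:
    """Average load times after discarding the `forgiven` slowest samples."""
    kept = sorted(samples)[:-forgiven] if forgiven else samples
    return sum(kept) / len(kept)


# A day of mostly fast loads marred by ten slow outliers:
samples = [1.1] * 90 + [9.0] * 10
print(f"raw average:       {sum(samples) / len(samples):.2f}s")  # 1.89s
print(f"forgiving average: {forgiving_average(samples):.2f}s")   # 1.10s
```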
If this move forces administrators to spend a little more time considering customer interaction and a little less on launching a site that simply looks impressive, we’re not so sure that would be such a bad thing.
December 4th, 2009 at 8:16 am
Performance is very often overlooked when specifying a web application; it tends to be thought about at the end of the development process, when it’s too late or too expensive to do anything about. Perhaps now that a high-performance web app has a direct impact on the bottom line, performance will become something that designers and engineers consider from the beginning.
December 4th, 2009 at 9:49 pm
As a consumer, I know that if all the ads and other whiz-bang stuff on a page are held up by slow secondary servers or other outsourced links, I just move on and try another site from the search.
Grins
December 10th, 2009 at 4:27 pm
I’m not going to believe that just by having a faster site than my competitors I’ll rank above them. I personally believe that what Google is saying is that if you have a site slower than average, you might get some sort of penalty in your rankings.