Todd Nemet will be speaking about “Technical SEO” at SearchFest 2010, which will take place on March 9th at the Governor Hotel in Portland, Oregon. Tickets are available now.

1) Please give me your background and tell us what you do for a living.

I have been working with Vanessa Fox at Nine By Blue since November of last year. I’m helping her assess companies’ search acquisition strategies from a technical point of view and doing other odd jobs, basically whatever needs to be done since we are a startup.

Before that I was at Google for six years. I worked on an AdWords sales engineering team that helped Google’s largest advertisers understand Google’s technology, including the AdWords API, Analytics, and Webmaster Tools. I started at Google in their enterprise business, running the beta program and managing support for the Google Search Appliance, which gave me a good grounding in Google’s search technology.

I’ve been working with search engines in one form or another since I started at Verity (now Autonomy) in 1994. I’m familiar with enterprise search, site search, and web search. During the dot-com years I was at Inktomi, working with proxy caches and helping to build out a lot of the early content distribution networks.

Nine By Blue: https://www.ninebyblue.com/

2) What questions should a business ask to make sure a prospective CMS is SEO-friendly?

The most important thing is to make sure it is configurable with rich APIs and an active developer community. Things change so frequently on the Internet (especially with search engines) that even if you could find a CMS that is ideal out of the box, it won’t be so ideal after a year or so. When Google supports microformats appropriate to your business, you want to be able to add that. When Google changes over to their new News Sitemaps format, you want to be able to support that too. Who knows what other changes will be necessary to remain competitive in the future?

The downside of this flexibility, of course, is that it gives you enough rope to hang yourself through misconfigurations. Also, a large developer community can result in a bunch of "SEO" plug-ins that may not be useful for your particular situation. But this is a much better problem to have than not being able to keep up.

3) Please give me some important technical website issues that SEOs should focus on but frequently overlook.

Crawler efficiency: A lot of the time, websites appear to be doing fine in search, but as I dig into them I realize that the search engines are crawling three or four times more URLs than necessary. Fixing this will result in increased freshness, better coverage, or both. Fortunately, there are many techniques available to increase crawler efficiency, such as redirects, canonical tags, Google Webmaster Tools parameter-handling configuration, moving information from URL parameters into cookies, and judicious use of the robots.txt file.
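To make the duplicate-URL problem concrete, here is a minimal Python sketch. The parameter names and example URLs are hypothetical, and the normalization rules are only illustrative, but it shows how several crawlable variations of the same page collapse into one canonical URL, which is essentially what canonical tags and parameter-handling settings tell the search engines to do for you.

    # A minimal illustration, not a complete canonicalization policy: the
    # ignored parameter names and example URLs below are hypothetical.
    from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

    IGNORED_PARAMS = {"sessionid", "utm_source", "utm_medium", "sort"}

    def canonicalize(url):
        """Collapse URL variants that serve the same content into one form."""
        parts = urlparse(url)
        query = [(k, v) for k, v in parse_qsl(parts.query)
                 if k.lower() not in IGNORED_PARAMS]
        query.sort()
        return urlunparse((parts.scheme, parts.netloc.lower(), parts.path,
                           "", urlencode(query), ""))

    crawled = [
        "http://www.example.com/product?id=42&sessionid=abc123",
        "http://www.example.com/product?utm_source=feed&id=42",
        "http://www.example.com/product?id=42&sort=price",
    ]

    # All three fetches above land on the same content; pointing them at one
    # canonical URL frees that crawl budget for pages that actually differ.
    print({canonicalize(u) for u in crawled})

Run against a real crawl log or server log, a mapping like this makes it easy to see how much of the crawl is being spent on duplicates.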

Page speed: I am convinced that this will become a more important factor for search engine coverage as time goes on. Landing page speed is already being used by AdWords, so it’s not unreasonable to assume that it will be used for natural search soon. But even if it isn’t, there are second-order effects of a slow site that will affect how search engines treat it, such as reduced crawler efficiency and users bouncing back to the SERPs. Fortunately, most sites I work with can greatly improve page speed with just a few tweaks, like adding cache-control headers, returning 304s appropriately, and enabling compression for all file types (not just text/html).
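As a rough way to check those three tweaks on your own site, here is a small Python sketch using only the standard library; the URL is a placeholder. It looks at a page's caching headers, whether compression was applied, and whether the server answers a conditional request with a 304.

    # A rough header check for the tweaks discussed above; the URL is a placeholder.
    import urllib.error
    import urllib.request

    URL = "http://www.example.com/"

    request = urllib.request.Request(URL, headers={"Accept-Encoding": "gzip"})
    response = urllib.request.urlopen(request)

    # 1) Cache control: are expiration/validation headers being sent at all?
    print("Cache-Control:", response.headers.get("Cache-Control"))
    print("Last-Modified:", response.headers.get("Last-Modified"))

    # 2) Compression: did the server honor the Accept-Encoding request?
    print("Content-Encoding:", response.headers.get("Content-Encoding"))

    # 3) Conditional GET: a correctly configured server returns 304 Not
    #    Modified instead of resending the full page.
    last_modified = response.headers.get("Last-Modified")
    if last_modified:
        conditional = urllib.request.Request(
            URL, headers={"If-Modified-Since": last_modified})
        try:
            urllib.request.urlopen(conditional)
            print("Conditional GET returned 200; 304 handling may be missing")
        except urllib.error.HTTPError as err:
            print("Conditional GET returned", err.code)

This only covers the quick wins mentioned above; tools like webpagetest.org or the browser's network panel will give a much fuller picture of where time is going.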
