The Advanced SEO track is all about finding the key technical changes and strategies that will move the needle for your website or campaign. Two speakers with real-world experience brought their tool sets and knowledge. First up is Matthew Brown of AudienceWise, previously with The New York Times. Second is Todd Nemet, Director of Technical Projects at Nine by Blue. The session is moderated by Scott Hendison of Search Commander.


Matthew Brown, AudienceWise

Choose a CMS that gives you a good site architecture, not one that pushes you around. Consider a CMS with a big community, such as WordPress or Drupal; both offer lots of flexibility and many plug-ins.

Bad CMS = Bad URLs + Bad Structure
Remember to use 301 redirects, and if needed you can use the rel=canonical tag.
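
As a quick illustration, here is a minimal Flask sketch showing both techniques: a permanent 301 redirect from a retired URL and a rel=canonical link on a page reachable at more than one address. The routes and URLs are hypothetical examples, not anything from the session.

```python
# Minimal sketch; the routes and URLs here are hypothetical examples.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-category/widgets")
def old_widgets():
    # Permanently redirect a retired URL to its new home.
    return redirect("/widgets", code=301)

@app.route("/widgets")
def widgets():
    # Duplicate or parameterized versions of this page should all
    # point at one canonical URL via rel=canonical.
    return """<html>
  <head>
    <link rel="canonical" href="https://www.example.com/widgets" />
    <title>Widgets</title>
  </head>
  <body>Widget catalog</body>
</html>"""

if __name__ == "__main__":
    app.run()
```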

Catalog websites can pose a big problem with duplicate content and URLs. E-commerce sites with different sizes and colors can mean thousands of URLs that confuse the search engines. Using a CMS that creates fewer layers is better; this is a growing trend to watch in 2011.

Duplicate content gives the search engines a good reason to bail out of a site, and 404 errors are especially bad; both can be detrimental to deep crawls of your website by the search engines.

Site speed affects crawler performance and is an SEO ranking factor. It won't make or break rankings, but it can be the little factor that helps push your site into better positions. Be careful with JavaScript changes: in the Gawker example, the site served up errors to all of its visitors and to the search engines.

Maximizing Exposure
Real-time search, videos, news, pictures, and more are now all opportunities to show up in the search results. Think about all of the assets you can put to work for your search engine optimization efforts. If you're serious about ranking video content, you need a video sitemap; it's the equivalent of a title tag in traditional SEO. XML sitemaps are essential for news and video optimization, but keep track of their integrity, because errors will cause your site to drop out of the index very fast!
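
For reference, here is a rough Python sketch that writes a one-entry video sitemap. The page, thumbnail, and video URLs are made up, and the tags follow the Google video sitemap schema as I understand it; check the current documentation before relying on the exact element names.

```python
# Rough sketch: write a one-entry video sitemap. All URLs below are placeholders.
VIDEO_SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/videos/how-to-tie-a-bowline</loc>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumbs/bowline.jpg</video:thumbnail_loc>
      <video:title>How to Tie a Bowline</video:title>
      <video:description>Step-by-step knot-tying demonstration.</video:description>
      <video:content_loc>https://www.example.com/media/bowline.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
"""

with open("video-sitemap.xml", "w") as f:
    f.write(VIDEO_SITEMAP)
```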

Understand Your Exposure
Using the site: command is not as accurate as it used to be; the SEOmoz tool is a much more accurate way to get this information. Google Analytics will give you sampled data unless you export, which provides the comprehensive data. Understand which pages are driving traffic, then capitalize on those opportunities.

Content Farms
In the past, domain strength was a major factor. Millions of pages were launched, long-tail rankings equaled massive traffic, and Demand Media, Associated Content, and others were able to capitalize.
Google has made changes, and you want to make sure that your website does not get included in this neighborhood.

Even Matt’s mom knows that content farms are bad, thanks to the recent news coverage on the topic!

Google’s patents have been dissected:

  • Does the page contain a lot of links that aren't closely related?
  • Has the history of the document changed? Are there new owners of the content?
  • The amount of links and anchor links (does it look like eHow?)

Perfect Market is a company that takes the "less is more" approach: less navigation, fewer links, smaller photos, and less content. Is this the new way to conduct conversion optimization? Mahalo is making changes in response to Google's algorithm update and is cutting down on the number of pages it creates. The BBC is moving to target one page per topic, but that page has all of the content a user would need; dynamically created data is a new way to publish content and give the user a good experience.

Key Takeaways

  • Less Is More: Pages, Links, Site Levels – bring the important content to the top!
  • Maximize your exposure by utilizing RSS Feeds, Keyword Targets, and Multimedia.
  • Content Farms – you don’t want to be one; work on providing value to the user.

Bonus: Read Matthew Brown’s Mini Interview for more tips!


Todd Nemet, Nine by Blue

Great information is available in your log files. This is an old-school way to look at data and make marketing decisions, and old-school tools exist, such as AWStats, Webalizer, or a custom parser you build yourself.
Nine by Blue's web log parser surfaces the data you want to see in order to make marketing decisions.
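
This is not Nine by Blue's parser, but as a rough illustration of the idea, a few lines of Python can pull search-bot hits out of a standard Apache/Nginx "combined" format log. The log path and the field layout are assumptions.

```python
import re

# Regex for the Apache/Nginx "combined" log format (assumed layout).
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<bytes>\S+) "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def bot_hits(path, bot_token="Googlebot"):
    """Yield (url, status) for every request whose user agent mentions the bot."""
    with open(path) as log:
        for line in log:
            m = LOG_LINE.match(line)
            if m and bot_token in m.group("agent"):
                yield m.group("url"), int(m.group("status"))

# Hypothetical log file name.
for url, status in bot_hits("access.log"):
    print(status, url)
```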

Crawling Inefficiency
By pulling the top crawled URLs, you can get an idea of the more popular pages being indexed. Inefficiency can be fixed by changing how your sitemap files are organized. (A small tallying sketch follows below.)
Problems with load balancing can mean that 500 response codes are served often; this can be caused by too many redirects.
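
Building on the parsing idea above, a sketch like this tallies which URLs a bot requests most and how many responses come back as redirects or server errors. The (url, status) records here are hypothetical sample data standing in for parsed log entries.

```python
from collections import Counter

# Hypothetical (url, status) pairs, as produced by a log parser like the one above.
crawl_records = [
    ("/products/widgets?color=red", 200),
    ("/products/widgets?color=blue", 200),
    ("/products/widgets", 301),
    ("/checkout", 500),
    ("/products/widgets", 200),
]

top_urls = Counter(url for url, _ in crawl_records)
statuses = Counter(status for _, status in crawl_records)

print("Most crawled URLs:", top_urls.most_common(3))
print("Redirects served:", sum(n for s, n in statuses.items() if 300 <= s < 400))
print("Server errors (5xx):", sum(n for s, n in statuses.items() if s >= 500))
```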

Dealing With Scrapers
By looking at log files, you can determine whether people are scraping content from your website. If scrapers are eating up a lot of bandwidth, block them.
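
One rough way to spot them, again assuming a combined-format log at a made-up path, is to rank clients by total bytes served and flag the heaviest ones that are not legitimate search-engine crawlers.

```python
import re
from collections import defaultdict

# Assumed combined log format; only IP, bytes, and user agent are captured here.
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d{3} (?P<bytes>\d+|-) "[^"]*" "(?P<agent>[^"]*)"'
)
KNOWN_BOTS = ("Googlebot", "bingbot", "Slurp")  # user agents you do NOT want to block

bytes_by_client = defaultdict(int)
with open("access.log") as log:          # hypothetical log path
    for line in log:
        m = LOG_LINE.match(line)
        if not m or any(bot in m.group("agent") for bot in KNOWN_BOTS):
            continue
        sent = m.group("bytes")
        if sent != "-":
            bytes_by_client[(m.group("ip"), m.group("agent"))] += int(sent)

# The heaviest unfamiliar clients are candidates for blocking (robots.txt, firewall, etc.).
for (ip, agent), total in sorted(bytes_by_client.items(), key=lambda kv: -kv[1])[:10]:
    print(f"{total:>12} bytes  {ip}  {agent[:60]}")
```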

Discovering Near Duplicate Content
Parameters such as "ListSortOrder" mean that multiple URLs serve the same content. Using the rel=canonical tag is a good solution.
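
A quick way to see how many parameter variations collapse onto the same page is to strip the presentation-only parameters and group the URLs. Treating "ListSortOrder" (and the other names below) as presentation-only is an assumption for this sketch.

```python
from collections import defaultdict
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameters assumed to change only presentation, not content.
IGNORABLE_PARAMS = {"listsortorder", "sort", "view"}

def canonicalize(url):
    """Drop presentation-only parameters so near-duplicates map to one canonical URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() not in IGNORABLE_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

urls = [
    "https://www.example.com/shoes?ListSortOrder=price",
    "https://www.example.com/shoes?ListSortOrder=name",
    "https://www.example.com/shoes",
]

groups = defaultdict(list)
for u in urls:
    groups[canonicalize(u)].append(u)

for canonical, variants in groups.items():
    # Each variant should carry <link rel="canonical" href="..."> pointing at `canonical`.
    print(canonical, "<-", variants)
```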

Poor Error Handling
When an "error" page returns a 200 status code instead of a true 404, you end up with many URLs that have no real content on them, which confuses the search engines. Soft 404 errors are a practice to stay away from.
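
A simple spot check, sketched here with the requests library and a made-up probe URL, is to request a page that should not exist and confirm the server answers with a real 404.

```python
import requests  # third-party; pip install requests

# Request a URL that should not exist (hypothetical); a healthy site returns 404 here.
probe = "https://www.example.com/this-page-should-not-exist-12345"
response = requests.get(probe, allow_redirects=False, timeout=10)

if response.status_code == 200:
    print("Soft 404: the error page is returning 200 instead of 404.")
else:
    print(f"OK: server answered with {response.status_code}.")
```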

Metrics to Consider

  • Most crawled pages by bots
  • Time to discovery for articles
  • New referring links
  • Crawl efficiency (304s, parameters) and number of pages crawled
  • Crawl errors (403, 50x); email a list of errors to the engineering team each night (see the sketch after this list)
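
As a rough example of that last metric, the sketch below pulls 403 and 50x responses for bot traffic out of a combined-format log and prints a report that a nightly cron job could email to the engineering team. The log path and bot user-agent tokens are assumptions.

```python
import re
from collections import Counter

# Assumed combined log format; only URL, status, and user agent are captured.
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "\S+ (?P<url>\S+) \S+" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)
BOT_TOKENS = ("Googlebot", "bingbot")   # assumed bot user-agent substrings

errors = Counter()
with open("access.log") as log:         # hypothetical log path
    for line in log:
        m = LOG_LINE.match(line)
        if not m or not any(bot in m.group("agent") for bot in BOT_TOKENS):
            continue
        status = int(m.group("status"))
        if status == 403 or status >= 500:
            errors[(status, m.group("url"))] += 1

# Print a report; a nightly cron job could pipe this output into an email.
for (status, url), count in errors.most_common():
    print(f"{count:>5}  {status}  {url}")
```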

Bonus: Read Todd Nemet’s Mini Interview for more tips!
