Google SEO

I first discovered the power of incorporating keywords into websites to boost rankings in 1996. By adding a desirable keyword to the META keyword tag and repeating it frequently in the text, I ranked clients’ websites on popular search engines like WebCrawler and Lycos. It wasn’t simple, however, as I was optimizing client websites for 14 search engines, each with a different algorithm. While much has changed since Google introduced PageRank, many of the fundamentals I refined in the ’90s still apply.

I originally shared my philosophy on search engine optimization (SEO) a decade ago with The 3 C’s of SEO. In that article, I identified three primary areas of focus for successful SEO: content, code and credibility. While social media has influenced Google’s algorithm, the 3 C’s still apply today. In this article, I will outline the most common (and timeless) mistakes and misconceptions about SEO.

Content
One of the most formidable challenges in SEO is the creation of fresh, unique, relevant and compelling content. Creating copy, photography, infographics, video or other forms of content is both time-consuming and costly. Over the past few years, Google has dedicated substantial time to improving the algorithm’s understanding of natural language. With the BERT update in 2019, context came into play, and marketers were presented with another challenge: supporting search intent. As Google’s search algorithm develops and matures, marketers attempting SEO without adult supervision make a few common mistakes, most of which center on quality, relevance and search intent.

For starters, marketers often mistake the value of quantity vs. quality of content. Nowadays, Google employs thousands of quality raters who manually review individual web pages, and their primary criteria include E-A-T, which stands for Expertise, Authoritativeness, and Trustworthiness. In short, Google wants to ensure that websites in its search results provide up-to-date, factual information that users can trust. Google holds pages that offer medical information, financial information or other high-impact topics to a stricter standard. These pages are aptly referred to as “Your Money or Your Life” pages because the information could significantly affect one’s livelihood, happiness, financial situation or safety. Ecommerce sites are included in this category because customers input credit card information. Thus, marketers should avoid generic, unprofessional or generally low-quality content, which includes outdated information, short blog posts, fact-less articles and poor-quality images and video.

When creating new content for a website, too many marketers fail to keep voice search in mind. While voice searches can be performed through voice assistants such as Amazon Alexa and Google Home, the majority are now completed on smartphones using Siri and Google Assistant. Not only are these queries more informal and conversational, but they also tend to be phrased as actual questions instead of 3-5 topical keywords. A related oversight by some marketers is the failure to localize content for different countries, languages or business locations. The internet is both global and local, and Google rewards brands that understand this fact.
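On the localization front, hreflang annotations are the standard way to tell search engines which language and country variants of a page exist. A minimal sketch, assuming hypothetical example.com variants for the US, UK and Germany:

```html
<!-- Each localized page should carry the full set of hreflang annotations,
     including one that points to itself; x-default covers everyone else -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="de-de" href="https://www.example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```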

Even if marketers can create truly compelling content, some are still working from outdated strategies. The biggest mistake website content can make is failing to support search intent. If the information on your page is not a valuable resource for what a searcher needs, Google will never offer your site to its users. How can we begin to understand search intent? Break down queries and target keywords to construct your content. Does the searcher need an answer to a question? Is the searcher looking to do research or make a purchase? Developing your content to address those answers helps prove that your website provides the best possible support to searchers, which is rewarded with better rankings.

Technical/Code
Google cares a great deal about the technical makeup and performance of websites, which involves site speed, schema markup and user experience. Content management systems (CMS) such as WordPress and Shopify are incredible tools that allow novice website owners to create complex and personalized websites without coding knowledge or experience. Even so, there are a handful of technical strategies to consider, and forgoing any of them could significantly impact a website’s visibility. First and foremost, the code must be clean, fast and responsive. Second, a website should always utilize some form of schema markup, which not only provides a search engine with coded information about elements on a page but also qualifies your website to show in Google’s Rich Results, which means more attention and engagement.

The most common problem with CMS platforms is that users rely on plugins to add advanced functionality, and those plugins often come with large amounts of code that slow down the site. Site speed is not only important from a user-experience perspective; poor site speed can also limit how many pages a search engine can crawl. Google recognizes and rewards exceptional user experiences, which includes designing with mobile users in mind. Also known as responsive design, mobile-friendly websites render a page differently depending on screen size. Luckily, Google offers several testing tools that surface clear opportunities for improving site speed.
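For reference, responsive design typically comes down to a viewport meta tag paired with CSS media queries. A minimal sketch (the class name is hypothetical):

```html
<!-- The viewport meta tag tells mobile browsers not to zoom out to a
     desktop width; the media query reflows the layout on small screens -->
<meta name="viewport" content="width=device-width, initial-scale=1" />
<style>
  .columns { display: flex; gap: 2rem; }
  @media (max-width: 600px) {
    .columns { flex-direction: column; } /* stack columns on phones */
  }
</style>
```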

With or without a CMS platform, websites should be built with CSS, minimal JavaScript and optimized image and video files for maximum speed and usability. Far too many websites rely on outdated versions of HTML or other coding platforms, or have been built and rebuilt over the years, making them slower, less reliable and more difficult to index. I still come across prospective clients that do not have a current XML sitemap or robots.txt file to help guide search engines. Marketers may also not realize that a site’s architecture and structure can impact rankings. Historically, Google has preferred a flatter site hierarchy over deep structures that are more challenging to navigate and spider.
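An XML sitemap is a short, standardized file; a bare-bones sketch for a hypothetical example.com follows, and a Sitemap: line in robots.txt tells crawlers where to find it:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- A minimal sitemap for a hypothetical site; reference it from robots.txt
     with a line such as: Sitemap: https://www.example.com/sitemap.xml -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2021-01-10</lastmod>
  </url>
</urlset>
```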

Google is constantly changing how information gets displayed and how users can interact with results. This means marketers have endless opportunities to enhance how their websites appear in search results, which ultimately helps them stand out against competitors. To qualify, websites must utilize structured data on appropriate pages. The most widely accepted format for structured data is JSON-LD, which stands for JavaScript Object Notation for Linked Data, and there are hundreds of acceptable schema types that websites can use. Currently, FAQ schema is the most popular type; other commonly used types include local business schema, product schema and video schema.
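To make that concrete, here is a minimal FAQ schema sketch in JSON-LD; the question and answer are hypothetical placeholders:

```html
<!-- FAQ structured data lives in a JSON-LD script block in the page -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is schema markup?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Schema markup is structured data that describes page elements to search engines."
    }
  }]
}
</script>
```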

While I haven’t encountered code trickery like cloaking in years, some marketers are still committed to black- and grey-hat SEO, meaning they are willing to bend or break rules for short-term gains. Unfortunately, Google always catches up, and once penalized, websites may not recover for months, if ever. Along the same lines, many sites are not properly secured with an active SSL certificate, which means trustworthiness suffers. While we’re on the topic of trickery, it should be noted that Google does not like duplicate content, whether intentional or not.
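When duplication is accidental, say the same product page resolving at several URLs, a rel="canonical" link element is a standard way to tell search engines which version to index. A quick sketch with a hypothetical URL:

```html
<!-- Placed in the <head> of every duplicate or parameterized variant,
     pointing at the one preferred URL that should be indexed -->
<link rel="canonical" href="https://www.example.com/products/blue-widget/" />
```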

The final area of oversight relating to code best practices revolves around a website’s metadata. Believe it or not, despite a proactive focus on SEO, many marketers fail to incorporate keywords where they matter most: title and header tags. While meta descriptions no longer directly impact rankings, they serve as the elevator pitch for a webpage, which means they do impact click-through rate (CTR). To Google, a low CTR signals that a page was a bad match or a poor resource for searchers. Additional keyword-optimization opportunities include ALT tags, anchor text and file names.
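Pulling those elements together, a minimal sketch of keyword-aware metadata for a hypothetical product page:

```html
<head>
  <!-- The title carries the target keyword; the meta description is the
       elevator pitch that influences click-through rate -->
  <title>Handmade Leather Wallets | Example Co.</title>
  <meta name="description" content="Shop handmade leather wallets, crafted in the USA and backed by a lifetime guarantee." />
</head>
<body>
  <h1>Handmade Leather Wallets</h1>
  <!-- Keyword-relevant file name, ALT text and anchor text -->
  <img src="/images/handmade-leather-wallet.jpg" alt="Brown handmade leather bifold wallet" />
  <a href="/wallets/care-guide/">leather wallet care guide</a>
</body>
```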

User Experience
Search engines want to provide their users with the best possible resources available, which does not include a website that is slow, unaccommodating or malfunctioning. Amid content development and technical work, marketers often overlook whether their website provides a positive experience.

Google recently announced that Core Web Vitals scores will be considered ranking signals come May of 2021. Core Web Vitals measure page experience and focus on three major components: loading (LCP, or Largest Contentful Paint), interactivity (FID, or First Input Delay) and visual stability (CLS, or Cumulative Layout Shift). Google rarely provides advance warning of algorithm or ranking signal updates, and it would be a mistake not to take advantage of this one. It is also important to note that Core Web Vitals and the technical makeup of web pages go hand in hand, so websites with poor technical components will most likely have poor Core Web Vitals scores.
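One way to observe these three scores on real traffic is Google’s open-source web-vitals JavaScript library, a tool beyond what’s covered above. A minimal sketch, assuming the version-3 module served from unpkg (version pinning is an assumption):

```html
<!-- Logs the three Core Web Vitals to the console as they become available -->
<script type="module">
  import { onCLS, onFID, onLCP } from 'https://unpkg.com/web-vitals@3?module';

  onLCP(console.log);  // loading: Largest Contentful Paint
  onFID(console.log);  // interactivity: First Input Delay
  onCLS(console.log);  // visual stability: Cumulative Layout Shift
</script>
```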

Another major mistake marketers make with regard to user experience is ignoring, or being unfamiliar with, ADA compliance. Website owners are now responsible for ensuring that their sites are fully accessible to all users, and that can be harder than it sounds. For example, visually impaired users rely on screen readers to navigate a website. A button simply labeled “Click Here” makes it impossible for that user to know where the button leads. Other common web features that are not ADA compliant include flashing graphics, videos that autoplay without controls to stop them, and images without alt text.
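A minimal sketch of the screen-reader-friendly alternatives (file names and labels are hypothetical):

```html
<!-- Descriptive link text instead of "Click Here", ALT text on images,
     and video playback the user controls rather than autoplay -->
<a href="/annual-report.pdf">Download the 2020 annual report (PDF)</a>
<img src="/team-photo.jpg" alt="Our support team at the Portland office" />
<video src="/intro.mp4" controls></video> <!-- no autoplay; user starts it -->
```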

Credibility
While the need for quality content and clean code has not changed in the last 20 years of search engine marketing, credibility factors have changed dramatically. Since Google came on the scene in 1998 with an innovative algorithm built on a hub-and-spoke model of authority, SEO professionals have put a good deal of effort into securing inbound links. Unfortunately, too many marketers have forsaken quality links (from popular and reputable websites) for quantity (typically lower-quality websites with questionable domain authority).

We’ve known for years that quality trumps quantity when it comes to inbound links. Yet some marketers ignore that insight and continue to purchase links from high-domain-authority websites or even create or buy into link farms, practices that have been out of vogue for nearly a decade but still hold allure for desperate marketers.

While links continue to be a major focus for SEO professionals, there has been discussion around the weighting of inbound links in Google’s algorithm. Recent research unveiled by Stone Temple Consulting at the SEMpdx Engage conference indicates that links are still a significant factor in the ranking algorithm. The vote of confidence an inbound link (or citation) provides a website is still a key factor and should figure prominently in marketing efforts.

One area that marketers continue to debate is the impact of SEO initiatives on graphic design, copywriting and coding. In the early days of internet marketing, I would get into arguments with my interactive-agency counterparts when SEO best practices appeared to conflict with design best practices. That issue has largely resolved itself as Google has become more sophisticated and focuses more on user experience and high-value content. As a result, sites that are beautifully designed with unique content and artful coding tend to outrank sites designed solely for SEO rather than the user.

Credibility covers a host of elements, many of which are unknown or misunderstood by less experienced marketers. One example is domain history, which includes the age of the domain and when it expires. Google likes old domains that expire many years from now, so stop auto-renewing annually and renew for 5-10 years at a time. Domain authority, a metric available via Moz, indicates how likely a site is to rank for unbranded terms; a strong domain authority is over 50 out of 100. Off-site factors, including the quality and quantity of inbound links, weigh heavily in the domain authority ranking in Moz’s Link Explorer (formerly Open Site Explorer).

A clear majority of businesses have a formal address, but regardless, every business should claim and optimize its Google My Business (GMB) local listing. The biggest mistake in this area is the belief that only small, local businesses need an optimized Google My Business profile; in fact, all businesses and corporations should maintain an up-to-date GMB profile, which is now the top consideration for local SEO rankings. Far too often, marketers overlook claiming and optimizing local listings on Facebook, Yelp and other third-party feed providers for maps and business directories. Another factor related to local listings is reviews. Business reviews can make or break a business, and reviews must be addressed, particularly those on your GMB profile. Marketers are particularly bad about ignoring bad reviews and failing to secure a meaningful number of positive reviews, which directly impact revenue: per recent research, adding one additional star in the five-star economy adds 9 percent to topline revenue.
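Local business schema (mentioned earlier) can reinforce these listings by marking up the business’s name, address and review data directly on the website. A minimal sketch for a hypothetical storefront; the rating figures would come from the business’s real reviews:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Roasters",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Portland",
    "addressRegion": "OR",
    "postalCode": "97201"
  },
  "telephone": "+1-503-555-0100",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "132"
  }
}
</script>
```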

Learn from the mistakes of others and follow best practices when optimizing your website. I’ve included a few helpful SEO resources below to ensure you are up to speed on the latest SEO strategies, tactics and tools.
