D’oh! Not-so-best SEO practices

This article originally appeared on Adotas, written by John McCarthy, Director of Search Engine Optimization at WebMetro/Godwin

ADOTAS – While attending a recent trade show, I provided “live” SEO site assessments for a number of companies. A typical live SEO site assessment involves a brief Q&A with the company about the goals of its site, followed by 10–15 minutes of targeted SEO research and analysis. That analysis examines core on-page and off-page optimization ranking factors, including site architecture, page construction, content, link popularity and web server configuration.

For the most part, the companies assessed were following SEO best practices for achieving high organic rankings and maximizing traffic opportunities. Looking back over the trade show, however, I recall a number of companies that were not. Without naming any specific brands, let me share three not-so-best SEO practices discovered during my site assessments.

1. The More Domains, the Merrier

Most companies I met at the trade show owned multiple domains. That in itself is a good thing: it protects against trademark infringement and supports a reputation management campaign.

As a best practice, we recommend clients purchase multiple combinations of their company name, product names and common misspellings. For each domain name, we also recommend clients purchase the various extensions (e.g., domain.net, domain.info, domain.org) as well as the country codes (e.g., domain.ca, domain.co.uk). Once clients have purchased these alias domains, it is also a best practice to either park them or implement a 301 redirect to the primary domain if there is no plan to make them live.
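To make the second option concrete, here is a minimal sketch of how an alias-to-primary 301 redirect might be configured on an Apache server, assuming all the alias domains resolve to the same host; www.example.com stands in for the primary domain:

    # .htaccess -- permanently redirect any alias hostname to the
    # primary domain, preserving the requested path
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

Other web servers offer equivalents (nginx, for example, uses a return 301 directive); the point is that the redirect is permanent, so the search engines consolidate link popularity onto the primary domain.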

While most companies at the trade show adhered to these best practices, I found one that did not. My audit revealed that the company had 27 live domains, each serving the same content and hosted on the same server. This created two SEO problems at once: duplicate content and domain spamming.

Although duplicate content does not carry a formal ranking penalty, it tends to suppress organic rankings. Worse, with 27 domains all sharing the same content, the company risked having the search engines interpret the behavior as domain spam, which could get the domains banned. Since the search engines value unique content and don’t like spam, I gave the company three recommendations:

1. Create unique content for each site to support a reputation management campaign;
2. Redirect the alias domains to the primary site via a permanent redirect (also known as a 301 redirect), as sketched above; and
3. Implement a noindex, nofollow instruction on any alias domains that remain live (see the sketch below).
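For the third recommendation, the standard mechanism is a robots meta tag in the head of each page on the alias domains. A minimal sketch (the meta tag itself is the standard directive; the surrounding markup is illustrative):

    <head>
      <!-- Ask search engines not to index this page or follow its links -->
      <meta name="robots" content="noindex, nofollow">
    </head>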

2. Blocking the Spiders

Next up for an SEO assessment: a company that had redesigned its website about three months earlier. The marketing director proudly showed me the site’s new navigation, video testimonials and blog. But while the company loved the new site, its traffic was dropping, and the director asked for an SEO assessment to see if I could identify the problem.

As part of our Q&A, I asked whether they had removed pages from the site or transferred link popularity from the old site to the new one. The response: “The redesign kept all the existing content and URLs; it was essentially a ‘re-skinning’ of the site to refresh the look and add elements such as video and a blog.”

I then checked Google’s index and found that Google was displaying only 50 pages of a website that had over 300. I examined the site’s robots.txt file and immediately discovered the problem: when the company launched the new site, it uploaded a robots.txt that inadvertently instructed the search engines to exclude the entire site from indexing. After fetching that file a few times, the search engines began dropping the previously indexed pages, which in turn produced the drop in traffic. Not good.
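The offending file is easy to picture. A robots.txt that shuts out every compliant crawler from an entire site needs only two directives, something like:

    # robots.txt -- blocks ALL compliant crawlers from the ENTIRE site
    User-agent: *
    Disallow: /

Deploy that by accident at launch and the result is exactly what this company saw: pages steadily falling out of the index.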

It turns out that blocking the search engines from crawling a website is not uncommon. Two years ago we had a client that did this, accidentally, of course. Within four weeks, Google dropped 800 of the client’s pages from its index. That problem was especially hard to find because the client had applied the noindex directive to interior pages through the content management system rather than in the robots.txt file.
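A quick way to catch either variant of this mistake, whether the block lives in robots.txt or in a per-page directive set by the CMS, is to spot-check a few key URLs programmatically. Here is a minimal Python sketch using only the standard library; www.example.com is a placeholder:

    from urllib.robotparser import RobotFileParser
    from urllib.request import urlopen

    SITE = "https://www.example.com"  # placeholder -- use your own site

    # 1. Does robots.txt allow crawlers to fetch the home page?
    rp = RobotFileParser(SITE + "/robots.txt")
    rp.read()
    print("Crawlable per robots.txt:", rp.can_fetch("*", SITE + "/"))

    # 2. Does the page itself carry a noindex directive (e.g., via the CMS)?
    # A crude substring test, but enough for a spot check.
    html = urlopen(SITE + "/").read().decode("utf-8", errors="replace")
    print("Page mentions noindex:", "noindex" in html.lower())

Pointed at a handful of important pages, a check like this would have flagged both the trade-show company’s robots.txt and our client’s CMS-level directives within seconds.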

In general, it is a best practice to place a robots.txt file on the web server. It tells the search engines which files and directories on the site they may crawl and which they should skip. In the past couple of years, the robots.txt convention has also been extended to reference XML sitemaps to support site indexing.
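A healthy file is just as short as a harmful one. A typical robots.txt that allows crawling, excludes a couple of private directories and references an XML sitemap might look like this (the paths and domain are illustrative):

    # robots.txt -- allow everything except two private directories
    User-agent: *
    Disallow: /admin/
    Disallow: /cart/

    # Point crawlers at the XML sitemap to support indexing
    Sitemap: https://www.example.com/sitemap.xml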

3. Expiring Domains

During one of my assessments I found an e-commerce retailer whose domain name and SSL certificate were set to expire in less than 30 days. Once the domain name expires, the website drops out of global DNS and appears as if it no longer exists. And once the SSL certificate expires, a whole bunch of other interesting things happen, starting with browser security warnings on every secure page.
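The certificate, at least, is easy to monitor. Here is a minimal Python sketch, standard library only, that reports how many days remain on a site’s certificate; example.com is a placeholder. (Checking the domain registration itself requires a WHOIS lookup, which registrars and command-line whois tools provide.)

    import socket
    import ssl
    from datetime import datetime

    HOSTNAME = "example.com"  # placeholder -- use your own domain

    # Open a TLS connection and pull the server's certificate
    context = ssl.create_default_context()
    with socket.create_connection((HOSTNAME, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=HOSTNAME) as tls:
            cert = tls.getpeercert()

    # 'notAfter' looks like 'Jun  1 12:00:00 2026 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    days_left = (expires - datetime.utcnow()).days
    print(f"{HOSTNAME}: certificate expires in {days_left} days ({expires})")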

Further research revealed that the retailer’s domain was registered to the design agency that had built the website. Unfortunately, the retailer was no longer doing business with that agency, so no one knew whether the domain would be automatically renewed or simply allowed to expire.

I explained that search engines like Google look at domain registration as part of their ranking algorithms. Sites with long registration periods, say five or more years, are perceived to have more value because the domain will not expire soon. This makes good sense from an organic ranking perspective: why would Google give a top ranking to a website whose domain will expire in three to four weeks?

While these three examples sound basic, I continue to see these kinds of not-so-best practices on big and small websites alike several times a year. Best practices are just that: best practices. Even the basic ones are critically important to SEO campaigns, as these three site assessments make evident.
