Google Targets Low Quality Sites and Spam
In February 2011, Google, the largest search engine in the world, released what is now known as its Panda update. Panda was designed to target content farms, such as article directories that exist solely for advertising and links to user sites, as well as lower-quality sites with thin content that projected an image of quality that did not really exist.
In April 2012, after several Panda refreshes, Google followed with its Penguin update, further targeting spam. This time the focus appeared to be off-site: Google was going after manipulative links that did not meet the standards written into its algorithm. The result has been many SEOs having to choose between complying with web standards and guidelines or finding alternative employment.
The Path Forward
Today, webmasters and SEOs need to hit several basic technical points without overcomplicating their code, their design, or the actual content on a given page.
In general, simple is best with regard to site design and SEO. This does not mean that a site must be boring, but navigation and coding must be crystal clear to ensure a positive user experience and optimal crawling and indexing by the search engines.
At its most basic, site layout should be human friendly first. When looking at a page from a human perspective, if an SEO has to ask why a link is there, or why a navigation menu is titled a certain way, it is safe to assume a search engine bot will also have difficulty.
What an SEO Can and Should Do From a Technical Standpoint
Ensuring that the HTML, and the CSS if it is used, validates properly is perhaps the most important step when auditing a site's SEO. Validation highlights errors and issues of noncompliance with web standards. An error or other coding issue may result in improper indexing of content or crawler errors. While these may not directly affect a site's ranking, they do play a role in the site's ability to rank well given other factors, both onsite and offsite.
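As a rough illustration, a bare-bones page skeleton along these lines (a hypothetical example, not tied to any particular site) should pass the W3C validator at validator.w3.org; a missing doctype, lang attribute, or character encoding declaration is exactly the sort of issue a validation pass flags.

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <title>Example Page</title>
      <link rel="stylesheet" href="styles.css">
    </head>
    <body>
      <h1>Example Page</h1>
      <p>Page content goes here.</p>
    </body>
    </html>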
One of these important onsite factors is proper title and meta tagging. The title tag is simply a site or page's title; if there is a specific keyword to target, it is good to get it into the title. The same is true of the meta tags, specifically the meta description, which is the description a search engine shows in its results where your page appears. While these tags are not the difference between ranking fiftieth and fifth, they do play a role in search engine rankings and are easy to implement and adjust.
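To make this concrete, a head section for a page targeting a hypothetical keyword such as "shared web hosting" might look roughly like the following; the wording, keyword, and company name are purely illustrative.

    <head>
      <meta charset="utf-8">
      <title>Shared Web Hosting Plans | Example Company</title>
      <meta name="description" content="Compare affordable shared web hosting plans with 24/7 support and simple, one-click setup.">
    </head>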
Where AJAX is used, the solution is to ensure that the content itself is delivered in proper HTML, with AJAX layered on top to improve the appearance and experience for human visitors, who should be the primary focus anyway.
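One common way to do this, sketched here with a hypothetical /api/reviews endpoint, is progressive enhancement: the content, or a plain link to it, exists in the HTML first, and JavaScript only swaps in the AJAX-loaded version for visitors whose browsers support it.

    <div id="reviews">
      <!-- Real, crawlable HTML comes first -->
      <p>Read our <a href="/reviews.html">latest customer reviews</a>.</p>
    </div>
    <script>
      // Hypothetical enhancement: load fresher reviews in place via AJAX.
      // Crawlers and visitors without JavaScript still get the plain link above.
      var xhr = new XMLHttpRequest();
      xhr.onload = function () {
        if (xhr.status === 200) {
          document.getElementById('reviews').innerHTML = xhr.responseText;
        }
      };
      xhr.open('GET', '/api/reviews'); // assumed endpoint returning an HTML fragment
      xhr.send();
    </script>

Either way, the static link remains in the markup, so nothing essential depends on the script running.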
One consideration not yet discussed is where XML falls within current SEO best practices. While valuable in the sense that XML markup is readable by both humans and machines, it may not be strictly necessary from a content management perspective. If the HTML and CSS validate, the AJAX is written and implemented correctly, and the site structure and navigation are laid out simply and with users in mind, XML is often an afterthought.
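For completeness, the appeal of XML is that even a trivial, made-up snippet like this one reads naturally to a person while remaining straightforward for a machine to parse:

    <product>
      <name>Shared Hosting Plan</name>
      <price currency="USD">4.99</price>
    </product>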
Users First and Machines Will Follow
There are a great number of technical points to consider. While none of them will earn you a number one ranking on its own, together they will ensure your sites are properly crawled and indexed by the search engines, all while providing a great and engaging experience for your readers.
Jennifer Carrigan is a freelance writer who wrote this article on behalf of accuwebhosting.com, where you can find shared web hosting at a low price.