Google Farming Update Fallout

An official Google blog post titled "Finding more high-quality sites in search," written on February 24th by Matt Cutts and Amit Singhal, announced a significant change in how Google ranks Web pages in search results. The post stated that the new algorithm update would "noticeably impact 11.8% of Google’s queries."

Referred to as the "Farmer Update," the change lived up to Google's word: it appears to have affected the search engine rankings of many sites. It was clear from the beginning that the update was targeting content farms, duplicate content, and weak or spun content. The big question now is: "Whom did it affect?"

Preliminary reports show that sites with a high amount of blatant advertising, or a high percentage of obtrusive advertising material, seem to have dropped significantly in rankings. Sites with good layout and design, high-quality content, or an "attractive" appearance seem to be the winners. One does have to wonder how an algorithm can distinguish a "good" site from a poor-looking or "ugly" one; this seems like a pretty ambiguous ranking factor.

Low-quality content from user-generated content sites, sites with content designed to create backlinks, and article sites such as EzineArticles, HubPages, and Buzzle all lost out. Sites that produced more authentic content, not intended for SEO or link building, seem to have fared better (LinkedIn, Facebook, DailyMotion). Article sites have been a major source of link building for many SEO companies for years, and this change is forcing SEOs to reevaluate content-driven link-building strategies for their clients.

Sites that focused on quality content that was usable and valuable did well (the Huffington Post, for example), whereas sites that were rich in content but were not judged to be user-friendly or easily readable lost out. This is another indication that Google is continuing to level the playing field by trying to rid the SERPs of webspam and low-value content sites.

Certainly other signals were used in the new algorithm to measure the relative success of a site against quality indicators. Usage statistics such as time on site, click-through rate, bounce rate, and page load speed seem to have been employed. Analysis of content for readability, uniqueness, relevancy, and visual attractiveness all seem to have been additional factors in the overall quality score that has affected so many websites' rankings in Google.
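Google has never disclosed its actual signals, weights, or scoring method, but conceptually this kind of quality score can be thought of as a weighted combination of normalized signals. The sketch below is purely hypothetical; every signal name, normalization cap, and weight is invented for illustration:

```python
# Hypothetical illustration only: Google has not published its signals,
# weights, or formula. This sketch shows one way several quality signals
# might be normalized to a 0-1 range and blended into a single score.

def quality_score(avg_time_on_site_s, bounce_rate, page_load_s, uniqueness):
    """Combine normalized signals into a 0-1 score (all weights invented)."""
    # Normalize each signal so that higher is better.
    time_signal = min(avg_time_on_site_s / 180.0, 1.0)  # cap at 3 minutes
    bounce_signal = 1.0 - bounce_rate                   # lower bounce is better
    speed_signal = max(0.0, 1.0 - page_load_s / 10.0)   # penalize slow pages
    # Invented weights; a real system would tune or learn these from data.
    weights = [0.3, 0.3, 0.2, 0.2]
    signals = [time_signal, bounce_signal, speed_signal, uniqueness]
    return sum(w * s for w, s in zip(weights, signals))

# Example: an engaging, fast-loading, mostly unique page scores well.
print(round(quality_score(150, 0.35, 2.0, 0.9), 3))
```

The point is not the particular numbers but the mechanism: many weak, noisy signals can be combined into one score that separates thin, ad-heavy pages from genuinely useful ones.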

Google has always stated that content is king. By focusing on quality over quantity, Google hopes to combat the webspam that has permeated the search results for so long. I applaud Google's efforts to rid the SERPs of webspam, but I am left with the thought that the wild fluctuations we are experiencing could have been avoided. With some initial planning and foresight, Google could have greatly reduced the manipulation of the SERPs by users trying to game the search results in their favor.