Google search results have been the holy grail of web marketing. Consumers and businesses naturally go to their computers, tablets, and phones to search for products and services, and roughly 60% to 70% of them use Google to do it. So showing up at the top of the search results for keywords relevant to your business is crucial. Unfortunately, Google often changes its algorithms, so if you rank high in the results one month, a couple of months later you may slip lower in the rankings.
An entire industry has grown up around trying to figure out how Google ranks web pages based on keyword relevance. Google's secret sauce is the subject of study by many smart people. And as with any ranking formula, once people start to figure it out, they use what they've learned to find ways to boost their own rankings.
Famously, JC Penney was penalized by Google several years ago, during the holiday season, for running a link-building campaign that placed links on sites that were barely web sites at all and had no relevance to bedding, clothing, or anything else related to JC Penney – and yet the company ranked very well. Google realized there had to be a better way to rank web sites and ensure that the rankings were accurate.
In order to combat this type of spam, Google started a series of updates to its search ranking algorithm (the secret sauce). The updates were all focused on one goal: reducing the influence of poor-quality links and poor-quality content. The bad practices of SEO companies – practices essentially forced upon them by the nature of Google's old algorithm – had to be stamped out. These changes began with two sets of ongoing updates that have become known as Penguin and Panda.
Panda was a change to the search ranking algorithm first released in 2011, with many subsequent updates since. The idea was that each new update would build on the impact of the last. The goal of Panda was to reduce the rankings of sites with little valuable content, or sites that were focused mainly on advertising with very little substantive content behind it. Before the update, sites that had scraped and copied content were ranking well even though the actual content was low quality; Panda was designed to target exactly that. Google released tips on building high-quality sites – (read here). Since its initial release, Panda has cut down the number of web sites flying under the radar: to rank well, sites are now required to provide good, solid content for their readers.
The Penguin algorithm updates were first announced on April 24, 2012. The goal of the update was to penalize the search engine rankings of web sites that practiced techniques such as keyword stuffing, cloaking, unusual linking schemes, and the creation of duplicate or scraped content. These are all violations of Google's Webmaster Guidelines (read more here). There have been several updates since the original release, with recent ones focusing on content "above the fold" to ensure there isn't "ad stuffing" and more.
Where it used to be acceptable to use keywords in an unnatural way, Google's Penguin algorithms require content to be readable and keywords to make sense in context. If keywords are deemed "stuffed" – used in sentences where they don't make sense, or crammed as lists into alt tags – the page will not rank well.
Google is forcing web site owners to build quality sites with quality content – sites that are fast, easy to use, popular with quality links coming in, and, in short, valuable to the visitor. In the long run, the consumer wins! If you're going to take time out of your day to visit a web site, it only makes sense that the site should provide some value to you, and that's what the Panda and Penguin algorithms are working toward.
Quick update… Make sure you keep up with the latest on Penguin and Panda from Matt Cutts – Google's czar of search. You can see his latest blog and video post on what to expect. Some of these changes have already begun to roll out.