Unless you’ve been living under a rock for the past few months, you couldn’t miss the news about Google’s latest algorithm change, the Google Panda update.
In early 2011, the Panda update hit low quality websites, particularly content farms. Since then, many major sites have lost rankings and traffic.
There have also been several smaller-scale extensions of the Panda update. And to make things worse, Google has promised more changes to come.
If you were one of the unlucky site owners who saw a drop in traffic following any of the Panda updates, you are probably wondering how to fix it, right? And if you’ve been spared by the initial Panda roll-outs so far, you are probably wondering how to keep your site safe from future updates.
Understanding the Big G
Before I dive into the history of the Google Panda algorithm update, I’d like to take a step back and review what Google is and why it makes the changes that it does.
Google is a search engine whose goal is to provide the people who use its service with the best possible results in response to their search queries. When it does a good job, more people use Google over its competitors. When more people use Google’s search engine, Google gets more exposure for its AdWords sponsored listings, which means it makes more money.
As you can imagine, Google isn’t happy about people who try to game the system with low quality content that ranks well without providing good value for its visitors. In Google’s eyes, when it displays these bad content sites, it creates a bad user experience for its customers.
In order to fight this, Google regularly updates its algorithm, or the way it ranks web pages on its search engine. And although the Panda update is one of the biggest changes Google has made in the last few years, the reality is that Google is constantly updating its algorithm to fight spammers who are trying to game its system.
According to a post by Amit Singhal of Google:
Panda was just one of roughly 500 search improvements we expect to roll out to search this year. In fact, since we launched Panda, we’ve rolled out over a dozen additional tweaks to our ranking algorithms.
Given the number of sites affected by the Google Panda update and the big changes in traffic they experienced, it’s worth spending a bit of time focusing on what the Panda update is, how it’s influencing current search results and what you need to do to protect your site against future updates to this major algorithm change.
What is the Google Panda Update?
The initial Panda roll-out, Panda 1.0, occurred on February 24, 2011, and took aim at the content farm sites that boast thousands and thousands of low quality, user-generated pages. Initially, the algorithm update was referred to as the “Farmer’s Update” due to the number of content farms that were affected by the roll-out. Google later acknowledged that the update’s internal name was “Panda” after one of its lead developers.
Google’s estimates put the number of search result pages affected by this update at 11.8% of all search terms on the web. Among the sites most affected by this initial update were article directories and content farms. The chart below, provided by Sistrix, details the top 15 affected domains.
Although this initial shift garnered the most industry attention, Google has been rolling out updates to the Panda algorithm every 4 to 8 weeks since its February launch. The following are the known changes that have occurred thus far:
- Panda 2.0 – The first revision to the original Panda update occurred on April 11th, 2011, and expanded the initial roll-out from targeting solely US search queries to affecting all English language results around the world. This update also marked the first time Google acknowledged using data on blocked sites through its Chrome extension and search engine results page “block link” feature to influence rankings. This update was estimated to affect only about 2% of all search queries.
- Panda 2.1 (May 10th, 2011) / Panda 2.2 (June 16th, 2011) – In the two months following the release of Panda 2.0, Google made minor adjustments to the Panda algorithm update. Both of them were reported to affect an even smaller number of search queries than the 2.0 release did. The Panda 2.2 update was intended to help fight the problem of scraper sites outranking the original content, but people reported that it wasn’t completely successful.
- Panda 2.3 (July 26th, 2011) – According to a Google spokesperson, this update incorporates some new signals that help differentiate between higher and lower quality sites. As a result, some sites are ranking higher after this most recent update.
Given the widespread effects of these changes, there’s a debate as to whether or not the Panda update can be called a true “algorithm change”. Here are three ways of thinking about Google Panda:
- A traditional algorithm update – with the number of existing sites that experienced losses in the Panda update, there’s no doubt that these changes represent a shift in how the Google algorithm regards certain types of content.
- A new ranking factor – similarly to PageRank or meta tag optimization, a site’s “Panda score” (based on the metrics quantified in the update) will influence where new and existing pages fall in the SERPs.
- A penalty – maybe one of the most encouraging signs is that sites that implement strategies to correct the issues uncovered by Google can improve their rankings and traffic, making the Panda roll-out a temporary penalty directed at low quality sites.
To understand the difference between these concepts, consider what happens when Google adds new pages to its index. Google’s spiders evaluate the new pages based on specific ranking factors (like PageRank) and figure out where they should fall in the SERPs. But just because the order of the results within the SERPs may have changed, it doesn’t mean that a new algorithm change happened.
Or, to put it more simply: ranking factors determine where individual pages will fall in Google’s search results. Algorithm changes affect every page in Google’s index.
Will you be affected by the Panda update?
The combination of these updates to the Panda algorithm, as well as a manual review process that allows webmasters who feel they’ve been wrongly slapped to appeal Google’s decision, means that some site owners are reporting a recovery from traffic declines that resulted from the first Panda roll-out.
For example, Danny Goodwin of Search Engine Watch reported that traffic is picking up on DaniWeb.com after the Panda slap. The site’s traffic started increasing after it implemented several changes such as reducing page load times, removing duplicate content and improving link structures (including the use of canonical tags and 301 redirects).
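The canonical tag and 301 redirect fixes mentioned above are straightforward to illustrate. The sketch below is a minimal example using Python’s standard WSGI interface, not DaniWeb’s actual setup; the URL mapping is entirely hypothetical.

```python
# Hypothetical mapping of duplicate URLs to their canonical versions.
CANONICAL = {
    "/Article": "/article",           # case variant of the same page
    "/article?sort=new": "/article",  # parameterized duplicate
}

def app(environ, start_response):
    """Tiny WSGI app: redirect known duplicates, serve the canonical page."""
    path = environ.get("PATH_INFO", "/")
    query = environ.get("QUERY_STRING", "")
    full = path + ("?" + query if query else "")
    target = CANONICAL.get(full)
    if target is not None:
        # A 301 tells Google the move is permanent, so ranking signals
        # consolidate onto the canonical URL instead of being split.
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    # On the page itself, a canonical link tag marks the preferred URL.
    return [b'<link rel="canonical" href="https://example.com/article">']
```

The same effect is usually achieved with server configuration (for example, Apache rewrite rules) rather than application code, but the idea is identical: one authoritative URL per piece of content.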
However, because updates have been coming out regularly for the last few months and because it’s clear that Google hasn’t yet resolved the issue of scraped content outranking original content in the SERPs, it’s worth paying attention to statements made by Google about what it values in a website. Just because your website hasn’t been hit yet doesn’t mean it won’t be slapped by a future Panda update!
On the plus side, it isn’t that difficult to determine whether your site is at risk of being hit by a Panda slap. Matt Cutts and Amit Singhal, who both work at Google, gave some obvious hints about Google’s process of identifying low quality sites in an interview with Wired magazine:
We used our standard evaluation system that we’ve developed, where we basically sent out documents to outside testers. Then we asked the raters questions like: Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids?
From the results of these surveys, Google engineers were able to quantify specific metrics that indicate whether or not a site is high quality. These metrics then formed the foundation of the Panda update framework.
Using these clues, along with the results of the different Panda roll-outs so far, you can identify a number of criteria that could be measured and used in an algorithm change, including the following metrics listed by Mark Nunney:
- A high percentage of duplicate content. This might apply to a page, a site or both. If it’s a site measure, then that might contribute to each page’s evaluation.
- A low amount of original content on a page or site.
- A high percent or number of pages with a low amount of original content.
- A high amount of inappropriate adverts (ads that don’t match the search query), especially high on the page.
- Page content and page title tags that don’t match the search queries the page ranks well for.
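Google hasn’t published how it quantifies any of these signals, but the duplicate-content metric can be approximated with a simple word-shingle comparison. The sketch below is a rough stand-in for illustration, not Google’s actual method:

```python
def shingles(text, k=5):
    """Split text into the set of overlapping k-word sequences ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def duplication_ratio(page_text, other_text, k=5):
    """Fraction of this page's shingles that also appear in other text.

    A high ratio suggests the page is mostly copied content; 0.0 means
    no k-word sequence is shared at all.
    """
    own = shingles(page_text, k)
    if not own:
        return 0.0
    return len(own & shingles(other_text, k)) / len(own)
```

Running a site’s pages against each other (and against known scraper targets) with a metric like this gives a crude but useful picture of how much of the content is actually original.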
To maintain high search engine rankings and protect your site from future Panda updates, or any future algorithm changes by Google for that matter, consider taking the following measures:
- Share only unique, high value content – even if the Panda updates aren’t yet sophisticated enough to stop scraper sites from dominating the SERPs, rest assured that’s the direction Google is heading. In the long run, providing people with good, original content is the best way to survive.
- Improve page load times – Google has made several announcements related to page load times, so it’s clearly something that’s on its radar now and will be in the future. To improve your page load times, restructure bloated code, compress or resize image files and take advantage of caching plugins if you are running your site on a content management system.
- Build brand awareness with social networking – because links from social networking sites like Facebook, Twitter and Google+ are now being taken into account as a ranking factor, it’s a good idea to increase your presence on these sites. Use them to reach out and connect with your visitors naturally, instead of trying to game the system by spamming links or buying friends.
- Avoid over-optimization – when it comes to ranking well in a post-Panda world, natural is the name of the game. If you have pages on your site that are so tightly targeted to a particular keyword phrase that they’re nearly impossible to read, rewrite them. Or if you’ve coded every meta and headline tag with your target keyword, consider replacing some of them with terms that create a better experience for your users.
- Share expert content – remember, Google is looking to reward expert sites, and one thing that these authority pages routinely do is link out to other great content sites that they feel will benefit their users. When you are writing your new, unique content, start linking out to at least one or two high quality sites in each article.
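On the page-load point above, one easy win for text content is serving it gzip-compressed. The sketch below shows how to estimate the potential transfer savings for a given page; the sample markup is made up for illustration.

```python
import gzip

def compression_savings(html: bytes) -> float:
    """Fraction of transfer bytes saved by serving the page gzipped."""
    return 1 - len(gzip.compress(html)) / len(html)

# Markup-heavy pages are highly repetitive, so gzip typically cuts
# transfer size dramatically. Hypothetical list-style page body:
page = b"<li class='item'><a href='/post'>Example post title</a></li>\n" * 200
```

In practice you wouldn’t compress pages by hand; enabling mod_deflate (Apache) or the gzip module (nginx) applies this automatically at the server level.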
At the end of the day, the important thing to keep in mind is that Google values the experience it is providing to each searcher. Any changes you make to your website that improve your visitors’ time on your site will pay off in the long term, both by protecting you from being slapped with a Panda penalty and by ensuring your safety from future algorithm updates.
If you have already been slapped by the Panda update, don’t blame Google. Just try to provide people with the best information and do what’s best for your visitors. It will most likely help your site rank well on Google in the long run.
So, what do you think about the Panda update?
P.S. If you are having issues with the Panda update, feel free to leave a comment or email me. I’ve gotten 8 sites out of it so far.