Unless you’ve been living under a rock for the past few months, it’s been impossible to miss the news about Google’s latest algorithm change… the Google Panda update.
In early 2011 the Panda update hit low quality websites that were content farms. Since then, many major sites have lost rankings and traffic, and there have also been several smaller-scale extensions of the Panda update. And to make things worse, Google is promising that there are more changes on the way.
So if you were one of the unlucky individuals to see a drop in your traffic following any of the Panda updates, you are probably wondering how to fix it, right? Or if you’ve been spared through the initial Panda rollouts, you are probably wondering how you can ensure your site will remain safe from future updates?
Understanding the Big G
Before I dive into the history of the Google Panda algorithm update, it’s worth taking a step back and remembering what Google is and why it makes the changes it does.
Google is a search engine whose goal is to provide the people who use its service with the best possible results for their search queries. When it does a good job, more people use Google over its competitors. And when more people use the Google search engine, Google gets more exposure for its AdWords sponsored listings, which means it makes more money.
As you can imagine, Google isn’t happy about people who try to game the system with low quality content that ranks well without providing good value to visitors. In Google’s eyes, serving up these bad content sites gives people a bad user experience.
In order to fight this, Google regularly updates its algorithm, or the way it ranks web pages on its search engine. And although the Panda update is one of the biggest changes in the last few years, the reality is that Google is constantly tweaking its algorithm to fight spammers who are trying to game the system.
According to a post by Amit Singhal of Google:
Panda was just one of roughly 500 search improvements we expect to roll out to search this year. In fact, since we launched Panda, we’ve rolled out over a dozen additional tweaks to our ranking algorithms.
But given the number of sites affected by the Google Panda update and the big changes in traffic they experienced, it’s worth spending a bit of time focusing on what the Panda update is, how it’s influencing current search results and what you need to do to protect your site against future updates to this major algorithm change.
What is the Google Panda Update?
The initial Panda rollout, Panda 1.0, occurred on February 24, 2011 and took aim at the content farm sites that boast thousands and thousands of low quality, user generated pages. Initially, the algorithm update was referred to as the “Farmer’s Update” due to the number of content farms that were affected by the rollout, although Google later acknowledged that the update’s internal name was “Panda” after one of its lead developers.
Google’s estimates put the number of search results pages affected by this update at 11.8% of all the search terms on the web. Among the sites most affected by this initial update were article directories and content farms, as can be seen in the chart below, provided by Sistrix.
Although this initial shift garnered the most industry attention, Google has been rolling out updates to the Panda algorithm every 4 to 8 weeks since its February launch. The following are the known changes that have occurred thus far:
- Panda 2.0 – The first revision to the original Panda update occurred on April 11th, 2011 and expanded the initial rollout from targeting solely US search queries to affecting all English language results around the world. This update also marked the first time Google acknowledged using data on blocked sites (gathered through its Chrome extension and the search results page “block link” feature) to influence rankings. This update was estimated to affect only about 2% of all search queries.
- Panda 2.1 (May 10th, 2011) / Panda 2.2 (June 16th, 2011) – In the two months following the release of Panda 2.0, Google made minor adjustments to the Panda algorithm update. Both of them were reported to affect an even smaller number of search queries than the 2.0 release. The Panda 2.2 update was intended to help fight the problem of scraper sites outranking the original content, but people reported that it wasn’t completely successful.
- Panda 2.3 (July 26th, 2011) – According to a Google spokesperson this update incorporates some new signals that help differentiate between higher and lower quality sites. As a result, some sites are ranking higher after this most recent update.
Given the widespread effects of these changes, there’s a debate as to whether or not the Panda update can be called a true “algorithm change”. Here are the three ways of thinking about Google Panda:
- A traditional algorithm update – with the number of existing sites that experienced losses in the Panda update, there’s no doubt that these changes represent a change in how the Google algorithm regards certain types of content.
- A new ranking factor – similar to PageRank or meta tag optimization, a site’s “Panda score”, based on the metrics quantified in the update, will influence where new and existing pages fall in the SERPs.
- A penalty – maybe one of the most encouraging signs is that sites that implement strategies to correct the issues uncovered by Google can improve their rankings and traffic, making the Panda rollout a temporary penalty directed at low quality sites.
To understand the difference between these concepts, consider what happens when Google adds new pages to its index. Google’s spiders evaluate the new pages based on specific ranking factors (like PageRank) and figure out where they should fall in the SERPs. But a reshuffling of the order of the results pages doesn’t by itself mean that a new algorithm change has happened.
Or, to look at it in a simpler way, ranking factors determine where individual pages will fall in Google’s search results. Algorithm changes affect every page in Google’s index.
Will you be affected by the Panda update?
The combination of these updates to the Panda algorithm, along with a manual review process that lets webmasters who feel they’ve been wrongly slapped appeal Google’s decision, means that some site owners are reporting a recovery from traffic declines that were initiated by the first Panda rollout.
For example, Danny Goodwin of Search Engine Watch reported that traffic is picking up on DaniWeb.com after the Panda slap. The site’s traffic started increasing after implementing several changes, such as reducing page load times, removing duplicate content and improving link structures (including the use of canonical tags and 301 redirects).
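For anyone unfamiliar with canonical tags, the idea is simple: a `<link rel="canonical">` element in a page’s head tells search engines which URL is the “original” when several URLs serve the same content. As a rough, stdlib-only sketch (the URL here is made up for illustration), here is how such a tag can be picked out of a page’s HTML:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of a <link rel="canonical"> tag, if one exists."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

# Hypothetical page markup for illustration only.
page = """<html><head>
<link rel="canonical" href="https://example.com/original-article/">
</head><body>Duplicate copy of the article...</body></html>"""

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # https://example.com/original-article/
```

Pointing every duplicate URL (print views, tracking-parameter variants, and so on) at one canonical address is exactly the kind of cleanup DaniWeb reportedly did.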
However, because updates have been coming out regularly for the last few months and because it’s clear that Google hasn’t yet resolved the issue of scraped content outranking original content in the SERPs, it’s worth paying attention to statements made by Google about what they value in a website. Just because your website hasn’t been hit yet doesn’t mean it won’t be slapped by a future Panda update!
On the plus side, it isn’t that difficult to determine whether your site stands a good chance of getting a Panda slap. Matt Cutts and Amit Singhal, who both work at Google, gave some obvious hints about Google’s process of identifying low quality sites in an interview with Wired magazine:
We used our standard evaluation system that we’ve developed, where we basically sent out documents to outside testers. Then we asked the raters questions like: Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids?
From the results of these surveys, Google engineers were able to quantify specific metrics that indicate whether or not a site is high quality. These metrics then formed the foundation of the Panda update framework.
Using these clues, along with the results of the different Panda rollouts so far, SEOs have been able to determine a number of different criteria that could be measured and used in an algorithm change, including the following metrics listed by Mark Nunney.
- A high percent of duplicate content. This might apply to a page, a site or both. If it’s a site measure then that might contribute to each page’s evaluation.
- A low amount of original content on a page or site.
- A high percent or number of pages with a low amount of original content.
- A high amount of inappropriate adverts (ads that don’t match the search query), especially high on the page.
- Page content and page title tag not matching the search queries a page does well for.
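Google hasn’t published how it measures duplicate content, but a common way to approximate “percent of duplicate content” is word shingling: break each page into overlapping word sequences and see what fraction of one page’s sequences appear in another. This is purely an illustrative sketch, not Google’s actual metric:

```python
def shingles(text, k=4):
    """Break text into overlapping k-word sequences ("shingles")."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def duplicate_percent(page_a, page_b, k=4):
    """Rough percent of page_a's shingles that also appear in page_b."""
    a, b = shingles(page_a, k), shingles(page_b, k)
    if not a:
        return 0.0
    return 100.0 * len(a & b) / len(a)

# Made-up snippets standing in for an original page and a scraped copy.
original = "the quick brown fox jumps over the lazy dog near the river bank"
scraped = "the quick brown fox jumps over the lazy dog in a field"

print(round(duplicate_percent(scraped, original), 1))  # 66.7
```

A page scoring high against many other pages on the web (or against other pages on the same site) would fit the “high percent of duplicate content” criterion above.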
To maintain high search engine rankings and protect your site from future Panda updates, or whatever later algorithm changes Google rolls out, it’s a good idea to consider taking the following measures.
- Share only high value, unique content – even if the Panda updates aren’t yet sophisticated enough to stop scraper sites from dominating the SERPs, rest assured that that’s the direction Google is heading. In the long run, providing people with good, original content is the best way to survive.
- Improve page load times – Google has made several announcements related to page load times, so it’s clearly something that’s on their radar now and will be in the future. To improve your page load times, restructure bloated code, compress or resize image files and take advantage of caching plugins if you are running your site on a content management system.
- Build brand awareness with social networking – because links from social networking sites like Facebook, Twitter and Google+ are now being taken into account as a ranking factor, it’s a good idea to increase your presence on these sites. Use them to reach out and connect with your visitors naturally, instead of trying to game the system by spamming links or buying friends.
- Avoid over-optimization – when it comes to ranking well in a post-Panda world, natural is the name of the game. If you have pages on your site that are so tightly targeted to a particular keyword phrase that they’re nearly impossible to read, rewrite them. Or, if you’ve coded every meta and headline tag with your target keyword, consider replacing some of them with terms that create a better experience for your users.
- Share expert content – remember, Google is looking to reward expert sites, and one thing that these authority pages routinely do is to link out to other great content that they feel will benefit their users. When you are writing your new, unique content, start linking out to at least 1 to 2 high quality sites in each article.
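To get a feel for why the page-speed advice above matters, here’s a quick stdlib-only Python sketch showing how much gzip compression can shrink repetitive HTML before it goes over the wire (the markup is made up for illustration; real pages vary, and images need separate treatment):

```python
import gzip

# Repetitive, uncompressed HTML, as a stand-in for a real page:
# templated markup compresses extremely well.
html = (
    "<div class='post'><h2>Title</h2>"
    "<p>Lorem ipsum dolor sit amet.</p></div>" * 50
).encode()

compressed = gzip.compress(html)
savings = 100 * (1 - len(compressed) / len(html))
print(f"{len(html)} bytes -> {len(compressed)} bytes ({savings:.0f}% smaller)")
```

In practice you wouldn’t compress pages by hand; you’d enable gzip in your web server or caching plugin, which applies the same idea to every response automatically.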
At the end of the day, the important thing to keep in mind is that Google values the experience they are providing to each searcher. Any changes you make to your website that improve your visitors’ time on your site will pay off in the long term, both by protecting you from being slapped with a Panda penalty and by ensuring your safety from future algorithm updates.
And if you have already been slapped by the Panda update, don’t blame Google. Just try to provide people with the best information and do what’s best for your visitors, as that will probably cause your site to rank well on Google in the long run.
So what do you think about the Panda update?
PS: If you are having issues with the Panda update feel free to leave a comment or email me. I’ve gotten 8 sites out of it so far.