Google Panda Update Help Guide – How To Recover Or Prevent Getting Slapped

Your website or blog could be hit by a Panda update and you might not even know it, or when it happened. Google used to confirm each update by tweeting about it or releasing a formal statement, but Panda has now been integrated into the existing indexing processes. It means Panda now works continuously within the current algorithm.

You will no longer be able to pinpoint which update might have caused your search traffic to drop. You’re basically going to have to start thinking about preventive measures instead of waiting for it to happen and then trying to solve the problem.

To do that, you first need to know what makes your website vulnerable to Panda. Mostly it comes down to duplicate and low-quality content on your domain, as well as how original your content is.

Some other factors that might trigger Panda to filter you out include site speed (if your site loads too slowly you will have a high bounce rate), over-optimization techniques, and content farming.

You should also keep in mind that a Panda hit is a sitewide penalty. A few low-quality pages on your blog could be hurting your entire site.

When Panda was first rolled out in 2011 it was supposed to target sites with low-quality content and scraper sites. Google did target content farms specifically with the Farmer update, but I don’t think they did a good job of getting rid of scraper sites. They still exist today and continue to outrank the original source with scraped content.

Anyway, if you follow my blog you know that I was hit with an unnatural link penalty last year, which I was able to get revoked almost a year later. I thought my site would bounce back immediately, but it didn’t. In fact, believe it or not, my search traffic dropped right after the penalty was revoked.

My first thought was that maybe it just takes time, but after a month I realized I had probably also been hit by an algorithmic penalty like Panda. So I started researching and decided to make some changes to the site.

I needed to look at my blog from a different perspective, because it’s difficult to tell yourself, “hey, this page doesn’t deliver any value”.

I went through all the posts I wrote years ago and found some serious issues. I had pages with no more than 300 words on them! WTF.

Ok, now I was on a mission to see what changes I could make to improve my search traffic. It was time to do some real on-site search engine optimization.

So here is what I did so far:

Step 1 – Remove low-quality pages

This was hard for me to do because I don’t like the idea of deleting posts from my blog but it had to be done. I went to my WordPress dashboard and started looking through old posts from years ago.

I found a lot of posts that I would never publish today. Mostly crap with fewer than 300 words that I posted just to rank for some keywords. Yep, I did that!

I wasn’t a good content writer when I first started this blog. In fact, I sucked at it and you could see right through it. The only thing I knew how to do was optimize posts and get them ranked.

So I decided to do some cleaning up and removed almost 40 posts from this blog over the last two weeks. I dropped them completely and made sure to serve a 410 (Gone) status to Googlebot every time it tries to crawl them again.

If you wonder how I did it with the push of a button, check out the free plugin called “410 for WordPress”. You can install it within seconds right from your WP dashboard.

The reason you want to serve a 410 instead of just leaving those pages as 404s is that a 410 is apparently a stronger signal for Google to drop them from its index.

You could also try submitting a URL removal request in your Webmaster Tools account, but that didn’t seem to work for me for individual posts.

The above plugin is great though. Not only does it let you manually add the posts you want to serve a 410 for, it also shows you which URLs recently generated a 404 error on your domain, so you can add them to the list of obsolete URLs with the push of a button.

You need to be careful though. Do not just blindly select all of them, because some URLs might have query parameters attached to them and actually exist on your site.
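If you would rather skip the plugin, here is a minimal sketch of the same idea in plain WordPress PHP – my own rough take, not the plugin’s code. The list of paths is a hypothetical example of posts you deleted on purpose; the snippet goes in your theme’s functions.php and sends a 410 (Gone) status instead of the normal 404:

<?php
// Minimal sketch: serve "410 Gone" for posts that were removed on purpose.
// The paths below are hypothetical examples – replace them with your own.
add_action( 'template_redirect', function () {
    $gone_paths = array(
        '/some-old-thin-post/',
        '/another-deleted-post/',
    );

    $requested = parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH );

    if ( is_404() && in_array( $requested, $gone_paths, true ) ) {
        status_header( 410 );      // tell Googlebot the page is gone for good
        nocache_headers();
        exit( 'This page has been permanently removed.' );
    }
} );

The plugin is still the easier route since it keeps the list in your dashboard, but the snippet shows everything that is really going on: match the dead URL, send a 410 instead of a 404.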

Step 2 – Get rid of duplicate content issues

Believe it or not, your WordPress blog generates duplicate pages that might be pulling your entire site down in search results.

I used to let Google crawl my sub-pages (the paginated pages linked from the homepage that lead to previous posts). The problem with them is that they use the same title and description as your homepage – at least that’s what was happening here on my blog.

The only difference is that “Page 2”, “Page 3”, and so on get appended to the titles. I can see how that causes a problem, and Google actually warns you about these issues in your Webmaster Tools account. I never really paid attention to it, but now I do.

So to tackle that problem I simply went into my Thesis Site Options dashboard and added “noindex,nofollow” to the meta robots tags for all sub-pages.

I also added it to tag, archive, and category pages – except that on category pages I kept the links followed (“noindex,follow” instead of “nofollow”).
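If your theme doesn’t have options like Thesis does, here is a rough sketch of doing the same thing by hand – an assumption on my part about one way to wire it up, not what Thesis actually outputs. It prints a robots meta tag in the head of sub-pages, tag pages, and date archives, and a softer one on category pages:

<?php
// Rough sketch: noindex the pages that tend to duplicate the homepage.
// Goes in functions.php; Thesis and most SEO plugins handle this for you.
add_action( 'wp_head', function () {
    if ( is_paged() || is_tag() || is_date() ) {
        // Sub-pages, tag pages, and date archives: keep them out of the index.
        echo '<meta name="robots" content="noindex,nofollow">' . "\n";
    } elseif ( is_category() ) {
        // Category pages: noindex, but let Googlebot follow the links on them.
        echo '<meta name="robots" content="noindex,follow">' . "\n";
    }
} );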


Another issue I had with my blog’s sub-pages is that spam bots were generating URLs with query parameters such as:

?refer=somespam.com

?refsite=somespam.com

?referencement=somespam.com

Then Google would index them even though I had canonical URLs on them. I wanted them removed quickly, so I disallowed Google from crawling that entire directory through my robots.txt file and requested a directory removal through Google Webmaster Tools.

To do that, click on “Remove URLs” under the Optimization tab in your account and then follow these steps:

1. Click on “Create a new removal request”.

2. Enter the full URL of the directory you want removed (in my case it was “page”).

3. Select “Remove directory” from the drop down menu and submit request.

Important: Make sure to block Googlebot from crawling that folder in your robots.txt file before submitting the request.
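For reference, here is roughly what those robots.txt rules could look like. The “page” directory matches my setup, and the wildcard lines mirror the spammy query strings listed above – adapt both to your own site:

# Rough example – adapt to your own permalink structure.
User-agent: Googlebot
# Paginated sub-pages (the "page" directory I asked Google to remove)
Disallow: /page/
# Spam-bot query strings, anywhere on the site
Disallow: /*?refer=
Disallow: /*?refsite=
Disallow: /*?referencement=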

After you hit “Submit Request” it will take no longer than 24 hours for them to remove it from search results 🙂

Ok, so now we have removed the spam URLs generated by spam bots and blocked the entire folder to avoid duplicate content issues in the future. Nice!

Step 3 – Get rid of meta keywords and remove tag pages (over-optimization issues)

Adding keywords to the meta keywords tag used to work back in the day, but now it may actually hurt you, so I went through as many posts as I could find that still had keywords in their meta tags and removed them. I stopped using meta keywords a long time ago, but I knew I had some posts from years back that still had them.

Another thing I did was remove tag pages completely from my blog. Again, they were also causing duplicate content issues, so they needed to go. Plus, using tags heavily nowadays can make you look like you are doing keyword stuffing.

I removed all of them by going into my WP dashboard under “Tags” and also made sure that Googlebot would see a 410 error the next time it tries to crawl them.

I didn’t have to submit a removal request for the tag directory because those pages were already marked noindex,nofollow – I just didn’t want the tags appearing on my posts at all.
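If you have hundreds of tags, deleting them through the dashboard gets tedious. Here is a rough sketch of one way to bulk-delete them with a throwaway PHP script – an assumption on my part, not how I actually did it (I used the dashboard), and you would still serve the 410s as in Step 1. Back up your database before running anything like this:

<?php
// Throwaway script: delete every tag on the site. Back up your database first!
// Put it in a temporary file in your WordPress root, run it once, then delete it.
require_once 'wp-load.php';   // loads WordPress; adjust the path if needed

$tags = get_terms( array(
    'taxonomy'   => 'post_tag',
    'hide_empty' => false,     // include tags that have no posts attached
) );

foreach ( $tags as $tag ) {
    wp_delete_term( $tag->term_id, 'post_tag' );
    echo "Deleted tag: {$tag->name}\n";
}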

Step 4 – Make sure your homepage is not over-optimized

My blog’s homepage was in fact over-optimized, in my opinion. I used to have a lot more static content there, with my main keywords inside h1 tags 🙂

The thing is, in my opinion Google wants to rank individual pages now, not homepages (yes, Google, I get the message).

So I removed the static content and h1 tags. I made the homepage simple and moved the recent posts closer to the top.

Step 5 – Remove broken links

You would be surprised how many broken links you have on your blog inside posts and comments. I got rid of them all by installing the Broken Link Checker plugin.

It’s a very useful plugin because it automatically scans your site for all broken links. Then you can just remove them all with 1 click.

Step 6 – Speed up the load time of your blog

I use the W3 Total Cache plugin on this blog, but I hadn’t updated it in a while and the way I had it configured before was making my blog load slowly.

The first thing I did was turn off minification of JS and CSS files – something was broken there, so I needed to take care of that. I also updated the plugin to the latest version. I had issues when trying to update it, so I disabled the plugin completely and then made the switch to the newer version.

Next, I set it to use “Memcached” for the Database, Browser, and Object caches. The Page cache I left at “Disk – Enhanced”.
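One gotcha worth checking before you select Memcached: the PHP extension has to be available on your server, otherwise W3 Total Cache can’t use it. Here is a quick throwaway check – the host and port are the usual defaults and an assumption on my part, so adjust them for your server:

<?php
// Quick sanity check before selecting Memcached in W3 Total Cache.
// Run once from a temporary file on your server, then delete it.
if ( ! class_exists( 'Memcached' ) ) {
    die( 'The Memcached PHP extension is not installed.' );
}

$mc = new Memcached();
$mc->addServer( '127.0.0.1', 11211 );   // typical default host and port
$mc->set( 'w3tc_test', 'ok', 60 );      // store a test key for 60 seconds

echo ( $mc->get( 'w3tc_test' ) === 'ok' )
    ? 'Memcached is reachable.'
    : 'Extension loaded, but the memcached server did not respond.';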

Plus, I am using Amazon CloudFront for my CDN.

After getting everything set up the way I wanted, I ran a site speed test through Pingdom to check the new load time.

Keep in mind that before all the changes my load time was about 5-6 seconds!!

I know there is still room for improvement but I will get to it later.

After I removed all the low-quality pages and took care of the duplicate content issues, I submitted a request to Google to recrawl my site.

You can do that from your Webmaster Tools account. Simply click on “Fetch as Googlebot” under the Health tab and then hit the “Fetch” button for your homepage.

When it’s done successfully fetching your homepage, you can click on the “Submit to index” button and tell Google to crawl the page and all linked pages.

Once you do that Google will start recrawling your entire site within 2-3 days.

The crawl stats graph in my account showed Googlebot going crazy after I submitted the request, working through all my pages within days. I was also glad to see that my page load time was going down fast.

Now I just need to wait and see if all these changes will actually have an impact on my performance in the SERPs.

I did notice an increase already though.

Could be just a fluke but I hope it’s going to keep going up from now on.

If you lost your search traffic but can’t figure out why, you should probably go through the steps above and see if that helps. It seems to have helped me, and chances are it might do the same for you.

Of course, you can’t have any manual actions against your site. If you are under a manual penalty you won’t be able to recover no matter what you do. If you are 100% sure Google took manual action against you, then work on getting that removed first.

Then, when you get it revoked, see where your site ends up in the SERPs and go from there.

I would definitely check for duplicate content issues, though, and delete low-quality pages that you probably forgot you even have on your site.

Anyway, hope this helps someone out there.


