This is a guest post by James Hewson, a freelance writer living in the UK and working for an SEO company, RMGSEO. Find out in this post how James managed to avoid the Panda Farmer for himself and his clients without the loss of traffic that many other websites are seeing after Google’s Panda Farmer update.
I am sure many of you reading this will have heard about the recent algorithmic changes Google has made to its US and now all English-speaking indexes. For those who have not, they were aptly named the ‘Farmer’ and ‘Panda’ updates – I like to refer to them combined as the ‘Panda Farmer’!
The first update, only a few weeks ago (Farmer), was a major change in the ranking algorithm used by Google and impacted around 12% of search queries, affecting websites large and small. Following the fallout of this update (a number of high-profile websites were dropped right out of the first few pages), some tweaks were made and a further, smaller update was rolled out to mop up the mess (Panda).
In simple terms, the update attempted to clean up Google’s search results by only showing sites worthy of holding the top spots. Among the factors targeted were duplicated and scraped content, RSS aggregator sites and general content farms.
To a certain extent I can see how this has improved results for users, but in my case the update has changed very few of the positions I monitor for my clients.
How Did I Avoid The Panda Farmer?

I avoided the updates for a couple of fundamental reasons:
- All content I create for my clients is detailed, in-depth and no less than 500 words. I always ensure that web pages are robust and can stand up to changes now and in the future (future proofing, if you like) by publishing only quality, unique content.
- Backlinking campaigns that I oversee for my clients are only directed towards sites with a specific ‘domain authority’, a metric devised by the highly acclaimed SEOmoz (seomoz.org). It provides an effective indicator that any links you are thinking of placing are only left on websites which have some degree of clout. You can check out the tool here – it is called ‘Open Site Explorer’.
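The second point boils down to a simple filter: only pursue link placements on domains whose authority score clears a minimum bar. Here is a minimal sketch of that process in Python – the domains, scores and threshold are all made up for illustration, and the `worth_pursuing` helper is hypothetical, not anything provided by Open Site Explorer itself (in practice you would look the real scores up in that tool):

```python
# Hypothetical authority scores for candidate link sources.
# In a real campaign these would come from a tool such as Open Site Explorer.
candidate_sites = {
    "established-industry-blog.example": 62,
    "thin-article-directory.example": 12,
    "respected-news-site.example": 78,
    "scraper-aggregator.example": 8,
}

DA_THRESHOLD = 40  # illustrative minimum authority worth pursuing


def worth_pursuing(sites, threshold=DA_THRESHOLD):
    """Return, alphabetically, only the domains whose score meets the threshold."""
    return sorted(domain for domain, score in sites.items() if score >= threshold)


print(worth_pursuing(candidate_sites))
# → ['established-industry-blog.example', 'respected-news-site.example']
```

The point of the threshold is exactly the ‘clout’ argument above: the two low-scoring domains are dropped before any outreach effort is spent on them.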
The problem many websites faced was the devaluation of links they had gained before being caught by the Panda Farmer. As a result of the update, many hundreds of thousands (if not billions) of links were devalued because they resided on websites which the ranking algorithm now deems to be weak.
Coupled with this, on-page content deemed to be of little value was also devalued. With these two factors combined, any sites which had participated in linking tactics without much due care and diligence about where they left their links saw dramatic downward movements in the SERPs.
There are two important factors to consider for your website going forward:
1. Quality Unique Content
I view content (and so should you) as akin to writing a great song: if it’s catchy and memorable it will stand the test of time and you will receive royalties (visitors) for many years to come – quality content never goes out of date.
2. Backlinking Campaigns
It is far better to take your time and obtain only quality, authoritative links to your website. Using Open Site Explorer to vet prospective links is quite straightforward and will ensure that you are only sharing your website with respected domains. The result is a linking profile made up solely of respected domains, which will sustain your website’s positions far better.
When you step back and take a look at the changes, nothing here is groundbreaking. If you toe the line and maintain quality in all areas of SEO, your website will stand proud today and into the future.
Let me know your thoughts on this simply by using the comment form below. Tell me whether you have taken a different approach to your website content and backlinking tactics after the Panda Farmer update. I am also interested to know what impact Google’s Panda Farmer update has had on your blog/website traffic. So again, use the awesome comment form of SmartBloggerz below to let me know ALL THAT!
Speaking of devaluation of links, what then happens if one of the sites contains links that aren’t of any value, while that site is then used to point to other sites of similar content? Or how many links is then considered a huge loss (200? 1,000?)?
Thanks James for writing and helping out with the Open Site Explorer resource. 😀
A good question; this is really how websites en masse end up lower down the rankings. The knock-on effect of the link devaluation means that websites which received a link from said website will also receive less ‘love’ from that link and, as a consequence, lower rankings (all things being equal). The amount of loss is relative to the niche you are in – the larger and more competitive the niche, the less effect a larger number of devalued links will have. If you are in a smaller niche where a typical link profile is in the low hundreds, then losses akin to what has happened in the Google Panda update will affect you a lot – the trick is to gather high-authority links in the first place and you will avoid the problem!
Got it! The better approach is still to create good content as the first priority and to know where to build the right links. Thank you for your quick response, James!
Hi James,
I feel it’s a nice step forward by Google. It will really help to clean up the mess that has been created in the recent past and will allow only unique, quality content to stay on the web.
I think sites with natural backlinks survived.
Thanks for explaining this, I’ve been really curious about the Panda update but I’ve found a few great blogs (including this one) to help me better understand what’s going on. Honestly, it seems like what Google is doing is a good thing. From a developer / SEO standpoint I can understand the disappointment in links being devalued, but from a consumer standpoint I really like that Google continues to fight spam and prioritize ‘high quality’ websites. Thanks again for the post and especially for the Open Site Explorer! I’d never seen that before.
Thanks Jon – you are right, all Google are trying to do is return the best quality results for any given search – the concept is not rocket science at all. Abide by their guidelines, apply ethical and constructive promotional methods to your websites, and they will stand the test of time…
Hi James,
How you avoided the Google Panda sounds brilliant. I remember reading about this new update from Google but never actually had the time to understand it from every angle until I noticed top sites with high PRs suddenly dropping in the rankings. I realised then that a process so effective has to be thoroughly understood. I feel better informed now, and let’s hope that with the help of Google Panda there won’t be much clutter around.
That’s really great to hear that you missed the panda farmer. No loss to boot. This helps when working with my clients to realize that good quality content and backlinks are so important to stay on top.
There are pretty mixed reactions on the web regarding Google’s rollout of Panda, but I personally feel it is a good move by Google. At least we can now find unique content on the web rather than just gazing at content theft and content farms.
I find it very unlikely that Google will start penalising sites that are linked to from “weak” sites, because that is something you have no control over. If that were the case, companies would soon start campaigns to get links to their competitors from poor sites (easy peasy).
I think you are looking at this from the wrong angle; I am not saying that blasting a ton of low-quality links at your competitor will shove them down the rankings because, as you say, Google is smarter than that and openly states these practices cannot be used against competitors. What I am saying is that low-value links pointing to your website will not provide the power they once did. Over time these types of links, upon which many websites have built their rankings, are slowly being devalued and, as a consequence, the rankings are falling. My point is that to weather any potential storm in the future, build only high-authority links, which will be resilient to any future changes – this is the process we adopt, and we have found no negative effects on rankings…
Sites which were updated regularly and had good keyword targeting were also saved.
I quite like your quality content = great song analogy. I think it’s a pretty good move by Google. You deserve your spot in terms of the quality of your site.
Hi James,
I think “Big G” has done a great job. They have just cleaned all the mess out of the search engine. Those who stole content from the original web pages were getting all the credit instead of the actual websites. They didn’t even cite any source for the content, yet they were standing firm on the front page of Google. So visitors were redirected to the totally wrong source, and I think it was a great disappointment for both the visitors and the actual source. I’m really glad that Google has taken the right decision now. It is very beneficial for visitors and also for the actual website owners.
Thanks
Lucus
Because of this change in the algorithm, sites with stolen content were thrown back. I want to say that all the copycats were thrown out. In a 500-word article, the actual substance of the post can be explained.