There’s a lot of buzz about Google’s so-called 950 penalty. Webmasters gave the problem that name because pages that have traditionally performed well in the SERPs are dropping from top-ten positions to around the nine-hundred-fiftieth position. The key thing about this penalty is that it affects only select pages on a website, not the entire site.
A WebmasterWorld.com user with the handle randle was quoted as saying:
Here’s a little summary of our experiences/thoughts so far. I think one of the difficulties in these situations is lots of people have different variations on what happened. Some are suffering from this, others maybe from something else. I have always wondered if that’s by design, it seems so challenging to really pin down concrete, common traits.
Anyhow, this is beginning to prove very challenging; each day that goes by indicates it’s a fundamental shift, and it’s here to stay. I’m seeing sites we do not own suffering from this that I never thought would incur a penalty, and then I’m seeing sites I always considered to be living on the edge with no effect at all.
- It’s got to be a penalty; years on the first page, then in an instant position #998.
- It’s targeting well-searched keywords (phrases); ones you must have optimized for, at some point, in some way.
- It’s targeting terms you have traditionally ranked well for (probably from optimizing).
- When it hits a site, it seems to drag down the entire site; not all of it goes all the way to the 900s for sure, but overall rankings for interior pages are not doing as well either.
- Get hit, and you can go all the way to the end, right up to position 1,000, but not past it.
- For us, it happened to a handful of sites, all at the same time, exactly on the night of January 1st.
Although randle’s hunch is that it might have something to do with over-optimization, the user with the handle landmark appears to have a better idea of what is happening: collateral damage from scraping.
The thing that we’ve noticed is that during the penalty period, a search for a snippet from our site shows scraper sites first, with ours last or near the bottom of the list.
The critical thing is that sometimes just one sentence may have been copied from a page that has hundreds or thousands of original sentences. Just that tiny fragment causes the penalty.
Also, in a search for a snippet our page is listed either first or last. It’s as if G decides that if we are not the source of the content then we must be the worst offender possible – the PR0 and scraper sites all get listed ahead of us. It’s like G decides to hit the quality sites hardest. Hard to understand what kind of twisted logic could cause this.
Weirder still – removing the copied snippet from our page has no effect. G “remembers” that your page was penalized and keeps the penalty even though there is no longer any duplicate content.
From reading many of the posts, scraping does seem to be the likely issue; specifically, Google being unable to determine which page is the authoritative source of the content, but still choosing one of the duplicates as the authority. Picking an authority is common practice; what’s new is the alleged penalty applied to the pages that weren’t chosen.
This began happening after Google’s spam-fighting machine, Matt Cutts, got back from vacation, so it may just be that he’s well rested and ready to keep fighting the spam wars! It will be interesting to see how this plays out, and whether Matt decides to discuss it on his blog.