There has been no official word from Google about an update, though a lot of people have been grumbling in the forums about something going on. When Barry commented on this, Google stated that there was no major update but that the algorithm is always changing. Google may be changing how they detect bad links. If I understand it right, the idea is that instead of devaluing bad links in batches every time Penguin refreshes, Google is now devaluing bad links as they crawl.
We are wondering if Google is starting to put into use the information they are getting from the disavow tool. Say a whole pile of websites have included spammyarticles.com in their disavow.txt file. Google evaluates the site, decides that it only exists to provide spammy backlinks, and devalues all links coming from it. I have no proof for this, but it's a possibility. We all know that Google has algorithms to detect bad links, and the evidence of this was very visible earlier in 2012. With this latest change, though, Google seems to have developed the capability to continually detect and devalue bad links as part of the standard crawl process, something they were never very good at before. The net result is that as Google completes a crawl of a website and the sites linking to it, they can devalue those links in near real time rather than waiting to roll out a major update such as Penguin or apply a manual link penalty.
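For readers who haven't used the disavow tool: it accepts a plain-text file uploaded through Webmaster Tools, with one URL or domain per line. A minimal sketch of such a file might look like this (the domains below are hypothetical examples, not real offenders):

```text
# Contacted the webmaster on 1/11/2012, no response.
# Disavowing the entire domain:
domain:spammyarticles.com

# Disavowing individual pages only:
http://example-directory.com/links/page1.html
http://example-directory.com/links/page2.html
```

Lines starting with `#` are comments, the `domain:` prefix disavows every link from that domain, and a bare URL disavows only that specific page. If Google is aggregating these submissions across many sites, a domain that appears in thousands of disavow files would be an obvious candidate for automatic devaluation.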
A lot of the sites that are suffering are ones that may have been penalized before due to bad links and have not yet removed them all. However, some are losing ground from a strong position, which indicates the latest change is not confined to sites that were hit in the first half of 2012. Google has made it clear that the disavow tool works on a crawl basis: all submitted sites must be crawled before their link weight can be cancelled out. So there is every reason for them to have merged this capability with the "bad site detection" algorithms they already have in place. Realistically, Google has to move towards a continual process, because manual penalties are not scalable in the long term and they much prefer algorithms over people.
I have only included the past few weeks' data rather than the full year because I didn't want to identify any sites in particular. We would be interested to hear from you in the comments if you have noticed any sites on a gradual slide recently, and whether they had previously been penalized or not.
Google has informed me that there was no update.
“It’s not Panda. We asked around a little bit and didn’t uncover any other likely suspects, but remember that Google launches over 500 algorithmic changes per year, so there are always small changes rolling out.”