One of the more interesting articles from the world of search this week was this one carried by the Wall Street Journal, covering how HubPages have loosened the Google Panda death grip. In that article the HubPages CEO claims he saw improvements in the rankings of his articles once they were moved onto a sub domain. This was first brought to my attention by David Harry, who runs the SEO Dojo and Reliable SEO:
This was before the Wall Street Journal broke the story, so there was no way of knowing which site it was. David explained in another update that the site owner had apparently noted improvements in the rankings of pages that were nuked by the Google Panda update once those articles were moved onto sub domains. I thought my response was intelligent enough – follow me on Google+ if you want more content like this.
The HubPage Fix
HubPages are going to move all their content onto sub domains. This structure can work for them, as they will create a sub domain for each author.
This allows Google to devalue sub domains with rubbish content without penalizing every sub domain for the mistakes of a few. It is a drastic measure, but for sites that were held up as the poster boys of the Google Panda update, something drastic is needed.
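To make the restructuring concrete, here is a minimal sketch of the kind of URL rewrite such a move implies. The domain, author name, and path pattern below are my own illustrative assumptions, not details from the article; in practice the old URLs would also need server-side 301 redirects to the new ones.

```python
from urllib.parse import urlsplit, urlunsplit

def to_author_subdomain(old_url, author):
    """Rewrite an article URL from the core domain onto a per-author
    sub domain, e.g. site.com/hub/my-article -> author.site.com/hub/my-article.
    Purely illustrative; real migrations pair this with 301 redirects."""
    parts = urlsplit(old_url)
    new_host = "%s.%s" % (author, parts.netloc)
    return urlunsplit((parts.scheme, new_host, parts.path,
                       parts.query, parts.fragment))

print(to_author_subdomain("http://hubpages.com/hub/my-article", "janedoe"))
# -> http://janedoe.hubpages.com/hub/my-article
```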
Will it Work?
The core question for this strategy is “How are sub domains treated since Panda was unleashed?”. The last really informative post on sub domains and sub folders was from Matt Cutts in 2007, where he suggested sub domains were being treated LESS like independent sites. But from an SEO perspective, I have always thought of sub domains as separate entities that need to build up their own trust and authority. I suspect a lot of how Google treats your sub domains depends on how heavily you interlink them with your core site (but that is speculation on my part).
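If you wanted to audit that interlinking yourself, a crawl could classify each link on a page as staying on the same sub domain, crossing to another sub domain of the same site, or pointing externally. This is a rough sketch under my own assumptions (absolute URLs, a known core domain `example.com`); nothing here comes from the article.

```python
from urllib.parse import urlsplit

CORE_DOMAIN = "example.com"  # assumed core site for illustration

def classify_link(page_url, link_url):
    """Classify a link found on page_url as same-subdomain,
    cross-subdomain (interlinking within the site), or external."""
    page_host = urlsplit(page_url).netloc.lower()
    link_host = urlsplit(link_url).netloc.lower()
    if link_host == page_host:
        return "same-subdomain"
    if link_host == CORE_DOMAIN or link_host.endswith("." + CORE_DOMAIN):
        return "cross-subdomain"
    return "external"

print(classify_link("http://author.example.com/post", "http://example.com/"))
# -> cross-subdomain
```

Tallying the `cross-subdomain` share across a crawl would give a crude measure of how tightly the sub domains are tied to the core site.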
If HubPages do roll this out for all authors, surely their content will not return to its old positions, as the equity, trust etc. built up by the core domain will not trickle through to the sub domains as it would have for sub folders. The sub domains will need to garner their own link profiles, and it will be a lot harder to rank sub-par articles on domain strength alone (which is what Google wants).
The Only Solution
This may work for HubPages, but for how long and to what extent won't be known for some time. I am not sure running around discussing it is the best option their CEO has ever taken (ego vs performance); it always reminds me of the line from ShoeMoney: “Don't make Google look stupid”. My original suggestion of moving to a new domain isn't that far-fetched in terms of how sites may have to deal with Google Panda. The Panda update is not part of the core algorithm, so affected sites have to try to clean up their act, then wait and hope that the next time it is run they are given some of their traffic back. To do this, sites will have to figure out which content is dragging down their core domain. They can do that by moving content onto sub domains and shutting down those that underperform, or by moving all the great content to a new domain and starting again. Neither is a great option, and the choice really depends on how much of a brand your site is.