In the past we’ve discussed the importance of unique online content across websites, because if you duplicate any kind of content, Google will find it and could penalise your site in search rankings.
But just how does Google find and respond to this kind of duplicate content?
What Google does first is identify where duplicate content exists online and collect together all of the URLs that share the same content. It then takes a look at them to decide where the content came from and makes what is called a cross-domain URL selection.
According to the Official Google Blog post about the subject:
“When we discover a group of pages with duplicate content, Google uses algorithms to select one representative URL for that content. A group of pages may contain URLs from the same site or from different sites.”
The URL (or URLs) that were not selected may then show up much further down in Google’s rankings, or not show up at all.
Therefore it’s important that you and your web team understand how to tell Google which is the right URL. Helpfully, Google is also launching new Webmaster Tools messages which will tell your web team when its algorithms have selected an external URL instead of one from your website.
Firstly, your web team can add specific tags to the code to show Google that your site is the original it should prefer over all others, whether there’s a spam website stealing your content or you just have similar content across different country sites.
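One common way to do this is a cross-domain canonical tag in the page’s `<head>`. Here’s a minimal sketch (the URL is just a placeholder, not a real address):

```html
<!-- Placed in the <head> of the duplicate or syndicated page. -->
<!-- Tells Google which URL should be treated as the original. -->
<link rel="canonical" href="https://www.example.com/original-article/" />
```

Google treats this as a strong hint rather than a command, but in most cases it will consolidate the duplicate URLs onto the one you point at.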
Here are some of the top scenarios when Google may pick another URL over yours (the right one):
- We advise that your web team does not use duplicate content across websites. However, some people do still use duplicate or similar content, particularly when the same site exists in different languages or regions. To stop Google penalising you quite so much for this, you can use specific pieces of code to signal to Google that this is a deliberate multi-regional setup. There’s a specific area of Google’s help pages all about multi-regional and multilingual sites.
- Sometimes there may be hosting errors or server misconfigurations, and Google can’t tell that two pages don’t actually have the same content. This would involve a web team having a look at how your sites are set up and correcting any problems.
- Another common problem is that malicious code may have worked its way into your website somehow. In this instance it can get a little confusing and a web team will need to look through your code to find the problem.
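For the multi-regional scenario above, one signal Google documents is the `rel="alternate" hreflang` annotation, which tells Google that two pages are regional variants rather than duplicates. A sketch with placeholder domains and URLs:

```html
<!-- Placed in the <head> of each regional version of the page. -->
<!-- Each version lists itself and its counterparts, so Google -->
<!-- serves the right one to the right audience instead of -->
<!-- treating them as duplicates. URLs here are placeholders. -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/page/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/page/" />
```

Your web team would repeat this pattern for every language or region version of the page.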
There are lots of instances when your website could get penalised for duplicate content (or at least what Google thinks is duplicate content!) but with the new Webmaster Tools messages it’s easier than ever to be alerted to a problem. The key thing is to have a web team on hand who can quickly and efficiently sort out any issues so that your Google ranking doesn’t suffer in the long run. Here at Codastar we’re used to sorting out all kinds of problems with Google, from duplicate content to malicious code, so give us a call if you need a chat.
[Image via woodleywonderworks’ Flickr]