Is duplicate content harming your site? Unless you’ve been proactive, that may well be the case. You’ve probably never deliberately put up two pages with the same content, but your site may still contain identical pages for a number of technical reasons.
It’s a problem you need to deal with. Google began dealing seriously with duplicate content in its first Panda algorithm update in 2011, and the process has been ongoing. Google’s recent Panda 4 update continued to act against duplication and seriously affected some established websites.
How to Avoid Duplicate Content
Duplicate content can occur for a range of reasons. One of the most common arises from page and post naming in CMSs such as WordPress. Another major cause is the way Google recognises web pages: www.yoursite.com, yoursite.com, yoursite.com/index.html and yoursite.com/ all point to the same page, but they are counted as separate URLs by Google.
You can handle the issue through the use of canonical tags, such as `<link rel="canonical" href="https://www.yoursite.com/page/">`, which tell the search engines which version of a page you’d like to appear in the SERPs. Ask your Sydney SEO agency for more details.
Adding a canonical tag of this kind to the head section of your page also helps with the duplicate page problem caused by parametered URLs. These are strings tacked onto the end of base URLs by analytics programs and other systems. The same thing happens when ecommerce sites use session IDs to track customer behaviour.
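To see how parametered and variant URLs collapse into one canonical address, here is a minimal Python sketch. The list of tracking parameters and the preference for the www form are illustrative assumptions, not rules your site must follow:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Example tracking/session parameters to strip (illustrative, not exhaustive)
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "sid"}

def canonicalise(url):
    scheme, netloc, path, query, _ = urlsplit(url)
    # Drop tracking parameters but keep meaningful ones (e.g. page numbers)
    kept = [(k, v) for k, v in parse_qsl(query, keep_blank_values=True)
            if k.lower() not in TRACKING_PARAMS]
    # Normalise the host to the www form (an assumed site preference)
    if not netloc.startswith("www."):
        netloc = "www." + netloc
    # Strip a trailing index page so /index.html and / count as one URL
    if path.endswith("/index.html"):
        path = path[: -len("index.html")]
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

print(canonicalise("http://yoursite.com/index.html?utm_source=news&page=2"))
# → http://www.yoursite.com/?page=2
```

The canonical URL produced this way is what you would place in the page’s canonical tag, so all the variants point the search engines at a single address.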
Likewise, you should avoid publishing pages with similar content based on keyword variations such as ‘best ceramic tiles’, ‘top ceramic tiles’ and ‘quality ceramic tiles’. This technique was once effective, but it is now likely to raise red flags. It’s better to create one authoritative page based on the questions users are asking.
Content Duplicated on Other Sites
Ideally, the content on your site should be unique. On occasion, you may wish to distribute your content via syndication or some other means. Unless you take steps to prevent it, if another version exists on the web, Google may index that version even if yours is the original.
Duplication can also happen when unscrupulous webmasters use bots to scrape content via your RSS feed. One way to combat this is to set your RSS feed to publish only excerpts. Another strategy is to use a plug-in that automatically inserts a link back to your site into any scraped content. However, you need to be wary of attracting links from bad neighbourhoods.
Dealing with duplicate content issues is a major step in keeping your site Google compliant and, hopefully, ranking well in the SERPs. But no one knows what Google will do next. This means it’s wise to entrust your search engine optimisation needs to a professional firm that constantly keeps up with the latest policies and practices of the search engines.
Contact Smart SEO for advice on all your content marketing needs.