Beware of Duplications – Here Comes the Solution
One of the major issues on many websites is duplication, which can be either internal or external. Duplication within your own website is usually unintentional, whereas external duplication is intentionally created by content thieves. First, let us look at the various types of internal duplication, which usually arise from careless mistakes made while developing the website.
a. Various Home page versions
A home page linked as www.website.com/index.html, www.website.com/home.php, etc. creates internal duplication of the home page. If www.website.com/index.html has already been crawled by the major search engines, follow the process below to overcome the duplication:
1. Create a new page for the home page, for example www.website.com/home.html.
2. Set up a permanent 301 redirect from www.website.com/index.html to www.website.com/home.html.
3. After the redirection, make sure that www.website.com/home.html is nowhere linked from within the website.
If the URL www.website.com/index.html has not been crawled by the search engines, the issue can be solved by linking the home page as www.website.com/ throughout the website, making sure the home page is not linked as www.website.com/index.html anywhere on the site.
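As a sketch, assuming the site runs on Apache, the permanent redirect in step 2 could be added to the site's .htaccess file like this (the URLs are the placeholder ones used above):

```apache
# Permanently (301) redirect the old home page URL to the new one,
# so search engines consolidate its ranking signals on home.html
Redirect 301 /index.html https://www.website.com/home.html
```

Other web servers (nginx, IIS) have their own equivalents; the key point is that the redirect must be a permanent (301) one, not a temporary (302) one.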
b. WWW and Non-WWW Version
The www version (https://www.dotcominfoway.com/) and the non-www version (https://dotcominfoway.com/) are treated as two different versions of the website. This has to be ruled out by 301 redirecting the non-preferred version to the preferred domain URL. Google Webmaster Tools also provides an option to choose between the www and non-www versions, but other search engines do not offer such a choice. Either way, a 301 redirect is the better option, since it passes the credibility of the non-preferred version to the preferred one.
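A minimal sketch of such a redirect, assuming an Apache server with mod_rewrite enabled and www as the preferred version:

```apache
# Redirect all non-www requests to the preferred www version (301)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^dotcominfoway\.com$ [NC]
RewriteRule ^(.*)$ https://www.dotcominfoway.com/$1 [R=301,L]
```

To prefer the non-www version instead, the condition and target would simply be swapped.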
c. Case Sensitive URLs
Capitalized and non-capitalized URLs of the same page are treated as two different URLs by search engines. For example, https://www.dotcominfoway.com/seo/ and https://www.dotcominfoway.com/SEO/ may be treated as two different pages with similar content, which creates a duplicate issue for the original page. This can be ruled out by permanently redirecting the capitalized URLs to the non-capitalized URLs using a 301 redirect. The canonical link element rel="canonical" can also be used to tell search engines which URL is preferred.
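For the example above, the canonical link element would sit in the head of the duplicate page and point at the preferred lowercase URL, along these lines:

```html
<!-- In the <head> of https://www.dotcominfoway.com/SEO/ :
     tells search engines the lowercase URL is the preferred version -->
<link rel="canonical" href="https://www.dotcominfoway.com/seo/" />
```

Note that rel="canonical" is a hint to search engines, not a directive like a 301 redirect, so the redirect remains the stronger fix where it is possible.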
d. Pagination Issues
Pagination issues mainly arise in websites built with a content management system (CMS): "read more" links on the main blog page, pages sharing the same title and meta description, and so on. These create internal duplicate issues. To avoid this, the meta robots "noindex" tag can be used to keep search engines from indexing those duplicate pages.
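As an illustrative sketch, the tag would go in the head of each paginated duplicate page:

```html
<!-- Placed in the <head> of a paginated duplicate page.
     "noindex" keeps the page out of the search index, while
     "follow" still lets crawlers follow the links on it -->
<meta name="robots" content="noindex, follow" />
```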
e. Printer Friendly URLs
Printer-friendly URLs can affect the search engine friendliness of the website when they are not blocked by the robots.txt file. The meta robots "noindex" tag can also be used to keep search engines from indexing those pages.
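A minimal robots.txt sketch, assuming (hypothetically) that the printer-friendly pages all live under a /print/ path:

```text
User-agent: *
Disallow: /print/
```

If the printer-friendly pages do not share a common path, the meta robots "noindex" tag on each page is the more practical option.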
f. Product Page Duplication
Ecommerce websites have more internal duplication issues than static websites. A single product may have many different URLs serving the same content. The issue can be overcome by properly using the canonical link element rel="canonical" and the meta robots "noindex" tag.
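As a sketch with hypothetical URLs: a product reachable through several parameterized variants would carry the same canonical tag on every variant, consolidating the signals on one clean URL.

```html
<!-- Hypothetical example: both /product.php?id=42&sort=price and
     /product.php?id=42&color=red would include this same tag in
     their <head>, pointing at the single preferred product URL -->
<link rel="canonical" href="https://www.website.com/product.php?id=42" />
```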
All the issues discussed above concern internal duplication; the issue below deals with external duplication.
There are many content thieves out there ready to steal your content and use it on their websites. You can find them with duplicate-content detection tools like Copyscape. If you find anyone using your content on their website without a link back to yours, you can legally file a DMCA complaint with Google against the offending website.
To conclude, here are a few tips you can follow to avoid duplication:
1. Both internal and external duplication have to be monitored frequently. You can monitor internal duplication by doing a Google search with a unique piece of text from your page followed by site:www.yourdomain.com.
2. Proper use of 301 redirects, the meta robots "noindex" tag, the canonical link element, and the robots.txt file can help you a lot in getting rid of duplication issues.
3. Avoid careless mistakes while building your website URLs.
4. Linking to a single, preferred URL for each page of the website helps a lot in avoiding such issues.
5. When building your website with a CMS, make sure that you follow a consistent URL format throughout the website.