What Will Get Your Website De-Indexed by Google
When you search the Internet for information, you want results that are relevant to your query and that provide useful information you can trust. That's what Google wants too: websites that offer high-quality content for its users, content they like and want to share with others. Duplicate content, on the other hand (the same content used on two or more different pages), does not meet these criteria. It only clutters the search results, and although there is no explicit penalty for duplicate content, it can still hurt your website's rankings. That is why duplicate content matters for your website's SEO.
Is Duplicate Content Always Bad for SEO?
Google largely treats duplicate content, whether reused within a website or spread across the Internet through web syndication, as an attempt to manipulate its algorithms and established optimization rules and to influence the search engine results. If you are duplicating content to get more eyes on it and push it higher in the results, you'd be safer to stop. Google will always penalize spam and spam-like activity on websites so it can serve only the highest quality links to its users. To find duplicates on your website, you can use the duplicate content checker PlagSpotter: it instantly finds copies of your web pages and automatically scans, detects, and monitors your pages for duplicate content.
Syndicating your content on other websites may be interpreted by Google's algorithms as duplicate content. The syndicated copies will then only be displayed in the supplemental results; their entries are omitted from the main results because they are very similar to results already displayed. This may not be the case if the website where you've syndicated your content has higher site authority as rated by Google, which can rank it higher in the search engine result pages for the specific keywords.
Google is fine with a few slightly different URLs for certain pages, such as http://yourwebsite.com/ and http://yourwebsite.com/index.htm. These could be indexed as two pages of duplicate content, but are instead treated as accidental duplicates: Google simply chooses the best URL to represent all the others.
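If you would rather not leave that choice to Google, you can signal the preferred URL yourself with a canonical link. As a minimal sketch, using the example addresses above, each variant page would carry this tag:

```html
<!-- Placed inside the <head> of every duplicate or variant page
     (e.g. http://yourwebsite.com/index.htm), pointing at the
     preferred URL. The domain is the example one from the text. -->
<link rel="canonical" href="http://yourwebsite.com/" />
```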
If you have specific versions of your website for different platforms or functions, Google will tolerate this type of duplicate content and won't penalize it, because it serves a specific purpose that users find useful. These duplicate pages will only appear in the search results when a user specifically looks for them, for example by searching for the purpose they serve or the function they provide.
How to Deal with Duplicate Content?
Website owners and SEO experts should avoid creating duplicate content, and for duplicate content already on their websites they should take action to reduce the load on the search engines and help their sites be indexed properly by Google. Here's what you can do about duplicate content on your website:
- Use permanent 301 redirects to let search engine spiders know that you've permanently removed duplicate pages, so they can replace the old URLs in their index with the new ones.
- Link to your index pages the same way every time to help the search engines index your website consistently.
- Ask the websites that syndicate your content to include the noindex meta tag on the syndicated articles, so this duplicate content won't be indexed by the search engines.
- Avoid long copyright notices or repeated boilerplate copy. If you have such text, move it to a separate page and link to it from your content pages.
- If you have pages on your website that have very similar content, like slightly differing pages for different cities on a travel website, expand their content with more unique and useful information about the topics of those pages.
- Create original, engaging content for your website that your users will find interesting and useful. Regular, fresh, and unique content is what Google loves most.
- Use rel=canonical to tell Google which page is the original and which duplicates should defer to it.
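The 301 redirect tip above can be sketched in server configuration. This example assumes an Apache server where .htaccess overrides are allowed; the two paths are hypothetical placeholders for your own URLs:

```apache
# .htaccess — permanently redirect a removed duplicate page
# to the page that replaces it (hypothetical example paths).
Redirect 301 /old-duplicate-page.html http://yourwebsite.com/new-page.html
```

After this, crawlers that request the old URL receive a 301 status and follow it to the new page, which lets them update their index accordingly.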
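The noindex request for syndicated copies is a standard robots meta tag placed in the page's head. A minimal sketch; the optional "follow" value still lets crawlers follow the links on the page, including any link back to your original article:

```html
<!-- In the <head> of the syndicated copy of the article -->
<meta name="robots" content="noindex, follow" />
```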