Google's Policies: Duplicate Content Penalty | Mixed Content Penalty (Block) | Meta Description..

8 replies
  • SEO
This post is intended to clarify Google's policies and provide the resources pertaining to the Duplicate Content Penalty vs. the Mixed Content Penalty (Block) vs. Meta Description Duplicates.

Note: Google's policies may change after the date of this writing. Please check the listed resources for any updates.

Duplicate Content Penalty

Duplicate Content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin.

Google Webmaster Central Blog
Official news on crawling and indexing sites for the Google index
Demystifying the "duplicate content penalty"
Friday, September 12, 2008

Let's put this to bed once and for all, folks: There's no such thing as a "duplicate content penalty." At least, not in the way most people mean when they say that.

OP.. The main take-aways are:
  1. Google does not penalize a site for content that is also published on a different site.
  2. Google may decide not to index a site and/or page whose content is published on a different site.
  3. Google gives credit to the first site to publish original, unique content. However, Google may decide to rank a page with content from the original source higher. News syndication, for example.
Mixed Content Penalty (Block)

What Is Mixed Content?
Google Developers Web Fundamentals

Mixed content is when a secure web page (loaded through HTTPS) also contains scripts, styles, images or other linked content that is served through the insecure HTTP protocol.

Chromium Blog
No More Mixed Messages About HTTPS
Thursday, October 3, 2019

Today we're announcing that Chrome will gradually start ensuring that https:// pages can only load secure https:// subresources. In a series of steps outlined below, we'll start blocking mixed content (insecure http:// subresources on https:// pages) by default. This change will improve user privacy and security on the web, and present a clearer browser security UX to users.

Google Chrome Will Handle Mixed Content

Currently, Chrome loads pages with mixed content. Beginning in December 2019 with the introduction of Chrome 79, Chrome will do two things:
  1. Chrome will automatically upgrade http:// content to https:// if that resource exists over https.
  2. Chrome will introduce a toggle that a Chrome user can use to unblock insecure resources that Chrome is blocking.
Although this isn't full blocking, it might as well be, because users may opt to back out of a site that displays a security warning.
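The auto-upgrade step can be sketched in Python. This is only a rough illustration of the idea (rewriting http:// subresource URLs to https://), not how Chrome actually implements it, and the example markup is hypothetical:

```python
import re

def upgrade_insecure_urls(html: str) -> str:
    """Rewrite http:// URLs in src/href attributes to https://,
    loosely mirroring Chrome 79's auto-upgrade step."""
    return re.sub(
        r'(\b(?:src|href)=["\'])http://',
        r'\g<1>https://',
        html,
        flags=re.IGNORECASE,
    )

page = '<img src="http://example.com/logo.png"> <a href="https://example.com/">link</a>'
print(upgrade_insecure_urls(page))
# <img src="https://example.com/logo.png"> <a href="https://example.com/">link</a>
```

Note that the real upgrade only succeeds if the resource actually exists over https; otherwise Chrome falls back to blocking it.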

This will be a bad experience for publishers and may lead to fewer sales, visitors, and ad views.

OP.. Search Engine Journal has an excellent article about Mixed Content and what we can do.

Google announced that Chrome browser will begin blocking web pages with mixed content beginning December 2019. Publishers are urged to check their websites to make sure there are no resources that are being loaded using the insecure HTTP protocol.

OP.. The word 'content' in the phrase 'Mixed Content' does not refer to article content. A lot of people on this forum confuse Duplicate Content and Mixed Content with one another since the two phrases are so similar.

OP.. Mixed Content 'only' involves non-secure and secure protocols, i.e. http vs. https.

OP.. A little insight: if you're thinking that your website will not be impacted when scripts, styles, images or other linked content are stored on an http website and rendered on an https website, think again. End result: Blocked!

OP.. Also, any and all scripts, styles, images or other linked content that are copied from an http website A and pasted into an https website B. End result: Blocked! Copy-and-paste from website to website is an all too common practice.

OP.. And last, but not least: where a website started out as an http website and later added https, any or all of the original http scripts, styles, images or other linked content MAY be blocked. The reason this is questionable is that some server admins in my camp did extensive testing and found that results vary, simply because not all servers are configured the same.
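If you want to check your own pages for the insecure references described above, a minimal scanner can be built with Python's standard library. The example markup and URLs below are hypothetical:

```python
from html.parser import HTMLParser

class MixedContentScanner(HTMLParser):
    """Collects http:// URLs found in src/href attributes; any such
    reference on an https:// page is mixed content and risks being blocked."""
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http://"):
                self.insecure.append(value)

scanner = MixedContentScanner()
scanner.feed('<link rel="stylesheet" href="https://cdn.example.com/site.css">'
             '<img src="http://img.example.com/banner.jpg">')
print(scanner.insecure)  # ['http://img.example.com/banner.jpg']
```

A sketch like this only catches static references; resources injected by scripts at runtime need a tool such as the browser's developer console to spot.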

Meta Description Duplicates

Create good titles and snippets in Search Results

Google's generation of page titles and descriptions (or "snippets") is completely automated and takes into account both the content of a page as well as references to it that appear on the web. The goal of the snippet and title is to best represent and describe each result and explain how it relates to the user's query.

Make sure that every page on your site has a meta description.

Differentiate the descriptions for different pages. Identical or similar descriptions on every page of a site aren't helpful when individual pages appear in the web results. In these cases we're less likely to display the boilerplate text. Wherever possible, create descriptions that accurately describe the specific page. Use site-level descriptions on the main home page or other aggregation pages, and use page-level descriptions everywhere else.

If you don't have time to create a description for every single page, try to prioritize your content: At the very least, create a description for the critical URLs like your home page and popular pages.
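The advice above can be automated to a degree. Here is a minimal Python sketch, using only the standard library, that flags pages with missing or duplicate meta descriptions; the URLs and descriptions are made-up examples:

```python
from html.parser import HTMLParser
from collections import defaultdict

class MetaDescriptionParser(HTMLParser):
    """Grabs the content of <meta name="description" ...>, if present."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "description":
            self.description = a.get("content")

def audit(pages):
    """pages maps URL -> raw HTML. Returns (missing_urls, duplicates),
    where duplicates maps a description to the URLs that share it."""
    seen = defaultdict(list)
    missing = []
    for url, markup in pages.items():
        parser = MetaDescriptionParser()
        parser.feed(markup)
        if not parser.description:
            missing.append(url)
        else:
            seen[parser.description].append(url)
    duplicates = {d: urls for d, urls in seen.items() if len(urls) > 1}
    return missing, duplicates

missing, duplicates = audit({
    "/":        '<meta name="description" content="Acme widgets, hand-made since 1999">',
    "/about":   '<meta name="description" content="Acme widgets, hand-made since 1999">',
    "/contact": '<title>Contact</title>',
})
print("missing:", missing)        # missing: ['/contact']
print("duplicates:", duplicates)
```

Google Search Console reports the same information for indexed pages, but a local check like this catches problems before publishing.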

6 Mistakes to Avoid When Writing Your Meta Descriptions
Search Engine Journal May 2, 2018

Even though meta description duplicates won't get you penalized, you should put together unique meta descriptions for every page for practical reasons.

According to Google:
Good meta descriptions are short blurbs that describe accurately the content of the page. They are like a pitch that convinces the user that the page is exactly what they're looking for.

Since meta descriptions act as "a pitch" in the SERPs, you can use them to control the narrative around your site. The better you deliver your message with unique copy, the better chance you have of driving more traffic to your pages and increasing CTR.

Google's Matt Cutts: Is it necessary for every page to have a meta description? (November 18, 2013)

Though Matt Cutts's video is dated 2013, it all still stands to this date.

The main take-aways are:
  1. Meta Description Duplicates are not penalized. However, Google is less likely to display duplicate descriptions in the SERPs.
  2. Google will create a snippet for pages without a meta description, and the snippet may not be exactly what you want it to be when displayed in the SERPs.
  3. Google Search Console indicates pages without meta descriptions.
  4. Meta Descriptions improve your chances of driving more traffic to your pages and increasing Click-Through Rate (CTR).
Conclusion.

Most of the tools that help with all of the above are located in Google Search Console, and many very good ones are covered in the listed articles.

If you made it this far..

Hope it Helps and Have a Great Day

P.S.
KFC is best when served hot vs. served cold? What say you?
#block #content #description #duplicate #duplicate content penalty #google #meta #mixed #penalty #policies
  • Profile picture of the author Jeffery
    Additional links that are intended to address relevant topics:


    The thread that prompted this thread. Note all of the incorrect responses from members giving advice based on faulty assumptions, and members regurgitating the same or similar faulty assumptions:
    Same Content - Multiple Sites I Own


    The thread that helps beginners setup their web host cPanel before a website is installed on a domain:
    Tips to establish an online business step-by-step.
  • Profile picture of the author KylieSweet
    Google already knows that roughly 20% of the content on the web is duplicate, so those procedures have been well established for some time.
  • Profile picture of the author Jeffery
    Another caveat that often causes confusion pertains to Duplicate Content impacting a main domain with subdomain(s).

    When content, in full or in part, is published across different domains (main domain and subdomain), the practice is classified as cross-domaining or cross-posted content. This is a normal practice among webmasters on blogs, wherein excerpts of an article are used to support a separate article on separate page(s).

    By default, per Google policy, Googlebot will not penalize the page with the cross-posted content, but it will not index the page either. There is an easy fix.

    All we have to do is use the Cross Domain Rel=Canonical. This will also maximize the SEO Value of Cross-Posted Content.

    Google Search Console simplifies the process somewhat, provided the console is working and other elements of the website are also in working order. In my experience it is best to use the Cross Domain Rel=Canonical before publishing cross-posted content, to avoid Console problems.

    Using the Cross Domain Rel=Canonical to Maximize the SEO Value of Cross-Posted Content - Whiteboard Friday.
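For reference, the cross-domain canonical is just a link element placed in the head of the duplicate (cross-posted) page, pointing back at the original. A tiny helper sketch, with hypothetical URLs:

```python
def canonical_tag(original_url: str) -> str:
    """Return the <link> element to place in the <head> of the
    cross-posted (duplicate) page, pointing at the original article."""
    return f'<link rel="canonical" href="{original_url}">'

# On blog.example.com/widgets-excerpt, point back to the full article:
print(canonical_tag("https://www.example.com/widgets"))
# <link rel="canonical" href="https://www.example.com/widgets">
```

The canonical is a hint, not a directive, so Google may still choose a different canonical URL, but it consolidates the SEO value of cross-posted content in most cases.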
  • Excellent informative article. Thanks!
  • Profile picture of the author RyGuy2019
    I have a website structure that is numbered, so each page is content-01.html, content-02.html.

    Every page is different with different stuff but Google flags my whole site as duplicate content.

    I can't figure out how to restructure my site to get more traffic. Domain has major authority, but I get nothing.

    It feels like there are a 1000 hidden penalties on my websites or something and I just can't figure out what they are.

    Any help would be awesome.
    • Profile picture of the author MikeFriedman
      Originally Posted by RyGuy2019 View Post

      I have a website structure that is numbered, so each page is content-01.html, content-02.html.

      Every page is different with different stuff but Google flags my whole site as duplicate content.

      I can't figure out how to restructure my site to get more traffic. Domain has major authority, but I get nothing.

      It feels like there are a 1000 hidden penalties on my websites or something and I just can't figure out what they are.

      Any help would be awesome.
      How do you know they flagged your whole site as duplicate content? I have never heard of them sending anyone a message like that.
Giving every page a URL that is related to the content could certainly improve your rankings. But if there are any penalties, they are more likely related to your backlink profile or things like that. Check whether there are any bad backlinks pointing to you.