How to comply with Google's quality guidelines in 2020

by Silvia Mazzetta · 09-09-2020 · SEO, optimization, content marketing

Google provides a set of guidelines on what your website's content should look like in order to appear in search results.

There are several categories within the Google guidelines:

  1. Webmaster Guidelines.
  2. General guidelines.
  3. Content-specific guidelines.
  4. Quality guidelines.

In this article we will focus on the Quality guidelines.

Quality guidelines describe techniques whose use is prohibited and which, if used, may result in your page or website not being displayed in Google search results.

Quality guidelines

  • Automatically generated content.
  • Sneaky redirects.
  • Link schemes.
  • Cloaking.
  • Hidden text and links.
  • Doorway pages.
  • Scraped content.
  • Excessive use of keywords.
  • Creation of pages with malicious behavior.
  • Guidelines for spam in user-generated content.

Everything you will find in this article is published in Quality Guidelines in Google support, but we wanted to make a summary of these specifications to help you quickly identify which techniques can negatively affect the positioning of your website.

Automatically generated content

Google aims to offer the user unique and quality content. Creating your own content is costly in terms of resources and time, so one of the easiest and most common practices is to plagiarize it or generate it automatically.

If Google detects automatically generated content, it may consider that an attempt is being made to manipulate the positioning in the search results and apply a penalty.

The texts considered automated are the following:

  • Meaningless text rich in keywords.
  • Text translated by tools in an automated manner without human review and editing.
  • Text generated by automated processes, such as Markov chains. To be clear, a Markov chain is a sequence of random variables in which each state depends only on the one before it.
  • Text obtained by applying obfuscation techniques or generated with the use of automatic synonyms.
  • Text generated by merging content from several web pages without any added value.
  • Text generated from Atom/RSS feeds or search results.
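The Markov-chain item above can be sketched in a few lines of Python. This is a toy illustration (the corpus is made up) of why such text can read fluently word-to-word yet say nothing, which is exactly why Google treats it as automated content:

```python
import random

def build_chain(words):
    """Map each word to the list of words that follow it in the corpus."""
    chain = {}
    for cur, nxt in zip(words, words[1:]):
        chain.setdefault(cur, []).append(nxt)
    return chain

def generate(chain, start, length):
    """Walk the chain, picking a random successor at each step."""
    out = [start]
    while len(out) < length and chain.get(out[-1]):
        out.append(random.choice(chain[out[-1]]))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat ran off".split()
chain = build_chain(corpus)
print(generate(chain, "the", 8))  # locally plausible, globally meaningless text
```

Each word is chosen only from words that actually followed the previous one in the source text, so short runs look grammatical while the whole carries no meaning.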

Some common techniques for generating content automatically include translating content from other languages and "scrape & spin" (copying, fragmenting and recombining text strings in a different order).

  • The text-translation technique is based on scraping content in a language other than your own, translating it and publishing it on your website.
  • The text-spinning technique aims to extract existing text from other websites and introduce syntactic variations that make it look like "new, original text". This takes a lot of manual work, since you have to create the variations yourself, although there are also tools to automate it.

Our recommendation is not to use automatic content generation methods on serious websites on which our brand or business depends. Although these actions may work momentarily, we run the risk of being penalized. Quality guidelines are the rules of the game and Google makes it clear that it does not like the automation of content at all.

Sneaky redirects

A redirection is an automatic forwarding by the server from one URL to another. There are many situations where a redirection is the best way to inform Google that a URL has changed and, in this case, it is lawful to do so. For example, when we find duplicate content in several URLs and we want to consolidate it into a single one or if a URL has changed and with the redirection we want to indicate which one is the current one.
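A legitimate redirect of the kind described above can be declared server-side. A minimal sketch, assuming an Apache server and hypothetical URLs:

```apache
# .htaccess: permanently redirect an outdated URL to its current
# location, so users and search engines consolidate on one address.
Redirect 301 /old-page.html https://www.example.com/new-page.html
```

The 301 status code tells search engines the move is permanent, which is the honest use of redirection that Google's guidelines allow.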

But there are cases in which redirections are applied with the intention of deceiving search engines, showing different content to users than to robots. These types of misleading redirects violate Google's quality guidelines and we can be penalized if a detrimental effect on the user experience is detected.

It should be noted that some developers do these redirects consciously for a purpose, but there may also be cases where misleading redirects on mobiles are done without the owners being aware, for example after an attack on the website.

A common example of a misleading redirect: imagine we perform a search and the same URL appears in the results for both mobile and desktop devices. The user clicks on the result on a desktop device and the URL opens normally; so far so good. The problem comes when the user clicks on the same result on a mobile device and, instead of landing on the expected URL, is redirected to an unrelated one.

Link Schemes

Another of the most common infringements is links whose intention is to manipulate PageRank. These types of links can negatively affect the website.

Some common examples of this attempted manipulation are as follows:

  • Buying and selling links with the aim of manipulating PageRank.
  • Sending free products to get someone to write about us, or exchanging services for links.
  • Exchanging links between portals.
  • Automated links.
  • Large-scale article marketing using keyword-rich anchor text.
  • Forcing a customer to include a followed link in exchange for a service. The most common case is when a developer includes "Developed by name of development company" in the footer or another element of the website.

The best way to get external links without being penalized by Google is to get other websites to want to link to our portal just because it contains unique, relevant, useful and quality information. Therefore, this content will quickly gain popularity by the users themselves.

Cloaking

The content displayed to users and to search engines must always be the same. If it is not, this is considered cloaking, a punishable practice that does not comply with Google's quality guidelines.

Examples of cloaking:

  • Configuring the server to display different content depending on who is requesting the page, manipulating the content or inserting additional text or keywords when a search-engine request is detected.
  • Showing a page built with JavaScript, images or Flash to human users, while showing an HTML page to search engines.

In general, these concealment techniques are found less and less often: search engines have evolved so much that they usually detect and penalize them quickly.
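The first pattern above can be sketched in a few lines. This is an illustration only, so the pattern can be recognised during an audit; deploying anything like it would violate the guidelines:

```python
def render_page(user_agent: str) -> str:
    """Illustration of cloaking: branching on the requesting user agent.

    Serving crawlers different content than human visitors violates
    Google's quality guidelines; this sketch only shows what the
    pattern looks like so it can be spotted and removed.
    """
    if "Googlebot" in user_agent:
        # Keyword-stuffed version served only to the crawler
        return "<html><body>cheap flights cheap flights cheap flights</body></html>"
    # Normal version served to human visitors
    return "<html><body>Welcome to our travel site!</body></html>"
```

Any server-side branch on crawler user agents (or known crawler IP ranges) that changes the visible content is cloaking, regardless of the language or framework used.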

Hidden text and links

Again, we find a fairly common case: hiding content and links during development. On many occasions it is not done consciously, nor is it known to be a manipulation method, but it is one and can bring negative consequences, so it is highly recommended to check whether this type of hidden content exists and to fix it.

Here are some common content-concealment techniques:

  • Using CSS to include hidden text with display:none or, for example, to position text off-screen so that users cannot see it.
  • Including white text on a white background.
  • Including text behind images.
  • Setting the font size to 0 so that the text is not displayed.
  • Hiding links in a single inconspicuous character, or hiding them with other CSS methods.
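In markup, the techniques listed above typically look like the following (shown so they can be spotted during an audit, not reproduced; the inline styles are illustrative):

```html
<!-- Off-screen text users never see -->
<p style="position:absolute; left:-9999px">keyword keyword keyword</p>

<!-- White text on a white background -->
<p style="color:#fff; background:#fff">keyword keyword keyword</p>

<!-- Zero-size font -->
<p style="font-size:0">keyword keyword keyword</p>

<!-- display:none text -->
<p style="display:none">keyword keyword keyword</p>
```

Searching templates and stylesheets for patterns like these is a quick way to find hidden content inherited from an old development or an attack.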

We must bear in mind that hidden content is not always punishable. There are exceptions usually related to accessibility improvements.

If our site uses technologies that make it difficult for search engines to crawl, such as images, JavaScript or Flash, it is advisable to add descriptive text to make this task easier.

Users can also benefit from these descriptions if for any reason they cannot view this type of content.

Examples for improving accessibility:

  • Images: add an "alt" attribute with descriptive text.
  • JavaScript: include equivalent content in a <noscript> tag.
  • Videos: include descriptive text about the video in the HTML.
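The three accessibility cases above can be sketched in HTML as follows (file names and text are placeholders):

```html
<!-- Image: descriptive alt text -->
<img src="chart.png" alt="Monthly organic traffic, January to June">

<!-- JavaScript: equivalent content for clients without scripting -->
<script src="widget.js"></script>
<noscript><p>Latest headlines, rendered as plain HTML.</p></noscript>

<!-- Video: descriptive text in plain HTML next to the player -->
<video src="tutorial.mp4" controls></video>
<p>Video tutorial: how to set up redirects in three steps.</p>
```

In each case the visible and the alternative content describe the same thing, which is what separates legitimate accessibility text from hidden-text manipulation.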

Doorway pages

Doorway pages are websites or pages created solely to rank in search engines for very specific queries, with one or more keywords. This type of landing page generally runs the risk of being penalized by Google, which does not consider them good for the user, since they produce very similar results.

These pages are usually created to channel user traffic to the website or home page and are aimed at positioning in search engines, but not at offering a quality result to users.

They are usually low quality pages that do not offer added value to users, in addition to using automated content with slight variations.

The most common use is the creation of doorway pages to try to rank services by city name.

Some examples of doorway pages:

  • Pages to channel visitors to the main, useful or relevant page of the website.
  • Pages with similar content that are closer to search results than to a clearly defined, browsable hierarchy.
  • Having several pages or domain names oriented to specific regions or cities in order to funnel users to a single page.

In order to give a recommendation, it is necessary to evaluate each case in particular. However, if we are penalized for this reason, it must be corrected as soon as possible.

Scraped content

As we discussed in the section on automatically generated content, creating content is an arduous task that requires many resources to do well. Just as automated content is penalized, so is copying content from other websites.

In addition, in this case copyright is infringed, and we can face legal action for it.

Excessive use of keywords

The excessive use of keywords or Keyword Stuffing is one of the oldest practices in SEO. Although years ago it worked, it no longer does. Nowadays, the practice of including keywords excessively in content, links, metadata, etc., fails to comply with quality guidelines and is punishable. We recommend not using this method but rather focusing on generating content that includes the right amount of keywords and synonyms, in an appropriate, natural way and in an adequate context.

Creation of pages with malicious behavior

The creation of pages that behave differently than expected by users, that harm their experience when browsing the website and with a malicious purpose, is clearly another way of violating Google's quality guidelines.

This point is much easier to understand if we look at some examples that are considered malicious behavior and which in many cases you will have suffered:

  • Installation of malicious software on your computer such as Trojans, viruses, spyware...
  • Including unwanted files in the download requested by the user.
  • Confusing the user into clicking on a button or link that does not really do the job that the user believes it does.
  • Changing search preferences or the browser's home page without having informed or obtained the user's consent.

Guidelines for user-generated spam

All the points above concerned intentional manipulation techniques applied by the website owner. Sometimes, however, it is users with bad intentions who generate spam on a quality site.

Usually this problem arises on pages that allow users to add content in some way, or that create pages for the end user.

The main cases of user-generated spam are in:

  • Spam in blog comments.
  • Fraudulent posts in forum threads.
  • Fraudulent free host accounts.

Pages full of spam give a bad impression to users. It is recommended to disable this feature if it is not useful for users or if you do not have the time to regularly monitor the posted comments.

To avoid this type of spam we recommend:

  • Activate moderation of comments and profile creation.
  • Use anti-spam tools (honeypots, reCAPTCHA).
  • Use rel="nofollow" or rel="sponsored" on links.
  • If the website allows users to create pages such as profile pages, forum threads or websites, use the noindex meta tag to block access to pages from new or untrusted users. You can also use robots.txt to block a page temporarily: Disallow:/guestbook/newpost.php
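The link and indexing measures above look like this in HTML (URLs are placeholders):

```html
<!-- On pages created by new or untrusted users: keep them out of the index -->
<meta name="robots" content="noindex">

<!-- User-submitted link: tell search engines not to pass authority -->
<a href="https://example.com" rel="nofollow">user-submitted link</a>

<!-- Paid or affiliate link -->
<a href="https://example.com" rel="sponsored">sponsored link</a>
```

With these attributes in place, spammers gain no ranking benefit from links they post, which removes most of the incentive to target the site.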

A recommendation

Although positioning a website may seem a hard, costly and long-term job, giving in to the temptation to cut corners by applying black-hat techniques, especially if you are inexperienced, can lead to search-engine penalties. It is advisable to reread Google's quality guidelines frequently to ensure that our website respects them and, above all, to stay alert to new recommendations from the search engine.






Web vector created by stories - www.freepik.com

Silvia Mazzetta

Web Developer, Blogger, Creative Thinker, Social media enthusiast, Italian expat in Spain, mom of a little 9-year-old geek, founder of @manoweb. A strong conceptual and creative thinker who has a keen interest in all things related to the Internet. A technically savvy web developer with multiple years of website design expertise behind her. She turns conceptual ideas into highly creative visual digital products.

 
 
 
