Manual actions, a useful report in Google Search Console

Let’s return to manual actions and the penalties that a site violating Google’s rules can face: the topic is a delicate one, and just a few days ago we saw how much attention the US company devotes to fighting spam in order to clean up its search results pages and offer users a positive experience.

A closer look at manual actions against a site

Guiding us through this lesson is a new episode of the Google Search Console Training series, in which Daniel Waisberg sheds light on one of the tools inside Google Search Console: today we learn more about the Manual Actions report and what to do if our site is hit by a manual action that could affect its performance, or even its presence, in the SERPs.

Google’s effort against spam and manipulations

“Google is constantly working to improve Search”, says Waisberg at the start of the video, which is why changes to the search engine’s algorithms undergo a detailed quality assessment before their official release. The algorithms are excellent at identifying spam and, in most cases, remove it from the results pages automatically.

To further improve the quality of search results, Google also examines specific sites that do not comply with its policies and webmaster quality guidelines: in these cases, a human reviewer can analyze the site and, if warranted, apply a manual action. When this happens, part or all of the site will lose ranking positions, or may even be excluded from Google search results entirely.

Google’s guide to manual actions

Types of manual action

The list of problems that can trigger a manual action includes several prohibited or manipulative techniques, such as:

  1. User-generated spam.
  2. Spammy free host.
  3. Structured data issue.
  4. Unnatural links to the site.
  5. Unnatural links from the site.
  6. Thin content with little or no added value.
  7. Cloaking and/or sneaky redirects.
  8. Pure spam.
  9. Cloaked images.
  10. Hidden text and/or keyword stuffing.
  11. AMP content mismatch.
  12. Sneaky mobile redirects.

In the video, the Googler focuses on some of the most common problems, explaining what they are and how to fix them in order to clean up the site and try to recover the lost visibility.

Pure spam

It all starts with pure spam, “what many webmasters call black hat SEO”, which covers aggressive techniques such as automatically generated gibberish content, cloaking, scraping (the unauthorized copying of content from other sites) and other shady practices. A sketch of what cloaking looks like in practice follows below.
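To make the cloaking part of this concrete, here is a minimal, purely illustrative Python (WSGI) sketch of the pattern Google penalizes: serving one page to Googlebot and a different one to human visitors based on the User-Agent header. The app, URLs and page contents are hypothetical; this shows what the violation looks like so it can be recognized, not something to deploy.

```python
# Illustrative only: this is the cloaking pattern Google penalizes,
# shown for recognition purposes, not for deployment.
from wsgiref.simple_server import make_server

def cloaking_app(environ, start_response):
    """Hypothetical WSGI app that serves different HTML to crawlers and humans."""
    user_agent = environ.get("HTTP_USER_AGENT", "").lower()
    if "googlebot" in user_agent:
        # Crawlers receive keyword-stuffed text that visitors never see
        body = b"<html><body>cheap widgets best widgets buy widgets now</body></html>"
    else:
        # Human visitors receive entirely different content
        body = b"<html><body>Welcome! Nothing about widgets here.</body></html>"
    start_response("200 OK", [("Content-Type", "text/html")])
    return [body]

if __name__ == "__main__":
    make_server("", 8000, cloaking_app).serve_forever()
```

The tell-tale sign is the branch on the crawler’s identity: the content Google indexes is never the content users actually get.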

Thin content

Google defines as “thin” the low-quality content that offers little or no added value for users. This becomes a problem when a site has a significant number of low-quality or shallow pages that offer users no substantially unique or useful content, which constitutes a policy violation.
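As a rough way to operationalize “a significant number of shallow pages”, here is a minimal Python sketch that flags candidate pages for manual review; it assumes you already have each page’s extracted visible text from a crawl, the threshold is arbitrary, and word count is only a crude proxy for quality, not how Google assesses thinness.

```python
import re

# Arbitrary illustrative cutoff; word count is a crude proxy for quality
THIN_WORD_THRESHOLD = 150

def flag_thin_pages(pages: dict[str, str]) -> list[str]:
    """pages maps URL -> extracted visible text; returns URLs worth manual review."""
    thin = []
    for url, text in pages.items():
        if len(re.findall(r"\w+", text)) < THIN_WORD_THRESHOLD:
            thin.append(url)
    return thin

# Hypothetical crawl output: the tag page gets flagged, the long article does not
sample = {
    "https://example.com/complete-guide": "A substantive in-depth article. " * 100,
    "https://example.com/tag/widgets": "widgets widgets widgets",
}
print(flag_thin_pages(sample))
```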

Issues with structured data

Manual actions for structured data issues are applied when Google finds that some of a page’s markup uses disallowed techniques, such as marking up content that is not visible to users, marking up irrelevant or misleading content, or other manipulative behaviors.
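As an illustration of the first case, marking up content users cannot see, here is a minimal Python sketch (standard library only) that extracts a page’s JSON-LD blocks and flags string values that never appear in the visible text. It is a rough heuristic, not how Google evaluates pages, and the sample HTML is hypothetical.

```python
import json
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Collects visible text and JSON-LD blocks from an HTML document."""
    def __init__(self):
        super().__init__()
        self.visible_text = []
        self.jsonld_blocks = []
        self._in_jsonld = False
        self._in_invisible = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("type") == "application/ld+json":
            self._in_jsonld = True
        elif tag in ("script", "style"):
            self._in_invisible = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._in_jsonld = False
            self._in_invisible = False

    def handle_data(self, data):
        if self._in_jsonld:
            self.jsonld_blocks.append(data)
        elif not self._in_invisible:
            self.visible_text.append(data)

def flag_invisible_markup(html: str) -> None:
    parser = PageAudit()
    parser.feed(html)
    page_text = " ".join(parser.visible_text).lower()
    for block in parser.jsonld_blocks:
        data = json.loads(block)
        if not isinstance(data, dict):
            continue  # keep the sketch simple: top-level objects only
        for key, value in data.items():
            # Skip schema keywords and URLs; flag visible-facing strings only
            if key.startswith("@") or not isinstance(value, str):
                continue
            if not value.startswith("http") and value.lower() not in page_text:
                print(f"Check {key!r}: {value!r} is marked up but not visible")

# Hypothetical page: the marked-up description never appears in the visible text
flag_invisible_markup("""
<html><body>
<script type="application/ld+json">
{"@type": "Product", "name": "Acme Widget", "description": "Best widget ever made"}
</script>
<h1>Acme Widget</h1>
</body></html>
""")
```

Here the product name passes because it appears in the `<h1>`, while the description is flagged: that gap between what the markup claims and what the page shows is exactly the kind of mismatch this manual action targets.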

How to use the Manual Actions Report in GSC

The Google