Kinds of Crawlers
Google operates several kinds of crawlers:
- Common crawlers: These crawlers are primarily responsible for indexing different kinds of content. They also support search testing tools, internal Google product teams, and AI-related work.
- User-triggered fetchers: These bots act in response to a user request, such as fetching feeds or verifying site ownership.
- Special-case crawlers: These bots handle special cases such as mobile ads webpage quality checks or push notification messages sent via Google APIs. Notably, crawlers in this category do not follow the global user agent directives in robots.txt, the ones addressed with the asterisk (*).
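To illustrate the last point, here is a minimal robots.txt sketch (the paths are hypothetical). The wildcard block applies to common crawlers such as Googlebot, but a special-case crawler like AdsBot-Google only obeys rules that name its own user agent token:

```
# Global rules: honored by common crawlers, ignored by special-case crawlers
User-agent: *
Disallow: /private/

# To restrict a special-case crawler, it must be named explicitly
User-agent: AdsBot-Google
Disallow: /drafts/
```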
Google-Safety Crawler
According to the documentation, Google-Safety is a special-case crawler used by Google's abuse-fighting processes to find malware. Unlike the other special-case crawlers, the Google-Safety crawler completely ignores all robots.txt directives.
What Google said in the new documentation
According to Google, “The Google-Safety user agent handles abuse-specific crawling, such as malware discovery for publicly posted links on Google properties. This user agent ignores robots.txt rules.”
Full user agent string for the crawler:
"Google-Safety"
You can read the new documentation for the Google-Safety user agent on the Google Search Central crawlers page, in the section devoted to special-case crawlers.