Using Robots.txt

A robots.txt file tells search engine crawlers which URLs they can (or can’t) access on your site.
It is used mainly to keep crawlers from overloading your site with requests that might affect its performance.
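
For reference, a robots.txt file is plain text made up of User-agent, Disallow, and Allow rules. The sample below is only a generic illustration; example.com and the paths are placeholders, not values specific to this guide:

```
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
```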

If you don’t have one yet, you can create it using this link.
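
If you want to double-check the rules before uploading, you can parse the file locally with Python’s standard urllib.robotparser module. This is an optional sketch, independent of the product, using placeholder rules and URLs:

```python
# Optional local check: parse robots.txt rules and confirm which URLs
# a generic crawler may fetch. The rules and URLs are placeholders.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(useragent, url) returns True if the rules allow that URL.
print(parser.can_fetch("*", "https://example.com/products"))     # True
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
```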

Once the file is ready, upload it to your Storage and choose the correct Category.

When initiating the scan, go to the “Crawler” tab in New Scan and select the robots.txt file you uploaded.

That’s it! If you don’t have any other modifiers to configure, initiate your scan.