Host had problems last week - robots.txt fetch > High fail rate last week

If your website is added to Google Search Console, you may come across the error "Host had problems last week - robots.txt fetch > High fail rate last week". How can you fix this error?


The cause of this error and how to correct it are explained below.

Step 1: - In Google Search Console, open Settings > Crawl stats > Host status. If there is a problem, you will see a red alert notice under "Host had problems last week".


Step 2: - The error here is on the robots.txt fetch: either Google cannot find a robots.txt file on the website, or the file exists but Google is unable to read it. Check whether example.com/robots.txt opens in your browser.
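The check in Step 2 can also be done from the command line. Below is a minimal sketch using only Python's standard library; "example.com" is a placeholder, so substitute your own domain. A 200 status means the file is reachable, 404 means it is missing, and None means the site could not be contacted at all.

```python
import urllib.request
import urllib.error

def robots_txt_status(domain):
    """Fetch https://<domain>/robots.txt and return (status_code, first bytes).

    Returns (None, b"") if the host could not be reached at all.
    """
    url = f"https://{domain}/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            # File is reachable; show the start of its contents.
            return resp.status, resp.read(200)
    except urllib.error.HTTPError as e:
        # Server answered, but with an error code (e.g. 404 = file missing).
        return e.code, b""
    except (urllib.error.URLError, OSError):
        # DNS failure, timeout, connection refused, etc.
        return None, b""

# Replace "example.com" with your own domain.
status, preview = robots_txt_status("example.com")
print(status, preview)
```

If this prints a 404 or the request fails, Google is seeing the same thing, which is exactly what produces the high fail rate in the report.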


Step 3: - Click on "High fail rate last week" and look at the graph. The dates where the line spikes are the times Google last tried to fetch the robots.txt file and could not find it, which is why this error appeared.

Step 4: - Whatever your website is, open your domain with /robots.txt appended at the end (for example, yourdomain.com/robots.txt). If the file is not found, create a robots.txt file containing the rules shown below.

Create a robots.txt file

User-Agent: *
Allow: /
Disallow: /cgi-bin/
Sitemap: https://example.info/rto-list/sitemap.xml
