What is a robots.txt file?
Robots.txt is a text file website administrators create to instruct web robots (typically search engine crawlers) how to crawl pages on their site. The robots.txt file is part of the Robots Exclusion Protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content up to users. The REP also includes directives like meta robots, as well as page-, subdirectory-, or site-wide instructions for how search engines should treat links (such as "follow" or "nofollow").
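To make this concrete, here is a minimal sketch of a robots.txt file; the paths and sitemap URL are hypothetical examples, not taken from any particular site:

```
# Applies to all crawlers ("*" matches any user-agent)
User-agent: *
# Block crawling of these (hypothetical) directories
Disallow: /admin/
Disallow: /tmp/
# Explicitly permit everything else
Allow: /

# Rules for one specific crawler override the "*" group for that bot
User-agent: Googlebot
Disallow: /admin/

# Optional: point crawlers at the XML sitemap (example URL)
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of the host (e.g. `https://www.example.com/robots.txt`); crawlers fetch it before crawling and apply the most specific `User-agent` group that matches them.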