The <code>robots.txt</code> file instructs web crawlers on how to crawl a website, and it is often used to keep them from indexing confidential content. As a pentester, you can examine the "disallowed" paths listed in this file and visit them directly, which can reveal sensitive information.
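To illustrate, here is a minimal sketch in Python that fetches a site's <code>robots.txt</code>, extracts the <code>Disallow</code> entries, and requests each one to see whether it is reachable. The <code>example.com</code> hostname is a placeholder; point it only at a target you are authorized to test.

<pre><code>
import urllib.error
import urllib.request
from urllib.parse import urljoin

# Placeholder target; only test hosts you are authorized to assess.
BASE_URL = "http://example.com/"


def fetch(url):
    """Return (status_code, body) for a URL, or (None, "") on a network error."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.status, resp.read().decode(errors="replace")
    except urllib.error.HTTPError as exc:
        return exc.code, ""
    except urllib.error.URLError:
        return None, ""


def disallowed_paths(robots_body):
    """Extract the paths listed after 'Disallow:' directives in robots.txt."""
    paths = []
    for line in robots_body.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow means "allow everything"
                paths.append(path)
    return paths


if __name__ == "__main__":
    status, body = fetch(urljoin(BASE_URL, "robots.txt"))
    if status != 200:
        print("robots.txt not found or unreachable")
    else:
        for path in disallowed_paths(body):
            # Request each disallowed entry and report its HTTP status code.
            code, _ = fetch(urljoin(BASE_URL, path))
            print(f"{path:&lt;30} -> {code}")
</code></pre>

Any path that returns a 200 rather than a 403 or 404 is worth a closer look, since the site's operators considered it sensitive enough to hide from crawlers.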