robots.txt
robots.txt is the conventional file that implements the Robots Exclusion Protocol (REP), standardized in RFC 9309. It is served from the root of a website at the path /robots.txt and tells web scrapers and spiders which parts of the site they are allowed to access, if any at all. It can also support SEO, for example by steering crawlers away from duplicate or low-value pages and toward a sitemap.
A sample robots.txt that blocks access for all web scrapers:

```txt
User-agent: *
Disallow: /
```
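
Rules can also be scoped to individual crawlers and combined with Allow directives and a Sitemap hint. A hypothetical sketch (the user-agent name, paths, and sitemap URL below are all illustrative, not taken from any real site):

```txt
# Rules for one specific crawler (ExampleBot is a placeholder name)
User-agent: ExampleBot
Disallow: /private/
Allow: /private/public-page.html

# Rules for all other crawlers
User-agent: *
Disallow: /tmp/

# Optional pointer to the sitemap, useful for SEO
Sitemap: https://example.com/sitemap.xml
```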
Note: the robots.txt file itself has no enforcement power; it relies on spiders to voluntarily acknowledge and respect its rules. You may still get scraped.
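
Compliance therefore happens on the crawler side. As a minimal sketch of what a polite crawler does, using Python's standard-library urllib.robotparser (the bot name "MyBot" and the example.com URLs are placeholders):

```python
# Sketch: consult a site's robots.txt before fetching a page.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://example.com/robots.txt")
robots.read()  # fetches and parses the file over the network

url = "https://example.com/private/data.html"
if robots.can_fetch("MyBot", url):
    print(f"Allowed to fetch {url}")
else:
    print(f"robots.txt disallows fetching {url}")
```

A crawler that skips this check loses nothing technically, which is exactly why robots.txt is a convention rather than an access control mechanism.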