Crawler directives, also known as robots directives or robots.txt directives, are instructions given to web crawlers (search engine bots) to regulate how they crawl a website. They are typically implemented through a robots.txt file placed in the root directory of the site, and they can specify which parts of the site should or should not be crawled, as well as other constraints such as a crawl delay.
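For illustration, a minimal robots.txt might look like the sketch below. The paths and the Googlebot group are hypothetical examples, and Crawl-delay is a non-standard directive that some major crawlers ignore.

```
# Rules for all crawlers
User-agent: *
Allow: /admin/public/     # exception to the rule below
Disallow: /admin/         # do not crawl anything else under /admin/
Crawl-delay: 10           # ask bots to wait 10 seconds between requests (not honored by all crawlers)

# Rules for one specific crawler
User-agent: Googlebot
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml
```

A crawler (or a site owner testing the file) can evaluate such rules programmatically. The following is a minimal sketch using Python's standard urllib.robotparser module; the URLs are placeholders.

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the site's robots.txt (placeholder URL).
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# Check whether a given user agent may crawl a given URL.
print(rp.can_fetch("*", "https://example.com/admin/"))          # False under the rules above
print(rp.can_fetch("*", "https://example.com/admin/public/x"))  # True under the rules above

# Read the declared crawl delay, if any (None when absent).
print(rp.crawl_delay("*"))                                      # 10 under the rules above
```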