
Today, #facebook bots (Meta-ExternalAgent, you know, the one used to train AI) took 21 days of wall-clock time and 19.9 GB of bandwidth to index the development version of one of our websites, where crawling makes no sense and is explicitly forbidden by robots.txt.
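
For reference, a minimal robots.txt sketch for a dev site like this one; the `meta-externalagent` token is the user-agent Meta documents for this crawler, but whether it actually honors it is another matter (as the numbers above suggest):

```
# Block Meta's AI-training crawler explicitly...
User-agent: meta-externalagent
Disallow: /

# ...and, on a development host, everything else too.
User-agent: *
Disallow: /
```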

Maybe it is a signal that we should spend some of _our_ time making them waste more of _theirs_. Any creative ideas?
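
One direction, purely as a sketch (the port, the delays, and the decision of which requests get routed here are all hypothetical): a tarpit endpoint that answers with a slow trickle of bytes, so a misbehaving crawler ties up its own connection for minutes at almost no cost to us.

```python
# Tarpit sketch: drip bytes very slowly to whoever keeps ignoring robots.txt.
# Assumptions: you route only unwanted crawler traffic here (e.g. via your
# reverse proxy), and the port/delays below are placeholders.
import time
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

DRIP_DELAY = 5      # seconds between bytes
DRIP_COUNT = 1000   # total bytes per request (~83 minutes if the client waits)

class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        # One byte at a time; a polite client gives up quickly, an
        # aggressive crawler keeps the connection open and waits.
        for _ in range(DRIP_COUNT):
            try:
                self.wfile.write(b" ")
                self.wfile.flush()
            except OSError:
                break  # client finally disconnected
            time.sleep(DRIP_DELAY)

    def log_message(self, fmt, *args):
        pass  # keep the console quiet

if __name__ == "__main__":
    # ThreadingHTTPServer so one tarpitted connection doesn't block the rest.
    ThreadingHTTPServer(("0.0.0.0", 8080), TarpitHandler).serve_forever()
```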