Tags: web 

Rating:

The solution is pretty simple, as this is a 25-point challenge. Well-behaved web crawlers skip directories that are disallowed in the site's robots.txt file, which makes it a classic place to hide things. Let's look there:
```
User-agent: *
Disallow: /admin/
# flag{mr_roboto}
```
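As a side note, you can verify what a crawler would actually do with this file using Python's standard `urllib.robotparser`. A minimal sketch, using the robots.txt contents from the challenge (the comment line carrying the flag is simply ignored by the parser):

```python
from urllib.robotparser import RobotFileParser

# robots.txt contents as found on the target
robots_txt = """\
User-agent: *
Disallow: /admin/
# flag{mr_roboto}
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A generic crawler may not fetch /admin/, but other paths are fine
print(rp.can_fetch("*", "/admin/"))       # False
print(rp.can_fetch("*", "/index.html"))   # True
```

Note that robots.txt only asks crawlers to stay away; nothing stops a human from reading the file (or the disallowed directory) directly.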
So the flag is sitting right there in a comment:
> flag{mr_roboto}