Tags: web 

# Robots Rule

This was one of my favorite challenges of the CTF. We're given a website. Let's have a look at it.

```
http://web5.tamuctf.com
```

![](https://raw.githubusercontent.com/shawnduong/ctf-writeups/master/2019-TAMU/images/Robots-Rule-1.png)

Due to the nature of this challenge, intuition tells me to check out `robots.txt`.

```
http://web5.tamuctf.com/robots.txt
```
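If the host is still up, the same check can be scripted with Python's `requests` (a minimal sketch, using the default User-Agent just to read the rules):

```python
#!/usr/bin/env python3
import requests

# Fetch robots.txt with the default requests User-Agent to see what it says.
print(requests.get("http://web5.tamuctf.com/robots.txt").text)
```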

![](https://raw.githubusercontent.com/shawnduong/ctf-writeups/master/2019-TAMU/images/Robots-Rule-2.png)

This is another reason why it's one of my favorite challenges.

Anyway, it looks like we need to spoof our User-Agent so the site thinks we're Googlebot. Easy enough. We just need to [pick a Googlebot User-Agent string](http://useragentstring.com/pages/useragentstring.php?name=Googlebot) and request the page with it using something like Python. Here's a little script I wrote to handle this.

```python
#!/usr/bin/env python3
import requests

# Pretend to be Googlebot by overriding the User-Agent header.
agent = {"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"}

# Request robots.txt again, this time with the spoofed header.
print(requests.get("http://web5.tamuctf.com/robots.txt", headers=agent).content)
```

![](https://raw.githubusercontent.com/shawnduong/ctf-writeups/master/2019-TAMU/images/Robots-Rule-3.png)
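As an aside, if other pages on the site were also gated on the User-Agent, a `requests.Session` would carry the spoofed header on every request. This is just a sketch; the extra paths are hypothetical and only illustrate the idea:

```python
#!/usr/bin/env python3
import requests

# A session sends the spoofed User-Agent with every request it makes.
session = requests.Session()
session.headers.update(
    {"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"}
)

# Hypothetical paths, purely for illustration.
for path in ["/robots.txt", "/"]:
    response = session.get("http://web5.tamuctf.com" + path)
    print(path, response.status_code)
```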

I'm totally not a robot.

[Original writeup](https://github.com/shawnduong/ctf-writeups/blob/master/2019-TAMU/Web/Robots-Rule.md).