Tags: web s3 aws 

Rating: 3.0

View on github: https://github.com/quintuplecs/writeups/blob/master/CSAW-CTF-Quals-2020/web-whistleblow.md
# Whistleblow
## CSAW CTF Quals 2020 | Web
#### @jeffreymeng | September 13, 2020

**Problem Statement:**

> One of your coworkers in the cloud security department sent you an
> urgent email, probably about some privacy concerns for your company.

**Hint 1:**

> Presigning is always better than postsigning

**Hint 2:**

> Isn't one of the pieces you find a folder? Look for flag.txt!

**Attachments:** (1)
File named `letter` with text:

> Hey Fellow Coworker,
>
> Heard you were coming into the Sacramento office today. I have some
> sensitive information for you to read out about company stored at
> ad586b62e3b5921bd86fe2efa4919208 once you are settled in. Make sure
> you're a valid user! Don't read it all yet since they might be
> watching. Be sure to read it once you are back in Columbus.
>
> Act quickly! All of this stuff will disappear a week from 19:53:23 on
> September 9th 2020.
>
> \- Totally Loyal Coworker

One interesting thing about this problem is that we aren't given a link. If not for the first hint, this challenge would probably be a lot harder (though the problem statement offers a clue: `cloud security`, which implies AWS or some other cloud provider).

When we google `presigning`, the first (non-dictionary) result is AWS S3, Amazon's object storage service, which stores files ("objects") in buckets.

Reading a bit about AWS S3 presigning, we learn that it is a way to generate a URL that lets whoever holds it make a GET request for an object in a bucket, without needing credentials of their own. The link is authorized by a set of query-string parameters, which include a signature and an expiration.
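For context, if you control a bucket, the AWS CLI (which we'll install later in this writeup anyway) can generate such a link; the bucket and key below are placeholders, not part of the challenge.
```
# Hypothetical example: presign a GET for an object you own, valid for one hour
aws s3 presign s3://example-bucket/example.txt --expires-in 3600
# -> https://example-bucket.s3.<region>.amazonaws.com/example.txt?X-Amz-Algorithm=...&X-Amz-Signature=...
```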

At this point, a few things in the message become clear. Sacramento and Columbus likely refer to two different AWS regions (`us-west-1` and `us-east-2`, respectively), since they are the capitals of the states those regions sit in (California and Ohio). The date given in the message (a week from 9/9/20 19:53:23) likely refers to the expiration of a presigned URL. The hex string `ad586b62e3b5921bd86fe2efa4919208` is likely the bucket ID.

We don't know what to do with the expiration date yet, but we can first try accessing the bucket. Thus, we make a GET request of the format `https://<bucket-id>.s3.<region-code>.amazonaws.com`. You could use curl, a REST client, or just enter the URL into your browser.
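For instance, with curl (the `-i` flag prints the HTTP status line and headers along with the body):
```
curl -i "https://ad586b62e3b5921bd86fe2efa4919208.s3.us-east-2.amazonaws.com/"
```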

We try the region `us-east-2` first (there's no particular reason to pick it over `us-west-1`).
Loading `https://ad586b62e3b5921bd86fe2efa4919208.s3.us-east-2.amazonaws.com/` returns an XML error response:
```
<Error>
<Code>PermanentRedirect</Code>
<Message>The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.</Message>
<Endpoint>ad586b62e3b5921bd86fe2efa4919208.s3-us-west-1.amazonaws.com</Endpoint>
<Bucket>ad586b62e3b5921bd86fe2efa4919208</Bucket>
<RequestId>8Y4REZ2Y1KDJAY0M</RequestId>
<HostId>b5cOBOl16udqCywyFe9lBMpuO5f2HOuz/qq+0Vr72FfmDRHY8ejzDP/iCdprjZArMWLxs6SgKsA=</HostId>
</Error>
```

This message tells us the bucket must be accessed through region `us-west-1`. However, GETting `ad586b62e3b5921bd86fe2efa4919208.s3-us-west-1.amazonaws.com` returns an `AccessDenied` error. At this point, we refer back to the letter, specifically the line `Make sure you're a valid user!`.

Something interesting about AWS S3 is that it has a permission setting that makes a bucket available only to `Authenticated Users`. This sounds like it would be more secure, but it actually means *any* authenticated AWS user can read the bucket, regardless of whether they belong to the AWS account that created it.
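For reference, here is a sketch of how a bucket owner could end up in that situation; the bucket name is a placeholder and the command is illustrative, not part of the challenge.
```
# The "authenticated-read" canned ACL grants READ to the global
# "Authenticated Users" group, i.e. anyone with any AWS account.
aws s3api put-bucket-acl --bucket example-bucket --acl authenticated-read
```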

At this point, it's necessary to use the AWS CLI, which can be installed [here](https://aws.amazon.com/cli/). Log in to (or create) an AWS account, create an IAM user, attach the `AmazonS3FullAccess` policy to it, and then run `aws configure` with that user's access keys.
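Roughly, the configuration step looks like this (the keys and region shown are placeholders for your own IAM user's credentials):
```
$ aws configure
AWS Access Key ID [None]: AKIA................   # your own IAM user's key
AWS Secret Access Key [None]: ..................
Default region name [None]: us-west-1
Default output format [None]: json
```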

*(It was at this point that I made my first mistake. I didn't realize I needed to attach the policy, so when I tried to use the CLI I got `Access Denied` errors. I assumed they were part of the challenge, but they were actually caused by my credentials not having permission to run S3 commands.)*

After configuring our AWS credentials, we can list the contents of the bucket with `aws s3 ls s3://ad586b62e3b5921bd86fe2efa4919208`.
This returns a large list of folders with meaningless names. To also list the files within those folders, we add the `--recursive` flag.

This returns a large list of text files nested within folders.

It's probably infeasible to look inside all of these files manually, so instead we can copy them all to our computer with `aws s3 cp s3://ad586b62e3b5921bd86fe2efa4919208 ./ctfawsfiles/ --recursive`, and then concatenate them (making sure each file ends with a newline) with `awk 1 ./ctfawsfiles/**/*.txt`, as sketched below. This gives us all of the file contents without having to open each file manually.
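Putting the steps together (the `**` glob needs zsh, or bash with `globstar` enabled):
```
aws s3 ls s3://ad586b62e3b5921bd86fe2efa4919208 --recursive                   # list every object
aws s3 cp s3://ad586b62e3b5921bd86fe2efa4919208 ./ctfawsfiles/ --recursive    # download everything
shopt -s globstar                                                             # bash only: enable ** recursive globbing
awk 1 ./ctfawsfiles/**/*.txt                                                  # dump every file, normalizing trailing newlines
```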
Most of the files contain random strings of equal length, but four files stand out.

The interesting lines are
```
s3://super-top-secret-dont-look
3560cef4b02815e7c5f95f1351c1146c8eeeb7ae0aff0adc5c106f6488db5b6b
.sorry/.for/.nothing/
AKIAQHTF3NZUTQBCUQCK
```
These look like parameters for an AWS presigned URL. This [AWS documentation page](https://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-query-string-auth.html#query-string-auth-v4-signing-example) tells us which parameters we need and how to assemble them into a presigned URL.
The first line is a bucket ID, likely the one containing the flag.
The second looks like a SHA-256 hex digest, likely the signature for the presigned URL.
The third is a folder path, though based on its name it could also be nothing.
The fourth is an AWS access key ID.

The second hint mentions that one of the pieces we find is a folder; this must be `.sorry/.for/.nothing/`. It also tells us to look for flag.txt. Thus, the flag is probably at `<bucket id>/.sorry/.for/.nothing/flag.txt`.

That gives us four pieces of information we'll need to create the presigned URL: the bucket ID, the signature, the access key ID, and the path to the flag.

However, we're still missing the date and expiration values. The letter says everything will disappear a week from `19:53:23 on September 9th 2020`. This likely means the presigned URL (and thus its date parameter) was created at that moment, and that the URL is valid for one week.
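Concretely, converting those values into the two missing parameters (assuming the time in the letter is UTC):
```
# X-Amz-Date uses the ISO 8601 basic format, in UTC:
#   September 9th 2020, 19:53:23  ->  20200909T195323Z
# X-Amz-Expires is the validity window in seconds:
#   7 days * 24 hours * 60 minutes * 60 seconds = 604800
```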

The [AWS documentation page (same as above)](https://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-query-string-auth.html#query-string-auth-v4-signing-example) handily tells us how to construct our URL.

Path: `https://super-top-secret-dont-look.s3.us-east-2.amazonaws.com/.sorry/.for/.nothing/flag.txt`

`X-Amz-Algorithm=AWS4-HMAC-SHA256`

`X-Amz-Credential=AKIAQHTF3NZUTQBCUQCK/20200909/us-east-2/s3/aws4_request`
*(built from the access key ID, the date from the letter, and the region we're told to read it from: Columbus, i.e. `us-east-2`)*

`X-Amz-Date=20200909T195323Z`
*(the same date, in the required format)*

`X-Amz-Expires=604800`
*(one week in seconds)*

`X-Amz-SignedHeaders=host`

`X-Amz-Signature=3560cef4b02815e7c5f95f1351c1146c8eeeb7ae0aff0adc5c106f6488db5b6b`

Thus, our final URL is `https://super-top-secret-dont-look.s3.us-east-2.amazonaws.com/.sorry/.for/.nothing/flag.txt?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAQHTF3NZUTQBCUQCK/20200909/us-east-2/s3/aws4_request&X-Amz-Date=20200909T195323Z&X-Amz-Expires=604800&X-Amz-SignedHeaders=host&X-Amz-Signature=3560cef4b02815e7c5f95f1351c1146c8eeeb7ae0aff0adc5c106f6488db5b6b`.
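Equivalently, here's a small sketch that assembles that URL from the recovered pieces and fetches it with curl (the quotes matter because of the `&` characters):
```
BUCKET="super-top-secret-dont-look"
KEY=".sorry/.for/.nothing/flag.txt"
ACCESS_KEY="AKIAQHTF3NZUTQBCUQCK"
SIGNATURE="3560cef4b02815e7c5f95f1351c1146c8eeeb7ae0aff0adc5c106f6488db5b6b"

URL="https://${BUCKET}.s3.us-east-2.amazonaws.com/${KEY}"
URL+="?X-Amz-Algorithm=AWS4-HMAC-SHA256"
URL+="&X-Amz-Credential=${ACCESS_KEY}/20200909/us-east-2/s3/aws4_request"
URL+="&X-Amz-Date=20200909T195323Z"
URL+="&X-Amz-Expires=604800"
URL+="&X-Amz-SignedHeaders=host"
URL+="&X-Amz-Signature=${SIGNATURE}"

curl "$URL"   # prints flag.txt if the parameters match what the signer used
```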

Requesting this URL returns a text file with a single line: the flag.
```
flag{pwn3d_th3_buck3ts}
```
