# Engraver
Engraver was a hardware challenge on [Google CTF 2022](https://capturetheflag.withgoogle.com/) that involved recovering the movements of a robotic arm engraving the flag from a PCAP dump of the commands sent to the device.

For the challenge we were given the mentioned PCAP containing USB traffic and two images of the robot arm (one of the arm idle, one taken during the engraving process).

## Taking a look at the PCAP
The provided PCAP file contains a dump of USB traffic from a single USB device. We can see the typical USB configuration packets at the beginning (the host fetching the USB descriptors), followed by a large number of URB interrupt transfers from the host with HID data attached.
![USB traffic](https://i.imgur.com/O8Jz8CF.png)
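
Before digging into the payloads, a quick scapy pass over the capture (a sketch, using the same file name as the scripts further down) can confirm that the bulk of the traffic really consists of interrupt transfers, i.e. URB function code 9 in the USBPcap header:
```python
from collections import Counter

from scapy.utils import RawPcapReader
from scapy.layers.usb import USBpcap

# Tally the URB function codes present in the capture.
# 9 == URB_FUNCTION_BULK_OR_INTERRUPT_TRANSFER, i.e. the interrupt
# transfers that carry the HID payloads we are interested in.
functions = Counter()
for pkt_data, pkt_metadata in RawPcapReader('./engraver.pcapng'):
    pkt = USBpcap(pkt_data)
    functions[pkt.function] += 1

print(functions.most_common())
```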

## Figuring out the hardware part
From the images we know that this is some sort of robotic arm, but we have no idea how it is controlled beyond the fact that it communicates over USB HID, so the first order of business is identifying the device.
![Robot arm](https://i.imgur.com/jGJaDvG.jpg)

A bit of googling for keywords like `robot arm`, `blue` and `arduino` yields results that look very similar and share a common keyword: `xArm`.
After researching the lead a bit more, xArm turns out to be a series of robot arm kits built around a common controller. Searching GitHub then turns up [control scripts](https://gist.github.com/maximecb/7fd42439e8a28b9a74a4f7db68281071) that communicate with the arm through USB HID commands.

## Recovering the instructions from PC
With a control script found, we can check whether the commands it would send match the data we have in the pcap file:
```python
def move_to(self, id, pos, time=0):
    """
    CMD_SERVO_MOVE
    0x55 0x55 len 0x03 [time_lsb time_msb, id, pos_lsb pos_msb]
    Servo position is in range [0, 1000]
    """

    t_lsb, t_msb = itos(time)
    p_lsb, p_msb = itos(pos)
    self.dev.write([0x55, 0x55, 8, 0x03, 1, t_lsb, t_msb, id, p_lsb, p_msb])
```
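
The `itos` helper isn't included in the excerpt, but from its usage and the frame layout it evidently splits a 16-bit value into its low and high bytes, presumably something like:
```python
def itos(v):
    """Split a 16-bit integer into (lsb, msb) bytes."""
    return v & 0xFF, (v >> 8) & 0xFF
```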

We can see that when commanding a servo move, the computer sends the static prefix `55 55 08 03 01` followed by information about which servo to move and the move it should make.

This lines up with the data we can see in a sample interrupt packet:
![Interrupt](https://i.imgur.com/3mTo5La.png)
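
For illustration, a minimal decoder for a single report of this layout could look like the sketch below (the example bytes are made up, not taken from the capture):
```python
def parse_servo_move(report):
    """Decode a CMD_SERVO_MOVE report: 55 55 len 03 count [t_lsb t_msb id p_lsb p_msb]."""
    assert report[0:2] == b'\x55\x55' and report[3] == 0x03
    t = report[5] | report[6] << 8
    servo = report[7]
    pos = report[8] | report[9] << 8
    return servo, t, pos

# Hypothetical report: move servo 3 to position 500 over 1000 ms
example = bytes([0x55, 0x55, 8, 0x03, 1, 0xE8, 0x03, 3, 0xF4, 0x01])
print(parse_servo_move(example))  # (3, 1000, 500)
```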

With the data matching up, we can now write a script to parse the servo movements from the pcap with [scapy](https://scapy.net/).
```python
from scapy.utils import RawPcapReader
from scapy.layers.usb import USBpcap

for (pkt_data, pkt_metadata,) in RawPcapReader('./engraver.pcapng'):
    pkt = USBpcap(pkt_data)
    if pkt.function != 9:
        continue  # Not a USB interrupt
    if pkt.dataLength == 0:
        continue  # Does not contain any data

    data = pkt_data[-pkt.dataLength:]

    length, cmd = data[2], data[3]

    if cmd == 3:  # Servo move
        t_lsb, t_msb, id, p_lsb, p_msb = data[5], data[6], data[7], data[8], data[9]
        time = t_msb << 8 | t_lsb
        pos = p_msb << 8 | p_lsb
        print(f'[{id}] T:{time:04}; POS:{pos:04}')
```

Looking at the dumped moves, we notice that while servos 1, 2 and 3 actually move around, servos 4, 5 and 6 receive the same position data every time.

From this we can assume they aren't really used, which cuts down the amount of data we need to visualize.

Another thing to note is that servo 1 only oscillates between 2300 and 2400; a slight movement like that could plausibly be toggling the laser pointer on and off.
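
Both observations are easy to double-check by collecting the distinct positions commanded per servo, e.g. with a small extension of the parse loop above (a sketch):
```python
from collections import defaultdict

from scapy.utils import RawPcapReader
from scapy.layers.usb import USBpcap

# Collect every distinct position commanded per servo id.
seen = defaultdict(set)

for pkt_data, pkt_metadata in RawPcapReader('./engraver.pcapng'):
    pkt = USBpcap(pkt_data)
    if pkt.function != 9 or pkt.dataLength == 0:
        continue

    data = pkt_data[-pkt.dataLength:]
    if data[3] != 3:  # Only CMD_SERVO_MOVE
        continue

    servo = data[7]
    pos = data[8] | data[9] << 8
    seen[servo].add(pos)

for servo in sorted(seen):
    print(servo, sorted(seen[servo])[:10], '...' if len(seen[servo]) > 10 else '')
```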

## Visualizing the data
Now that we know which servos move and how they are moving, we can try to visualize the data.

I opted for visualizing the data in the form of frames that would hopefully contain each character separately.

To separate the frames from each other we can use servo 6's position command, as it only seems to be re-sent after a couple of movements of servos 1/2/3 have already been executed.

```python
import os

from scapy.utils import RawPcapReader
from scapy.layers.usb import USBpcap
from PIL import Image, ImageDraw

# Make sure the output directory for the frames exists
os.makedirs('./imgs', exist_ok=True)

positions = {
    2: 1000,
    3: 1000,
}

img = Image.new('RGB', (1500, 1500), 'white')
imgd = ImageDraw.Draw(img)
draw = False
frame = 0

for (pkt_data, pkt_metadata,) in RawPcapReader('./engraver.pcapng'):
    pkt = USBpcap(pkt_data)
    if pkt.function != 9:
        continue  # Not a USB interrupt
    if pkt.dataLength == 0:
        continue  # Does not contain any data

    data = pkt_data[-pkt.dataLength:]

    length, cmd = data[2], data[3]

    if cmd == 3:  # Servo move
        t_lsb, t_msb, id, p_lsb, p_msb = data[5], data[6], data[7], data[8], data[9]
        time = t_msb << 8 | t_lsb
        pos = p_msb << 8 | p_lsb

        # Skip servos we don't care about
        if id not in [1, 2, 3, 6]:
            continue

        print(f'[{id}] T:{time:04}; POS:{pos:04}')

        # Determine if the laser is turned on
        if id == 1:
            draw = pos == 2400  # If the position is 2400 we're turning it on

        # If we're drawing, draw a line from the previously saved position to the new one
        if draw:
            if id == 2:
                imgd.line([
                    (positions[2], positions[3]),
                    (pos - 1000, positions[3])
                ], fill='red', width=20)
            if id == 3:
                imgd.line([
                    (positions[2], positions[3]),
                    (positions[2], pos - 1000)
                ], fill='red', width=20)

        print(positions)

        # Servo 6 was re-set so we can save the current frame and re-set the canvas
        if id == 6:
            draw = False
            # Flip the image so it's readable
            img = img.transpose(Image.FLIP_LEFT_RIGHT)
            img = img.transpose(Image.FLIP_TOP_BOTTOM)
            # Save the frame
            img.save(f'./imgs/{frame:04}.jpg')

            # Initialize a new image for the next frame
            img = Image.new('RGB', (1500, 1500), 'white')
            imgd = ImageDraw.Draw(img)
            frame += 1

        # Save the position of the servos for the future - offset for drawing purposes
        positions[id] = pos - 1000
```

This script then produces a series of images containing the movements of the robot arm:
![Generated frames](https://i.imgur.com/GGiMTbR.png)
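
To make the characters easier to read off, the saved frames can optionally be tiled into a single contact sheet; a small sketch (thumbnail size, grid width and output path are arbitrary):
```python
import glob
from PIL import Image

frames = sorted(glob.glob('./imgs/*.jpg'))
thumb, cols = 300, 10
rows = (len(frames) + cols - 1) // cols

# Paste each frame as a thumbnail into a grid
sheet = Image.new('RGB', (cols * thumb, rows * thumb), 'white')
for i, path in enumerate(frames):
    im = Image.open(path).resize((thumb, thumb))
    sheet.paste(im, ((i % cols) * thumb, (i // cols) * thumb))
sheet.save('contact_sheet.jpg')
```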

From the frames, we can read off the flag characters (some characters have parts split across two frames), remembering that the description mentioned all of them being uppercase: `CTF{6_D3GREES_OF_FR3EDOM}`

[Original writeup](https://gist.github.com/szymex73/3156e508f24591b64c6b6dd6d009fa57)