Dear AEC evaluator, thank you for agreeing to evaluate our paper's artifacts! Below, you will find the steps to execute our code and reproduce our results (wherever possible).
We used a 15-minute campus trace dataset to perform many of the evaluations in our paper (Figure 6 and Figures 9–14).
Access to our campus traces is restricted by the Institutional Review Board (IRB) and the Institutional Review Panel for the use of Administrative Data in Research (PADR).
Unfortunately, this means that we are not allowed to share these traces with anyone who is not an approved signatory on the IRB and PADR applications related to this study.
This also means that we are not allowed to store these network traces anywhere except our institutional servers.
We work around this limitation by requesting that the AEC evaluator evaluate our code against a publicly available network trace instead of the 15-minute campus trace mentioned in our paper.
While we make our P4 code public, the Tofino SDE package required to execute this code is owned by Intel, and we are not allowed to share it publicly. Instead, we have set up an Amazon AWS EC2 instance with the Tofino packages already built and installed. In this section of the README, we explain the steps to log in to the EC2 instance, execute our P4 code, and reproduce Figure 8 of our paper.
- Download the SSH key attached in the HotCRP comments. It is a text file called sigcomm22-paper67-aws-key.pem.
- Move this file to your system's .ssh directory by executing in a terminal:
mv sigcomm22-paper67-aws-key.pem ~/.ssh/
- Change the file permissions of the SSH key file to one permitted by AWS, by executing in a terminal:
chmod 400 ~/.ssh/sigcomm22-paper67-aws-key.pem
- Log into the EC2 instance by executing in a terminal:
ssh -i ~/.ssh/sigcomm22-paper67-aws-key.pem ubuntu@ec2-54-82-111-53.compute-1.amazonaws.com
You should be logged into the EC2 instance as user ubuntu and automatically placed in the directory ~/bf-sde-9.7.0.
Please reach out to us via HotCRP comments if you face any issues during login and we will try to resolve them immediately.
Since there are multiple AEC evaluators, more than one reviewer might try to execute our code at the same time. This will not work, since multiple instances of the Tofino model and the Tofino switch-driver cannot be launched simultaneously. Therefore, once logged into the EC2 instance, it is necessary to check whether other AEC evaluators are active. Please perform both of the following steps (1 and 2) to confirm that you are the only AEC evaluator active at the moment.
- Please execute the following from a terminal inside the EC2 instance:
ps -ef | grep run_tofino_model | wc -l
If the response is 1, you are the only active AEC evaluator; please proceed with step 2.
If the response is 2 or more, someone else is actively executing the Tofino processes, and you should try again later once they are finished.
If they seem to be active for too long (e.g., hours), it is possible that the evaluator forgot to kill the processes after completing their evaluation; in that case, please reach out to them via HotCRP or otherwise.
- Please execute the following from a terminal inside the EC2 instance:
ps -ef | grep run_switchd | wc -l
If the response is 1, you are the only active AEC evaluator; please proceed with Step 3.
If the response is 2 or more, someone else is actively executing the Tofino processes, and you should try again later once they are finished.
If they seem to be active for too long (e.g., hours), it is possible that the evaluator forgot to kill the processes after completing their evaluation; in that case, please reach out to them via HotCRP or otherwise.
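For convenience, both checks above can be sketched as a single script. This is only an illustrative sketch, not part of the artifact: the count_procs function is our own name, and it uses a bracketed first character in the grep pattern so that the grep process itself is excluded from the count, which is why 0 (rather than 1) means idle here.

```shell
#!/bin/bash
# Count running instances of a process by name. Bracketing the first
# character (e.g., "[r]un_switchd") stops grep from matching its own
# command line, so a count of 0 means no other evaluator is active.
count_procs() {
  pattern=$(printf '%s' "$1" | sed 's/^./[&]/')
  ps -ef | grep -c "$pattern"
}

for proc in run_tofino_model run_switchd; do
  if [ "$(count_procs "$proc")" -eq 0 ]; then
    echo "$proc: idle"
  else
    echo "$proc: in use -- please try again later"
  fi
done
```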
- We have open-sourced our Tofino prototype's P4 code in this repo. The code is in the prototype directory. We have already cloned this repo in the EC2 instance at ~/sigcomm22-paper67-artifacts, and the code is up-to-date. Feel free to execute cd ~/sigcomm22-paper67-artifacts && git pull anyway if you wish to ensure that this code is the same as the one in the GitHub repo.
- We used the 15-minute campus trace to produce Figure 6 in the paper. As mentioned before, we are not allowed to share this trace due to IRB and PADR protections. We request the reviewer to proceed to Figure 8 instead, since we can share the data for it, and it allows for the evaluation of our prototype just as Figure 6 would.
- The network trace we use to produce Figure 8 in the paper was captured inside our campus. We initiated a BGP interception attack using the PEERING testbed for this experiment, as described in the paper. This trace captures communication between a host in our campus and a remote host on the US West Coast. The attacking host was located in Amsterdam, and the attack was initiated sometime during the course of this communication. We are able to expose this trace to the AEC evaluators because it captures communication between hosts we control, and not to/from other hosts in our campus network (that data is protected by the IRB and PADR, as explained before). However, out of an abundance of caution, we do not wish to share this trace publicly, since it still captures real user traffic. We therefore place this trace locally in the EC2 instance but not in the GitHub repo. It is located at ~/sigcomm22-paper67-artifacts/pcaps/interception_attack_trace.pcap. Please refrain from sharing this trace with anyone, for the reasons stated above.
If you choose to change directories during Step 3 (to perform git pull, etc.), please change back to the SDE directory before the next step by executing cd ~/bf-sde-9.7.0.
- From inside the ~/bf-sde-9.7.0 directory, execute the following to build the Tofino prototype code:
./p4_build.sh -p ~/sigcomm22-paper67-artifacts/prototype/p4rtt_tofino1.p4
Wait for the build to finish. It may take a few minutes (usually between 1 and 4).
The next few steps will each engage a terminal, so please log in from 4 different terminals or use tmux to split your terminal.
- From inside the ~/bf-sde-9.7.0 directory, execute the following to start the Tofino model:
./run_tofino_model.sh -p p4rtt_tofino1
Wait until you see the message CLI listening on port 8000. This terminal will now be engaged; please move to the next terminal.
- From inside the ~/bf-sde-9.7.0 directory, execute the following to start the Tofino switch-driver:
./run_switchd.sh -p p4rtt_tofino1
Wait until the bfshell> shell has been activated. This terminal will now be engaged; please move to the next terminal.
- From any directory, execute the following to start capturing the outgoing packets on the switch interface veth8:
sudo tcpdump -i veth8 -w ~/sigcomm22-paper67-artifacts/output_traces/attack_rtts.pcap
The output RTT samples from our deployed P4 code will now be saved to this pcap file. This terminal will now be engaged; please move to the next terminal.
- From any directory, execute the following to replay the interception attack trace on the switch interface veth0:
sudo tcpreplay -i veth0 -p 100 ~/sigcomm22-paper67-artifacts/pcaps/interception_attack_trace.pcap
Wait until tcpreplay has finished executing.
- Once tcpreplay has exited, press Ctrl+C in the terminal where tcpdump was running. All the RTT output data is now saved in the pcap file ~/sigcomm22-paper67-artifacts/output_traces/attack_rtts.pcap.
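The capture-and-replay portion of these steps can also be sketched as a single wrapper. This is a convenience sketch only, assuming the same interfaces (veth0/veth8), replay rate, and paths as the manual steps; capture_and_replay is our own name, not part of the artifact.

```shell
#!/bin/bash
# Sketch: run the capture and the replay together, then stop the
# capture once the replay exits (equivalent to pressing Ctrl+C in
# the tcpdump terminal).
capture_and_replay() {
  in_pcap=$1
  out_pcap=$2
  sudo tcpdump -i veth8 -w "$out_pcap" &
  cap_pid=$!
  sleep 2                       # give tcpdump time to attach to veth8
  sudo tcpreplay -i veth0 -p 100 "$in_pcap"
  sudo kill -INT "$cap_pid"     # SIGINT, same effect as Ctrl+C
  wait "$cap_pid"
}
```

The manual per-terminal steps remain the authoritative procedure; this wrapper merely avoids switching terminals to stop tcpdump.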
In order to allow other AEC evaluators to execute our code without issues, please exit the running processes.
- Please press Ctrl+C on the terminal where run_tofino_model.sh was running. Wait until the process has exited.
- Performing step 1 should also kill the process on the terminal where run_switchd.sh was running. You should see the message bfshell> Receive failed. Please double-check that this is indeed the case.
- Double-check that the tcpreplay and tcpdump processes have also exited.
- Please execute the following command to generate the interception-attack-detection plot from the saved output trace file ~/sigcomm22-paper67-artifacts/output_traces/attack_rtts.pcap:
python3 ~/sigcomm22-paper67-artifacts/prototype/plot_rtt_samples_interception_attack.py ~/sigcomm22-paper67-artifacts/output_traces/attack_rtts.pcap
The resulting plot will be located at ~/sigcomm22-paper67-artifacts/plots/interception_attack_rtts.pdf.
- Download the generated plot to your local system to view it by executing:
scp -i ~/.ssh/sigcomm22-paper67-aws-key.pem ubuntu@ec2-54-82-111-53.compute-1.amazonaws.com:~/sigcomm22-paper67-artifacts/plots/interception_attack_rtts.pdf <local_system_path>
The simulation code is present in the simulations directory. This code can be used to obtain plots equivalent to Figures 9–14 by feeding pcaps/smallFlows.pcap as the input. The following steps take the evaluator through the process of generating the relevant plots from the network trace file.
- Please ensure that you are logged in to the EC2 instance. From a terminal, execute the following command to navigate to the correct directory for the simulations:
cd ~/sigcomm22-paper67-artifacts/simulations
- Preprocess the network trace by executing the following command on the terminal:
python3 preprocess_trace.py ../pcaps/smallFlows.pcap intermediate/smallFlows.pickle
The pre-processed trace is now saved in the intermediate file smallFlows.pickle. We will use this pre-processed file to run our simulations.
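As a rough illustration of what such preprocessing typically involves (an assumption on our part; the actual preprocess_trace.py may differ), packets are usually grouped by a canonical bidirectional 4-tuple so that both directions of a connection map to the same flow:

```python
# Hypothetical sketch of per-flow grouping during trace preprocessing.
# flow_key and group_by_flow are our illustrative names, not functions
# from the artifact.

def flow_key(src_ip, src_port, dst_ip, dst_port):
    """Canonical key: both directions of a connection map to one flow."""
    a, b = (src_ip, src_port), (dst_ip, dst_port)
    return (a, b) if a <= b else (b, a)

def group_by_flow(packets):
    """packets: iterable of (time, src_ip, src_port, dst_ip, dst_port).
    Returns a dict mapping each flow key to its packet timestamps."""
    flows = {}
    for t, sip, sport, dip, dport in packets:
        flows.setdefault(flow_key(sip, sport, dip, dport), []).append(t)
    return flows
```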
- Generate all the TCP RTTs from the smallFlows.pcap network trace using the tcptrace tool by executing the following command:
tcptrace -nrlZ --output_dir=intermediate/rtts ../pcaps/smallFlows.pcap > intermediate/tcptrace_nlrZ.txt
The RTT samples are written to individual files inside intermediate/rtts.
- Parse the TCP RTTs from the tcptrace output by executing the following command:
python3 aggregate_tcptrace_rtts.py
The RTT samples are now cached in Python pickle files inside intermediate.
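For context, the seq/ack matching that tcptrace-style tools use to derive RTT samples can be sketched as follows. This is a simplified illustration, not the artifact's code; real tools also handle retransmissions, reordering, and SACK.

```python
# Simplified illustration of seq/ack-based RTT sampling (not the
# artifact's code). For each outgoing data segment we remember the
# ACK number that would cover it; a later ACK reaching that number
# yields one RTT sample.

def rtt_samples(segments):
    """segments: list of (time, direction, seq, payload_len, ack_no),
    with direction 'out' or 'in'. Returns RTT samples in seconds."""
    pending = {}   # expected ack number -> earliest send timestamp
    samples = []
    for t, direction, seq, length, ack_no in segments:
        if direction == 'out' and length > 0:
            pending.setdefault(seq + length, t)
        elif direction == 'in':
            # An ACK covers every outstanding segment up to ack_no.
            for exp in sorted(k for k in pending if k <= ack_no):
                samples.append(t - pending.pop(exp))
    return samples
```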
- Run simulations for a version of Dart with infinite memory by executing:
python3 run_simulations_infinite_memory.py
- Generate plots equivalent to Figures 9–11 of the paper by executing:
python3 plot_tcptrace_infinite.py
- Download the generated plots to your local system to view them by executing:
scp -i ~/.ssh/sigcomm22-paper67-aws-key.pem ubuntu@ec2-54-82-111-53.compute-1.amazonaws.com:~/sigcomm22-paper67-artifacts/plots/figure_*_equivalent.pdf <local_system_path>
- Execute the following command to generate a figure equivalent to Figure 12 in the paper:
python3 rtt_analysis_wnwo_handshakes.py
The code reports statistics regarding connections, handshakes, and the count of RTT samples, and generates a plot equivalent to Figure 12 in the paper.
- Download the generated plot to your local system to view it by executing:
scp -i ~/.ssh/sigcomm22-paper67-aws-key.pem ubuntu@ec2-54-82-111-53.compute-1.amazonaws.com:~/sigcomm22-paper67-artifacts/plots/figure_12_equivalent.pdf <local_system_path>
Please note that in our original 15-minute campus trace, we see a high number of unsuccessful handshakes due to SYN flooding attempts. We do not see this phenomenon in the smallFlows.pcap file; the number of unsuccessful handshakes there is actually zero. Instead of reporting this on the plot, we report the number of missing handshakes, i.e., the number of connections whose handshakes were never seen because the trace capture started after these handshakes were already complete. This is only for completeness, and to show an interesting fact about smallFlows.pcap. The main point Figure 12 makes in the paper is that the percentage of handshake RTTs is sufficiently low that we can avoid collecting them without any significant penalty. The current plot shows that this holds true even for smallFlows.pcap (only ~13% of all RTTs are handshake RTTs).
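The distinction between unsuccessful and missing handshakes can be sketched as follows. This is our own illustrative classification, not the artifact's code; classify_handshake and the flag-label representation are assumptions.

```python
# Hypothetical sketch of the handshake classification described above
# (not the artifact's code). Each connection is represented as the
# ordered list of TCP flag labels observed for it in the trace.

def classify_handshake(flags):
    if flags[0] != 'SYN':
        return 'missing'        # capture began after the handshake
    if 'SYN-ACK' in flags and 'ACK' in flags:
        return 'complete'       # full three-way handshake observed
    return 'unsuccessful'       # e.g., a SYN-flood attempt
```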
- Execute the following command to perform simulations for Dart with different PT memory sizes:
python3 run_dart_simulations_batch_pt_memory.py
- Execute the following command to perform simulations for Dart with different numbers of PT stages:
python3 run_dart_simulations_batch_pt_stages.py
- Execute the following command to generate figures equivalent to Figures 13 and 14 in the paper:
python3 plot_pt_error_rate.py
- Download the generated plots to your local system to view them by executing:
scp -i ~/.ssh/sigcomm22-paper67-aws-key.pem ubuntu@ec2-54-82-111-53.compute-1.amazonaws.com:~/sigcomm22-paper67-artifacts/plots/figure_13_equivalent.pdf <local_system_path>
scp -i ~/.ssh/sigcomm22-paper67-aws-key.pem ubuntu@ec2-54-82-111-53.compute-1.amazonaws.com:~/sigcomm22-paper67-artifacts/plots/figure_14_equivalent.pdf <local_system_path>
Please clean up after yourself so that other AEC evaluators can start from scratch and perform all the above steps without needing to worry about existing files/plots. We provide the script cleanup.sh for this purpose.
- Please ensure you are in the parent git directory by executing:
cd ~/sigcomm22-paper67-artifacts
- Run the cleanup script by executing:
./cleanup.sh