Proof-of-concept implementation of the DNN-to-finite-automata translation proposed in the paper "Verifying and Interpreting Neural Networks Using Finite Automata".
This repository is a proof-of-concept implementation for translating neural networks with linear layers and ReLU activations into equivalent finite automata. In particular, it offers a set of tools to translate Binarized Neural Networks (BNNs) into Nondeterministic Finite Automata (NFAs) that capture the input-output behaviour of the BNN over all integer inputs.
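As context for the class of networks handled, the following sketch (the weights, the two-layer architecture, and all function names are made up for illustration and are not code from this repository) shows the kind of function a network with integer-weighted linear layers and ReLU activations computes on integer inputs. This is exactly the input-output relation the translated NFA has to capture:

```python
# Illustration only: a toy network with linear layers and ReLU, evaluated
# on integer inputs. All weights and the architecture are invented here.

def relu(v):
    # Component-wise ReLU: negative entries are clamped to 0.
    return [max(0, x) for x in v]

def linear(weights, bias, v):
    # weights: one row of integer weights per output neuron.
    return [sum(w * x for w, x in zip(row, v)) + b
            for row, b in zip(weights, bias)]

def toy_network(v):
    # Two linear layers with a ReLU in between.
    h = relu(linear([[1, -2], [3, 1]], [0, -1], v))
    return linear([[1, 1]], [0], h)

print(toy_network([2, -1]))  # [8]
```

The translation proposed in the paper produces an automaton that agrees with such a function on every integer input, not merely on a finite test set.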
Use the package manager pip to install the requirements:

`pip install -r requirements.txt`
The project assumes Python 3.10. After installation, you can use the project by running any of the scripts in `.\scripts\`. Further information on these scripts is given below.
A simple example script explaining our neural network format `.toynnet` can be found in `.\scripts\introductory_example.py`.
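One conceptual point worth noting: automata read finite words over a finite alphabet, so integer inputs must be serialised before an NFA can process them. A common choice in automata-based arithmetic is a least-significant-bit-first binary encoding; whether this repository uses exactly this encoding is an assumption here, and `lsbf_encode`/`lsbf_decode` are hypothetical names, not part of its API:

```python
# Illustration only: serialising non-negative integers as bit words so a
# finite automaton can read them. The exact encoding used by the project
# is defined in its code; LSBF binary is shown purely as an example.

def lsbf_encode(n, width):
    """Encode a non-negative integer as `width` bits, least significant first."""
    return [(n >> i) & 1 for i in range(width)]

def lsbf_decode(bits):
    """Inverse of lsbf_encode."""
    return sum(b << i for i, b in enumerate(bits))

print(lsbf_encode(6, 4))  # [0, 1, 1, 0]
```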
To generate the charts presented in the paper, see the script `.\scripts\generate_charts.py`.
The two parts of the benchmark presented in the paper can be generated as follows:
- Part A: The script `.\scripts\benchmark_runtime_total_states+transitions.py` generates the data indicating the exponential blowup in the proposed translation from BNN to NFA.
- Part B: The script `.\scripts\benchmark_size_reduction.py` generates data indicating the crucial effect of automata minimisation while translating BNNs into NFAs. Note that the results will differ slightly from those presented in the paper, due to the nondeterministic performance of the minimisation algorithm.
In both parts, the resulting data is printed to the console.
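Part B above hinges on automata minimisation merging equivalent states. As a rough illustration of the effect (the repository minimises NFAs with its own, heuristic procedure; the deterministic partition-refinement algorithm below is shown only to demonstrate why merging equivalent states shrinks an automaton):

```python
# Illustration only: Moore-style partition refinement minimising a small
# deterministic automaton. Not the minimisation procedure of this project.

def minimise(states, alphabet, delta, accepting):
    # Start from the accepting / non-accepting split.
    partition = [set(accepting), set(states) - set(accepting)]
    partition = [block for block in partition if block]
    changed = True
    while changed:
        changed = False
        refined = []
        for block in partition:
            # Group states by which block each symbol leads into.
            groups = {}
            for s in block:
                key = tuple(
                    next(i for i, b in enumerate(partition)
                         if delta[s][a] in b)
                    for a in alphabet)
                groups.setdefault(key, set()).add(s)
            refined.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = refined
    return partition

# A 4-state automaton over {0, 1} in which q2/q3 (and q0/q1) are equivalent.
delta = {
    "q0": {"0": "q1", "1": "q2"},
    "q1": {"0": "q1", "1": "q3"},
    "q2": {"0": "q2", "1": "q2"},
    "q3": {"0": "q3", "1": "q3"},
}
blocks = minimise(["q0", "q1", "q2", "q3"], ["0", "1"], delta, ["q2", "q3"])
print(len(blocks))  # 2: the four states collapse into two equivalence classes
```

The same principle is what keeps the translated automata manageable in Part B: without merging equivalent states, the exponential blowup measured in Part A goes unchecked.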