shreyas-kowshik / ILStrudel.jl

Code for the paper "ILStrudel: Independence-Based Learning of Structured-Decomposable Probabilistic Circuit Ensembles", accepted at the TPM Workshop, UAI 2021.


ILStrudel

Code for the paper "ILStrudel: Independence-Based Learning of Structured-Decomposable Probabilistic Circuit Ensembles", accepted at the Tractable Probabilistic Models (TPM) Workshop, UAI 2021.

Poster | Video

Instructions To Run

By default, the code uses the GPU to speed up pairwise mutual information computation.
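To illustrate the quantity being computed, here is a minimal CPU sketch of a pairwise mutual information matrix for binary data in NumPy. This is an illustrative reimplementation, not the repository's GPU code, and the function name is hypothetical:

```python
import numpy as np

def pairwise_mi(X):
    """Pairwise mutual information matrix for binary data X of shape (n_samples, n_vars)."""
    n, d = X.shape
    mi = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            # Empirical joint distribution of variables i and j
            joint = np.zeros((2, 2))
            for a in (0, 1):
                for b in (0, 1):
                    joint[a, b] = np.mean((X[:, i] == a) & (X[:, j] == b))
            # Marginals of i (row sums) and j (column sums)
            pi, pj = joint.sum(axis=1), joint.sum(axis=0)
            for a in (0, 1):
                for b in (0, 1):
                    if joint[a, b] > 0:
                        mi[i, j] += joint[a, b] * np.log(joint[a, b] / (pi[a] * pj[b]))
    return mi
```

For perfectly correlated binary variables the pairwise MI equals log 2; for independent ones it is near zero. The repository's GPU implementation vectorizes these counts rather than looping.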

Train on a single dataset:

julia example_train.jl --name [dataset_name] --run_name [exp_name] --pseudocount 1.0 --maxiter 300 --pmi_thresh 0.03 --population_size 300 --num_mine_samples 7 --mine_iterations 3 --num_mi_bags 20 --seed 63

$HOME_DIR refers to the output of homedir() in a Julia program.

This will write all outputs to $HOME_DIR/runs/$run_name/$dataset_name.

To change the $HOME_DIR path to something else, set the $LOG_DIR variable on line 62 of example_train.jl to a custom path.

To generate a summary:

julia example_summary.jl --logdir $HOME_DIR/runs/$run_name

This will generate two files runs.csv and summary.csv under $HOME_DIR/runs/$run_name.

runs.csv holds the performance of every hyperparameter configuration for each dataset.

summary.csv holds the best-performing result across all runs for each dataset.
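The reduction from runs.csv to summary.csv can be sketched as follows. This is a minimal illustration, not the repository's example_summary.jl; the column names ("dataset", "test_ll") are assumptions about the schema, not taken from the repo:

```python
import csv
import io

def summarize(runs_csv_text):
    """Keep the best row (highest test log-likelihood) per dataset.

    Column names 'dataset' and 'test_ll' are assumed for illustration.
    """
    best = {}
    for row in csv.DictReader(io.StringIO(runs_csv_text)):
        name = row["dataset"]
        if name not in best or float(row["test_ll"]) > float(best[name]["test_ll"]):
            best[name] = row
    return best

runs = (
    "dataset,pmi_thresh,test_ll\n"
    "nltcs,0.03,-6.05\n"
    "nltcs,0.05,-6.12\n"
    "msnbc,0.03,-6.28\n"
)
print(summarize(runs)["nltcs"]["test_ll"])  # prints -6.05, the better nltcs run
```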

Train on all datasets:

python train.py

This launches 20 tmux sessions running in the background, one per dataset.

Edit the parameters inside train.py to set the required experiment names and hyperparameters.
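A launcher along these lines can be sketched as below. This is not the repository's train.py; the dataset names shown (a subset of the 20), the session-naming scheme, and the flag subset are illustrative assumptions:

```python
import shlex
import subprocess

DATASETS = ["nltcs", "msnbc", "kdd"]  # illustrative subset of the 20 datasets

def tmux_commands(run_name, pmi_thresh=0.03):
    """Build one detached-tmux launch command per dataset."""
    cmds = []
    for name in DATASETS:
        train = (
            f"julia example_train.jl --name {name} "
            f"--run_name {run_name} --pmi_thresh {pmi_thresh}"
        )
        # `tmux new-session -d -s <name>` starts the run detached in the background
        cmds.append(["tmux", "new-session", "-d", "-s", f"{run_name}_{name}", train])
    return cmds

if __name__ == "__main__":
    for cmd in tmux_commands("exp1"):
        print(shlex.join(cmd))
        # subprocess.run(cmd, check=True)  # uncomment to actually launch
```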



Languages

Julia 97.8%, Python 2.2%