A Python toolbox for uncertainty quantification and sensitivity analysis tailored towards computational neuroscience.
Uncertainpy is a Python toolbox for uncertainty quantification and sensitivity analysis of computational models and the features of those models.
Uncertainpy is model independent and treats the model as a black box, so the model itself can be left unchanged. Uncertainpy implements both quasi-Monte Carlo methods and polynomial chaos expansions, using either point collocation or the pseudo-spectral method. Both polynomial chaos expansion methods support the Rosenblatt transformation to handle dependent input parameters.
Uncertainpy is feature based: when applicable, it recognizes and calculates the uncertainty in features of the model output, in addition to the model output itself. Examples of features in neuroscience are the spike timing and the shape of the action potential.
Uncertainpy is tailored towards neuroscience models, and comes with several common neuroscience models and features built in, but new models and features can easily be implemented. It should be noted that while Uncertainpy is tailored towards neuroscience, the implemented methods are general, and Uncertainpy can be used for many other types of models and features within other fields.
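As a flavour of how a custom feature can look, here is a minimal sketch. It assumes the basic feature interface described in the documentation, where a feature is a Python function that takes the model output (time and values) and returns the time and values of the feature; the chosen feature, the maximum of the model output, is only an illustration:

import numpy as np

def max_value(time, values):
    # A scalar feature: the maximum of the model output.
    # Returning None as the feature time marks it as time independent.
    return None, np.max(values)

# Custom features can then be passed to the uncertainty quantification, e.g.:
# UQ = un.UncertaintyQuantification(model=model, parameters=parameters,
#                                   features=[max_value])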
Examples of how to use Uncertainpy can be found in the examples folder as well as in the documentation. Here we show an example, found in examples/coffee_cup, where we examine the changes in temperature of a cooling coffee cup that follows Newton’s law of cooling:
dT(t)/dt = -κ (T(t) - T_env)

This equation tells how the temperature T of the coffee cup changes with time t when the cup is in an environment with temperature T_env. κ is a proportionality constant that is characteristic of the system and regulates how fast the coffee cup radiates heat to the environment. For simplicity we set the initial temperature to a fixed value, T_0 = 95 °C, and let κ and T_env be uncertain input parameters.
We start by importing the packages we use:
import uncertainpy as un
import numpy as np # For the time array
import chaospy as cp # To create distributions
from scipy.integrate import odeint # To integrate our equation
To create the model we define a Python function coffee_cup that takes the uncertain parameters kappa and T_env as input arguments. Inside this function we solve our equation by integrating it using scipy.integrate.odeint, before we return the results. The implementation of the model is:
# Create the coffee cup model function
def coffee_cup(kappa, T_env):
    # Initial temperature and time array
    time = np.linspace(0, 200, 150)  # Minutes
    T_0 = 95                         # Celsius

    # The equation describing the model
    def f(T, time, kappa, T_env):
        return -kappa*(T - T_env)

    # Solving the equation by integration
    temperature = odeint(f, T_0, time, args=(kappa, T_env))[:, 0]

    # Return time and model output
    return time, temperature
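Before setting up the uncertainty quantification, the model function can be evaluated directly for a single parameter set as a quick sanity check (the parameter values below are only illustrative):

# Evaluate the model once with example parameter values
time, temperature = coffee_cup(kappa=0.05, T_env=20)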
We could use this function directly in UncertaintyQuantification, but we would like to have labels on the axes when plotting. So we create a Model with the above run function and labels:
# Create a model from the coffee_cup function and add labels
model = un.Model(run=coffee_cup, labels=["Time (min)", "Temperature (C)"])
The next step is to define the uncertain parameters. We give the uncertain parameters in the cooling coffee cup model the following distributions:

κ ~ Uniform(0.025, 0.075)
T_env ~ Uniform(15, 25)

We use Chaospy to create the distributions, and create a parameter dictionary:
# Create the distributions
kappa_dist = cp.Uniform(0.025, 0.075)
T_env_dist = cp.Uniform(15, 25)
# Define the parameter dictionary
parameters = {"kappa": kappa_dist, "T_env": T_env_dist}
We can now calculate the uncertainty and sensitivity using polynomial chaos expansions with point collocation, which is the default option of quantify:
# Set up the uncertainty quantification
UQ = un.UncertaintyQuantification(model=model,
                                  parameters=parameters)
# Perform the uncertainty quantification using
# polynomial chaos with point collocation (by default)
data = UQ.quantify()
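The quasi-Monte Carlo method, the pseudo-spectral method, and the Rosenblatt transformation mentioned above can be selected through arguments to quantify. The argument names below follow the Uncertainpy documentation, but treat this as a sketch and check them against your installed version:

# Quasi-Monte Carlo instead of polynomial chaos
data_mc = UQ.quantify(method="mc")

# Polynomial chaos with the pseudo-spectral method
data_spectral = UQ.quantify(method="pc", pc_method="spectral")

# Rosenblatt transformation for dependent input parameters
data_rosenblatt = UQ.quantify(rosenblatt=True)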
Here is an example of how the results might look:

This plot shows the mean, variance, and 90% prediction interval (A), and the first-order Sobol indices (B), which give the sensitivity of the model to each parameter, for the cooling coffee cup model. As the mean (blue line) in A shows, the cooling gives rise to an exponential decay in the temperature towards the temperature of the environment T_env. From the sensitivity analysis (B) we see that the temperature T is most sensitive to κ early in the simulation, and to T_env towards the end of the simulation. This is as expected, since κ determines the rate of the cooling, while T_env determines the final temperature. After about 150 minutes the cooling is essentially completed, and the uncertainty in T exclusively reflects the uncertainty in T_env.
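The returned Data object also gives direct access to the statistics behind such a plot. The snippet below is a sketch that assumes the model entry is named after the coffee_cup function and that the attribute names match the Uncertainpy documentation:

# The Data object behaves like a dictionary keyed by model/feature name
coffee = data["coffee_cup"]

print(coffee.mean)         # mean temperature over time
print(coffee.variance)     # variance of the temperature
print(coffee.sobol_first)  # first-order Sobol indices for kappa and T_env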
The documentation for Uncertainpy can be found at http://uncertainpy.readthedocs.io, and the Uncertainpy paper here: Tennøe S, Halnes G and Einevoll GT (2018) Uncertainpy: A Python Toolbox for Uncertainty Quantification and Sensitivity Analysis in Computational Neuroscience. Front. Neuroinform. 12:49. doi: 10.3389/fninf.2018.00049.
Uncertainpy works with both Python 2 and 3. Uncertainpy can easily be installed using pip. The minimum install is:
pip install uncertainpy
To install all requirements you can write:
pip install uncertainpy[all]
Specific optional requirements can also be installed; see below for an explanation. Uncertainpy can also be installed by cloning the GitHub repository:
$ git clone https://github.com/simetenn/uncertainpy
$ cd /path/to/uncertainpy
$ python setup.py install
setup.py is able to install different sets of dependencies. For all options run:
$ python setup.py --help
Uncertainpy has the following dependencies:
chaospy
tqdm
h5py
multiprocess
numpy
scipy
seaborn
matplotlib
xvfbwrapper
six
SALib
These are installed with the minimum install.
xvfbwrapper requires xvfb, which can be installed with:
sudo apt-get install xvfb
Two different file formats (backends) can be chosen to save the results in. HDF5 is chosen (and installed) by default, but the Exdir format can be chosen as well. If you want to use Exdir, it can be installed with:
conda install -c cinpla exdir
Python 2.7 is currently only supported by Exdir in an experimental branch. To get the Python 2.7 version of Exdir, run the above command to install all dependencies, then clone the Exdir GitHub repository, change to the python27 branch, and install from source:
git clone https://github.com/CINPLA/exdir.git
cd exdir
git checkout python27
python setup.py install
Additionally, Uncertainpy has a few optional dependencies for specific classes of models and for features of the models.

uncertainpy.EfelFeatures requires the Python package efel, which can be installed with:
pip install uncertainpy[efel_features]
or:
pip install efel
uncertainpy.NetworkFeatures requires the Python packages elephant, neo, and quantities, which can be installed with:
pip install uncertainpy[network_features]
or:
pip install elephant neo quantities
uncertainpy.NeuronModel requires the external simulator Neuron (with Python support), which must be installed by the user.
uncertainpy.NestModel requires the external simulator Nest (with Python support), a simulator for networks of neurons. Nest must be installed by the user.
Uncertainpy comes with an extensive test suite that can be run with the test.py script. For instructions on how to use test.py, run:
$ python test.py --help
test.py requires all of Uncertainpy's dependencies in addition to click. These can be installed with pip:
pip install uncertainpy[tests]
If you use Uncertainpy in your work, please cite: Tennøe S, Halnes G and Einevoll GT (2018) Uncertainpy: A Python Toolbox for Uncertainty Quantification and Sensitivity Analysis in Computational Neuroscience. Front. Neuroinform. 12:49. doi: 10.3389/fninf.2018.00049.