Library for getting your data into HEPData
- Documentation: https://hepdata-lib.readthedocs.io
This code works with Python 3.6, 3.7, 3.8, 3.9, 3.10, 3.11 or 3.12.
It is highly recommended that you install hepdata_lib into a virtual environment:

```
python -m pip install hepdata_lib
```
Alternatively, install from conda-forge using a conda ecosystem package manager:

```
conda install --channel conda-forge hepdata-lib
```
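Once installed, a submission is put together by creating `Submission`, `Table`, and `Variable` objects and writing them out. The following minimal sketch (with made-up table and variable names and values, purely for illustration) shows the basic workflow:

```python
from hepdata_lib import Submission, Table, Variable

# Illustrative example: one table with an independent variable (mass)
# and a dependent variable (cross section), filled with made-up numbers.
submission = Submission()

table = Table("Example table")
table.description = "Cross section as a function of mass (illustrative values)."

mass = Variable("Mass", is_independent=True, is_binned=False, units="GeV")
mass.values = [100, 200, 300]

xsec = Variable("Cross section", is_independent=False, is_binned=False, units="pb")
xsec.values = [4.2, 1.3, 0.41]

table.add_variable(mass)
table.add_variable(xsec)
submission.add_table(table)

# Write out the submission files into the given directory.
submission.create_files("example_output")
```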
If you are not sure about your Python environment, please also see below how to use hepdata_lib
in a Docker or Apptainer container.
The use of Apptainer is recommended when working on typical HEP computing clusters such as CERN LXPLUS.
To use hepdata_lib, you don't even need to install it: you can use the Binder or SWAN (CERN-only) services via one of the buttons below:
You can also use the Docker image (recommended when working on a local machine):
```
docker run --rm -it -p 8888:8888 -v ${PWD}:/home/hepdata ghcr.io/hepdata/hepdata_lib:latest
```
And then point your browser to http://localhost:8888 and use the token that is printed out. The output will end up in your current working directory (`${PWD}`).
If you prefer a shell, instead run:
```
docker run --rm -it -p 8888:8888 -v ${PWD}:/home/hepdata ghcr.io/hepdata/hepdata_lib:latest bash
```
If you are on CERN LXPLUS, or anywhere else where Apptainer is available but Docker is not, you can still use the Docker image.
If CVMFS (specifically `/cvmfs/unpacked.cern.ch/`) is available:

```
export APPTAINER_CACHEDIR="/tmp/$(whoami)/apptainer"
apptainer shell -B /afs -B /eos /cvmfs/unpacked.cern.ch/ghcr.io/hepdata/hepdata_lib:latest
```
If CVMFS is not available:

```
export APPTAINER_CACHEDIR="/tmp/$(whoami)/apptainer"
apptainer shell -B /afs -B /eos docker://ghcr.io/hepdata/hepdata_lib:latest
```
Unpacking the image can take a few minutes the first time you use it. Please be patient. Both EOS and AFS should be available inside the container, and the output will end up in your current working directory.
There are a few more examples available that can be run directly using the Binder links below, or using SWAN (CERN-only, please use LCG release LCG_94 or later) and selecting the corresponding notebook manually:
- Reading in text files
- Reading in a CMS combine ntuple
- Reading in ROOT histograms
- Reading a correlation matrix
- Reading TGraph and TGraphErrors from '.C' files
- Preparing scikit-hep histograms
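For instance, the ROOT histogram case above comes down to reading the histogram with the `RootFileReader` class and filling the result into variables. Here is a rough sketch; the file name, histogram name, and the dictionary keys used are assumptions for illustration, so check the documentation for the exact return format:

```python
from hepdata_lib import RootFileReader, Table, Variable, Uncertainty

# Hypothetical input file and histogram name, used for illustration only.
reader = RootFileReader("histograms.root")
hist = reader.read_hist_1d("signal_pt")

# read_hist_1d returns a dictionary of lists; the keys used below
# ("x", "y", "dy") are the commonly used ones, see the documentation
# for the full set (bin edges, labels, ...).
pt = Variable("Transverse momentum", is_independent=True, is_binned=False, units="GeV")
pt.values = hist["x"]

yields = Variable("Events", is_independent=False, is_binned=False)
yields.values = hist["y"]

stat = Uncertainty("Statistical uncertainty", is_symmetric=True)
stat.values = hist["dy"]
yields.add_uncertainty(stat)

table = Table("Signal pT distribution")
table.add_variable(pt)
table.add_variable(yields)
```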
Make sure that you have `ROOT` in your `$PYTHONPATH` and that the `convert` command is available by adding its location to your `$PATH` if needed.
A ROOT installation is not strictly required if your input data is not in a ROOT format, for example, if your input data is provided as text files or scikit-hep/hist histograms. Most of the hepdata_lib functionality can be used without a ROOT installation; the exceptions are the `RootFileReader` and `CFileReader` classes and the other functions of the `hepdata_lib.root_utils` module.
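As a sketch of such a ROOT-free workflow, values read from a plain text file can be filled into the core classes directly; the file name and column layout below are hypothetical:

```python
import numpy as np
from hepdata_lib import Submission, Table, Variable

# Hypothetical whitespace-separated text file with two columns: x and y.
data = np.loadtxt("measurement.txt")

x = Variable("x", is_independent=True, is_binned=False)
x.values = data[:, 0].tolist()

y = Variable("y", is_independent=False, is_binned=False)
y.values = data[:, 1].tolist()

table = Table("Measurement from text file")
table.add_variable(x)
table.add_variable(y)

submission = Submission()
submission.add_table(table)
submission.create_files("output_from_text")
```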