A handy project skeleton for data analysis projects using IPython notebook. Includes boilerplate for writing external modules and testing.
Austin G. Davis-Richardson
I like the IPython/Jupyter notebook for interactive computing and data analysis, but
I find it difficult to write maintainable code with it.
This is an effort to promote writing better, re-usable code as separate Python modules, with continuous integration using Travis to verify that notebooks build without error (a form of integration testing). It is important never to commit notebooks to the master branch while they are in a failing state.
Install dependencies
# create and activate a new virtual environment
pyenv virtualenv 2.7.9 $PWD
pyenv activate $PWD
# install dependencies
pip install -r requirements.txt
# start notebook
ipython notebook
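The requirements.txt referenced above ships with the skeleton; a minimal sketch of what it might contain, based on the tools this README uses (package list and versions here are illustrative, not the actual file):

```
# interactive notebook environment
ipython
# test runner used below for doctests
nose
```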
Write functions common to all notebooks in lib/common.py to start. Use doctests for testing.
To run unit tests:
nosetests --with-doctest
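For example, a function in lib/common.py can carry its tests in its docstring (the function name and behavior here are illustrative, not part of the skeleton):

```python
def normalize(values):
    """Scale a list of numbers so they sum to 1.

    >>> normalize([1, 1, 2])
    [0.25, 0.25, 0.5]
    """
    total = float(sum(values))
    return [v / total for v in values]
```

Running nosetests with the doctest plugin enabled will discover and execute the example in the docstring alongside any regular unit tests.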
To build notebooks:
# builds HTML by default
make
# or, build notebooks in parallel
make -j
(Travis will do both of these things and fail if you commit a notebook in a non-functioning state)
make will build any and all notebooks in $PWD.
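A minimal sketch of the kind of Makefile rule that could drive this build (the skeleton's actual Makefile may differ; `ipython nbconvert` is assumed as the converter):

```makefile
# Every notebook in the current directory, and its HTML counterpart
NOTEBOOKS := $(wildcard *.ipynb)
HTML := $(NOTEBOOKS:.ipynb=.html)

# Default target: convert every notebook to HTML
all: $(HTML)

# Convert a single notebook; independent targets let `make -j` run in parallel
%.html: %.ipynb
	ipython nbconvert --to html $<
```

Because each notebook builds into its own target, `make -j` can convert them in parallel, and a notebook that errors during conversion fails the build.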
Please do! I ❤️ pull requests.
- Experiment with sending output from notebooks built by Travis (ideally on GitHub or IPyNbViewer).