fractopo is a Python module that contains tools for validating and analysing lineament and fracture trace maps (fracture networks).
- Full documentation is hosted on Read the Docs.
pip and poetry installation is only supported on Linux and macOS based operating systems. For Windows, install using (ana)conda.
For pip and poetry: omit --dev or [dev] for a regular installation. Keep them if you want to test, develop or otherwise install all development Python dependencies.
- Only supported installation method for Windows!
# Create new environment for fractopo (recommended)
conda create --name fractopo-env
conda activate fractopo-env
# Available on conda-forge channel
conda install -c conda-forge fractopo
The module is on PyPI.
# Non-development installation
pip install fractopo
Or locally for development:
git clone https://github.com/nialov/fractopo
cd fractopo
# Omit [dev] from end if you do not want installation for development
pip install --editable .[dev]
For usage:
poetry add fractopo
For development:
git clone https://github.com/nialov/fractopo --depth 1
cd fractopo
poetry install
fractopo has two main use cases:
- Validation of lineament & fracture trace data
- Analysis of lineament & fracture trace data
Validation is done to make sure the data is valid for analysis. It is crucial because the analysis cannot take into account different kinds of geometric and topological inconsistencies between the traces.
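An example of the kind of inconsistency validation targets is an undershoot, where a trace ends just short of another trace instead of snapping to it. A minimal, illustrative sketch of the idea (not fractopo's implementation; the coordinates and threshold are made up):

```python
# Illustrative sketch only: detect an "undershoot", where a trace endpoint
# lies a tiny distance away from another trace instead of touching it.
from math import dist

# Endpoint of trace A and the nearest vertex of trace B (hypothetical data)
trace_a_endpoint = (10.0, 5.0)
trace_b_vertex = (10.0005, 5.0)

snap_threshold = 0.001  # same unit as the coordinates

gap = dist(trace_a_endpoint, trace_b_vertex)
if 0 < gap < snap_threshold:
    # Within the snap threshold the traces should be treated as connected;
    # validation flags (and can optionally fix) such geometries.
    print(f"Undershoot of {gap:.4f} units: traces should snap together")
```

The real validation handles many more error types; this only shows why a snap threshold is involved.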
Reading and writing spatial filetypes is done with geopandas; see the geopandas documentation for more advanced read-write use cases.
Simple example with trace and area data in GeoPackages:
import geopandas as gpd
# Trace data is in a file `traces.gpkg` in current working directory
# Area data is in a file `areas.gpkg` in current working directory
trace_data = gpd.read_file("traces.gpkg")
area_data = gpd.read_file("areas.gpkg")
Trace and target area data can be validated for further analysis with a Validation object.
from fractopo import Validation
validation = Validation(
trace_data,
area_data,
name="mytraces",
allow_fix=True,
)
# Validation is done explicitly with `run_validation` method
validated_trace_data = validation.run_validation()
Trace validation is also accessible as a command-line script, fractopo tracevalidate, which is more straightforward to use than the Python interface. Note that all subcommands of fractopo are available by appending them after fractopo. tracevalidate always requires the target area that delineates the trace data.
# Get full up-to-date script help
fractopo tracevalidate --help
# Basic usage example:
fractopo tracevalidate /path/to/trace_data.shp /path/to/target_area.shp \
--output /path/to/validated_trace_data.shp
# Or with automatic saving to validated/ directory
fractopo tracevalidate /path/to/trace_data.shp /path/to/target_area.shp \
--summary
Trace and target area data (GeoDataFrames) are passed into a Network object, which has properties and functions for returning and visualizing different parameters and attributes of the trace data.
from fractopo import Network
# Initialize Network object and determine the topological branches and nodes
network = Network(
trace_data,
area_data,
# Give the Network a name!
name="mynetwork",
# Specify whether to determine topological branches and nodes
# (Required for almost all analysis)
determine_branches_nodes=True,
# Specify the snapping distance threshold to define when traces are
# snapped to each other
snap_threshold=0.001,
# If the target area used in digitization is a circle, the knowledge can
# be used in some analysis
circular_target_area=True,
# Analysis on traces can be done for the full inputted dataset or the
# traces can be cropped to the target area before analysis (cropping
# recommended)
truncate_traces=True,
)
# Properties are easily accessible
# e.g.
network.branch_counts
network.node_counts
# Plotting is done by plot_ -prefixed methods
network.plot_trace_lengths()
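As a rough illustration of what the branch counts represent: in the common fracture network convention, nodes are typed as I (isolated tips), Y (abutments) and X (crossings), and each branch is labelled by its two end nodes, with Y and X ends counted as connected (C). A hypothetical sketch of that labelling convention, not fractopo's actual code:

```python
# Illustrative sketch of topological branch classification (not fractopo's
# implementation): a branch is labelled by the node types at its two ends.
# Y and X nodes are "connected" (C) ends; I nodes are isolated ends.

def classify_branch(end_node_1: str, end_node_2: str) -> str:
    """Classify a branch as 'C - C', 'C - I' or 'I - I'."""
    ends = sorted(
        "C" if node in ("Y", "X") else "I" for node in (end_node_1, end_node_2)
    )
    return f"{ends[0]} - {ends[1]}"

print(classify_branch("Y", "X"))  # C - C: connected at both ends
print(classify_branch("Y", "I"))  # C - I: one connected end, one tip
print(classify_branch("I", "I"))  # I - I: an isolated branch
```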
Network analysis is also available as a command-line script, but using the Python interface (e.g. jupyter lab, ipython) is recommended when analysing Networks to have access to all available analysis and plotting methods. The command-line entrypoint is opinionated in what outputs it produces. Brief example of the command-line entrypoint:
fractopo network /path/to/trace_data.shp /path/to/area_data.shp \
--name mynetwork
# Use --help to see all up-to-date arguments and help
fractopo network --help
See the full documentation for more examples and help.
To cite this software:
- The software is introduced in https://doi.org/10.1016/j.jsg.2022.104528 and you can cite that article as a general citation:
Ovaskainen, N., Nordbäck, N., Skyttä, P. and Engström, J., 2022. A new
subsampling methodology to optimize the characterization of
two-dimensional bedrock fracture networks. Journal of Structural Geology,
p.104528.
- To cite a specific version of fractopo you can use a zenodo-provided DOI, e.g. https://doi.org/10.5281/zenodo.5957206 for version v0.2.6. See the zenodo page of fractopo for the DOI of each version: https://doi.org/10.5281/zenodo.5517485
- Breaking changes are possible and expected.
Development dependencies for fractopo include:
poetry
- Used to handle Python package dependencies.
# Use poetry run to execute poetry installed cli tools such as invoke,
# nox and pytest
poetry run <cmd>
doit
- A general task executor that is a replacement for a Makefile
- Understands task dependencies and can run tasks in parallel, even while running them in the order determined by the dependencies between tasks. E.g. requirements.txt is a requirement for running tests and therefore the task creating requirements.txt will always run before the test task.
# Tasks are defined in dodo.py
# To list doit tasks from command line
poetry run doit list
# To run all tasks in parallel (recommended before pushing and/or
# committing)
# 8 is the number of cpu cores, change as wanted
# -v 0 sets verbosity to very low. (Errors will always still be printed.)
poetry run doit -n 8 -v 0
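The dependency-ordered task running described above can be pictured with the standard library's graphlib. This only illustrates the ordering idea, not how doit works internally, and the task names besides the requirements.txt example are hypothetical:

```python
# Illustration of dependency-ordered task running (the idea behind doit),
# using only the standard library. Task names are hypothetical.
from graphlib import TopologicalSorter

# "test" depends on "create-requirements-txt", mirroring the example above;
# "docs" depending on "test" is an invented example dependency.
task_graph = {
    "test": {"create-requirements-txt"},
    "docs": {"test"},
    "create-requirements-txt": set(),
}

# static_order yields the tasks so that every dependency comes first
order = list(TopologicalSorter(task_graph).static_order())
print(order)
```

A task runner like doit additionally runs independent tasks in parallel; the ordering constraint is the part shown here.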
nox
- nox is a replacement for tox. Both are made to create reproducible Python environments for testing, making docs locally, etc.
# To list available nox sessions
# Sessions are defined in noxfile.py
poetry run nox --list
copier
- copier is a project templater. Many Python projects follow a similar framework for testing, creating documentation and the overall placement of files and configuration. copier allows creating a template project (e.g. https://github.com/nialov/nialov-py-template) which can firstly be cloned as the framework for your own package and secondly be used to pull updates from the template to your already started project.
# To pull copier update from github/nialov/nialov-py-template
poetry run copier update
pytest
- pytest is a Python test runner. It is used to run the defined tests to check that the package executes as expected. The defined tests in ./tests contain many regression tests (done with pytest-regressions) that make it almost impossible to add features to fractopo that change the results of functions and methods.
# To run tests implemented in ./tests directory and as doctests
# within project itself:
poetry run pytest
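A pytest test is a plain test_-prefixed function that uses bare assert statements. A minimal sketch with a made-up helper function (segment_length is for illustration only, not part of fractopo):

```python
# Minimal pytest-style test sketch; `segment_length` is a made-up example
# function, not part of fractopo.
from math import dist

def segment_length(start: tuple, end: tuple) -> float:
    """Return the Euclidean length of a two-point trace segment."""
    return dist(start, end)

def test_segment_length():
    # pytest collects test_-prefixed functions and reports failed asserts
    assert segment_length((0.0, 0.0), (3.0, 4.0)) == 5.0

test_segment_length()  # pytest would discover and call this automatically
```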
coverage
# To check coverage of tests
# (Implemented as nox session!)
poetry run nox --session test_pip
sphinx
- Creates documentation from the files in ./docs_src.
# To create documentation
# (Implemented as nox session!)
poetry run nox --session docs
Big thanks to all maintainers of the above packages!
Copyright © 2020, Nikolas Ovaskainen.