LSDtopotools / lsdfailtools

A collection of tools for combining satellite-derived ground motion data with slope stability models.

FORESEE landslide and ground motion tools

Python software for predicting landslide failures based on precipitation, ground motion data and groundwater pressure. The main outputs of this model are the identified failure locations, the timing of the failure, depth of failure and factor of safety. It makes use of lsdfailtools, a set of python-c++ tools that implement the landslide model from Iverson (2000).

Basic overview and workflow

To obtain failure locations and times, the following steps are needed.

  1. Download or build the docker container.

  2. Run the installation script within the container.

  3. Collect required data: a DEM of the area, a shapefile of the infrastructure corridor of interest, rainfall data (a script is provided to collect this), piezometer data (where available) and ground motion data derived from radar satellites.

  4. Run data pre-processing routines if required.

  5. Run calibration step if required.

  6. Run validation and/or prediction script.

  7. Collect model outputs.

These steps are described in full below.

Installation

These software scripts and packages are a combination of python and c++. The simplest way to use these tools is via our docker container, which has all the necessary python packages installed, and includes a startup script.

First, you need to install Docker on your machine by following the instructions at https://docs.docker.com/get-docker/ (note that this may need administrator privileges).

You then need to create a directory on your host computer that is visible to the docker container. In the instructions below, we use C:\LSDTopoTools but you can replace this with any directory you wish.

You can then install the lsdfailtools container one of two ways:

To build from DockerHub (recommended):

  1. Pull the docker container from Docker Hub: docker pull lsdtopotools/lsdtt_failtools_docker

  2. Create a folder into which to put all the files, then run:

    docker run --rm -it -v C:/LSDTopoTools:/LSDTopoTools lsdtopotools/lsdtt_failtools_docker

  3. This pulls the repository and sets up your Docker container ready for analysis.

  4. You should now be inside the Docker container at the top directory of your cloned repo, ready to begin!

  5. To escape your docker session at any point, use ctrl-d.

  6. Now you need to install the software. Run the following commands:

    python setup.py bdist_wheel
    pip install dist/XXX.whl

  7. Here XXX is the name of the wheel generated by the python line. Rather than navigating to dist/ to check the name, you can use tab autocompletion: type pip install dis then hit tab before enter.

  8. Everything is now ready to run. Note that you will need to follow these steps every time you want to use the software.

To build locally from a local Dockerfile:

  1. Download the Dockerfile from https://github.com/LSDtopotools/lsdtt_failtools_docker

  2. Place the Dockerfile in a directory then navigate there via the command line and run:

  3. docker build --tag lsdtt_failtools_docker .

After you have built the container

  1. Ensure that you have already cloned the lsdfailtools repository somewhere on your machine either using git from the command line or by downloading from the github website directly and unzipping it (https://github.com/LSDtopotools/lsdfailtools.git)

  2. Then run the command:

    sudo docker run -it -v /path/to/your/cloned/repo:/LSDTopoTools -e NASA_USERNAME="username" -e NASA_PASSWORD="password" lsdtt_failtools_docker

  3. Note: this requires an account on the NASA Earthdata website (https://urs.earthdata.nasa.gov). Create a login and password, then go to Applications > Authorized Apps > Approve More Applications and select NASA GESDISC DATA ARCHIVE. This will be used in the PRECIPITATION section for downloading precipitation data.

MODEL DATA INPUTS

You must collect the following data before the model can be run.

Piezometer data:

.csv file with piezometer readings and the information about where they are located. They must be obtained from on-site locations or purchased.

Piezometer reading example (.csv file):

ID = unique identifier

DATE = reading date

READ_NUM = progressive number of the reading with time

FF = depth of hole bottom

LIV = depth of water from ground level; when the reading is 'dry', the value 999 is used

ID,DATE,READ_NUM,FF1,LIV1,FF2,LIV2,FF3,LIV3,FF4,LIV4
7,23/10/2014,0,3.7,3.5,6.2,999,6.1,999,9.6,5.5
7,24/06/2016,1,3.5,3.5,11.9,7.4,18.5,7.8,9.8,3.7
7,15/05/2017,2,3.5,3.5,11.8,7.4,18.1,7.4,9.8,3.8
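As an illustration of how readings in this format might be parsed (a standalone sketch, not part of the package), the 'dry' code 999 can be converted to a missing value:

```python
import csv
import io

# Example rows copied from the table above
sample = """ID,DATE,READ_NUM,FF1,LIV1,FF2,LIV2,FF3,LIV3,FF4,LIV4
7,23/10/2014,0,3.7,3.5,6.2,999,6.1,999,9.6,5.5
7,24/06/2016,1,3.5,3.5,11.9,7.4,18.5,7.8,9.8,3.7
"""

readings = []
for row in csv.DictReader(io.StringIO(sample)):
    # LIV* columns hold water depth; 999 codes a 'dry' reading, stored as None
    levels = {k: (None if float(v) == 999 else float(v))
              for k, v in row.items() if k.startswith("LIV")}
    readings.append({"id": row["ID"], "date": row["DATE"], **levels})

print(readings[0])
```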

Piezometer location information example (.csv file):

ID = unique identifier

NAME = Piezometer name

PROGR_KM = KP of A16 motorway from "Autostrade"

CARR = Carriageway (East or West) from "Autostrade"

LONGITUDE, LATITUDE = coordinates in decimal degrees in WGS84

ELEVATION = elevation in meters from "Autostrade"

ID,NAME,PROGR_KM,Carr,LONGITUDE,LATITUDE,ELEVATION,LENGTH
1,PzA,89+500,E,15.145169,41.090146,440,25.6
2,Pz4,89+500,E,15.144353,41.090037,447,23.8
3,Pz23,89+500,E,15.14341,41.088143,476,29.9

Precipitation data:

Obtained from NASA's Global Precipitation Measurement mission, which is freely available online but requires the creation of a free account on their website.

If alternative data sources are to be used instead, they must be in a .csv file, with columns indicating the duration of precipitation (s) and the precipitation intensity (mm/s).

Example:

duration_s,intensity_mm_sec
86400,0
86400,2.26E-07
86400,1.99E-06
86400,8.75E-07
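Because each row gives a duration in seconds and an intensity in mm/s, the total rainfall in a record is the sum of their products. A minimal sanity check of a file in this format (a sketch, not part of the package):

```python
import csv
import io

# Example rows copied from the table above
sample = """duration_s,intensity_mm_sec
86400,0
86400,2.26E-07
86400,1.99E-06
86400,8.75E-07
"""

# Total precipitation in mm: duration (s) * intensity (mm/s), summed per row
total_mm = sum(float(r["duration_s"]) * float(r["intensity_mm_sec"])
               for r in csv.DictReader(io.StringIO(sample)))
print(round(total_mm, 3))  # ~0.267 mm over these four daily records
```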

Sentinel interferometry

Shapefile containing a time series of displacement derived from an ISBAS analysis of Sentinel-1 images. Projection: UTM Zone 33N Datum WGS84 (EPSG:32633).

The fields contained in the shapefile are:

  • EASTING, Easting coordinate in UTM Zone 33N Datum WGS84 [m]
  • NORTHING, Northing coordinate in UTM Zone 33N Datum WGS84 [m]
  • VEL, mean velocity of displacement [mm/year]
  • STDDEV_VEL, standard deviation of the mean velocity of displacement [mm/year]
  • Dyyyymmdd, displacement per date, where yyyy is the year, mm is the month and dd is the day of measure [mm]
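The per-date displacement fields can be mapped back to calendar dates by parsing the Dyyyymmdd field name (an illustrative helper, not part of the package):

```python
from datetime import datetime

def field_to_date(field_name):
    """Parse a shapefile field name like 'D20180101' into a datetime."""
    # Strip the leading 'D', then read the remainder as yyyymmdd
    return datetime.strptime(field_name[1:], "%Y%m%d")

print(field_to_date("D20180101"))  # 2018-01-01 00:00:00
```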

Cosmo-SKYMed interferometry

Shapefiles containing a time series of displacement derived from a PSP-IfSAR analysis of COSMO-SkyMed images. Ascending, descending, East-West and Vertical components are required. Projection: UTM Zone 33N Datum WGS84 (EPSG:32633).

The fields contained in the shapefile are:

  • CODE, unique measure identification
  • HEIGHT, height of the measure [m]
  • H_STDEV, standard deviation of the height [m]
  • VEL, mean velocity of displacement [mm/year]
  • V_STDEV, standard deviation of the mean velocity of displacement [mm/year]
  • ACCEL, acceleration of the time series [mm/year^2]
  • COHE, quality of the measure between 0 (low) and 1 (high)
  • Dyyyymmdd, displacement per date, where yyyy is the year, mm is the month and dd is the day of measure [mm]

Area of Interest:

Polygon shapefile outlining the area of interest from which the calibration and validation points will be sampled. Required projection: EPSG:4326.

Digital Elevation Model (DEM)

.bil raster file containing the elevation of the area of interest with coordinate system in EPSG:32633 (UTM zone 33N). In our case we use the EU-DEM 25 m, obtained freely from the Copernicus Land Monitoring Service https://www.eea.europa.eu/data-and-maps/data/copernicus-land-monitoring-service-eu-dem. Alternatively, a 10 m DEM can be found on the Tinitaly website http://tinitaly.pi.ingv.it/. Other resolutions are also accepted by the model, although resolution will affect the calibration and validation process.

Topographic slope:

.bil raster file containing the slope of the area of interest with coordinate system in EPSG:32633 (UTM zone 33N). This file can be derived from the DEM using ArcMap or open source software such as LSDTopoTools.

Road:

Line shapefile with the outline of the road of interest for the study. Required projection: EPSG:32633 (UTM zone 33N).

Monte Carlo parameters:

.csv file with the ranges of the parameters used in the Monte Carlo simulation. The first row of values corresponds to the minimum values and the second row to the maximum values.

Example file:

D_0,K_sat,Iz_over_K_steady,friction_angle,cohesion,weight_of_water,weight_of_soil,depth
0.000001,0.00000001,0.1,0.2,5000,9800,15000,0.1
0.0001,0.000001,0.8,0.5,20000,9800,25000,3
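The two rows define a uniform range per parameter. A single Monte Carlo draw from such a file could be made as follows (a sketch assuming uniform sampling; the package's own sampler may differ):

```python
import csv
import io
import random

# Example rows copied from the file above: first row minima, second row maxima
sample = """D_0,K_sat,Iz_over_K_steady,friction_angle,cohesion,weight_of_water,weight_of_soil,depth
0.000001,0.00000001,0.1,0.2,5000,9800,15000,0.1
0.0001,0.000001,0.8,0.5,20000,9800,25000,3
"""

mins, maxs = list(csv.DictReader(io.StringIO(sample)))

# One parameter set: a uniform draw between the min and max of each column
draw = {name: random.uniform(float(mins[name]), float(maxs[name]))
        for name in mins}
print(draw)
```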

Variable names and units:

  • HYDRAULIC DIFFUSIVITY (D_0): m^{2} s^{-1}
  • HYDRAULIC CONDUCTIVITY (K_sat): m s^{-1}
  • STEADY STATE (LONG TERM) WATER BALANCE Iz/Kz (Iz_over_K_steady): dimensionless
  • TANGENT OF FRICTION ANGLE (friction_angle): dimensionless
  • COHESION (cohesion): Pa
  • COLUMN WEIGHT OF THE WATER, i.e. density times gravitational acceleration (weight_of_water): m^{-2} kg s^{-2}
  • VOLUME WEIGHT OF THE SOIL, i.e. density times gravitational acceleration (weight_of_soil): m^{-2} kg s^{-2}
  • SOIL COLUMN DEPTH (depth): m

Calibration parameters:

.csv file including the number of Monte Carlo runs (Nruns), the maximum number of iterations of the Monte Carlo process (itermax), the number of points to calibrate (Num_cal), the start (StartDate) and end date (EndDate) of the timeseries which correspond to the length of the precipitation record, and the failure interval (failinterval) which is the accepted time window (in days) to simulate acceptable failure times.

Example file:

Nruns,itermax,Num_cal,StartDate,EndDate,failinterval
25,50,200,01/01/2014,31/12/2019,25
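StartDate and EndDate use dd/mm/yyyy format; the example record above spans six years, which can be checked as follows (standalone sketch):

```python
from datetime import datetime

# Dates from the example file above; failinterval is given in days
start = datetime.strptime("01/01/2014", "%d/%m/%Y")
end = datetime.strptime("31/12/2019", "%d/%m/%Y")
print((end - start).days)  # 2190 days of precipitation record
```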

MODEL DATA OUTPUTS

Calibration .csv file

(see table below for example). Contains the calibrated parameter values for the calibrated pixels (location given by row,col) as well as the modelled time of failure, the factor of safety, the depth of failure and the observed failure time.

alpha,D_0,K_sat,d,Iz_over_K_steady,friction_angle,cohesion,weight_of_water,weight_of_soil,time_of_failure,factor_of_safety,min_depth,S,Z,row,col,observed_failtime
0,0.076776527,1.25E-05,3.16E-07,3.236842105,0.635793647,0.354652987,12032.7136,9800,19356.08113,97977600,-0.74747467,0.100000001,0.07677653,547.6984,369,562,96422400
1,0.272359937,7.71E-06,1.68E-07,3.236842105,0.629424281,0.263523691,9599.602129,9800.473225,11034.71452,70243200,0.567260742,0.100000001,0.27235994,441.05658,431,648,71539200
2,0.170809358,2.23E-06,6.16E-08,3.236842105,0.232207709,0.426850153,11188.8693,9800,17405.85364,114998400,0.303287506,0.100000001,0.17080936,528.0549,437,825,114393600
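Assuming time_of_failure and observed_failtime are in seconds from the start of the record (an inference from the magnitudes above, not stated in the file itself), the model/observation mismatch of the first row can be compared against failinterval:

```python
# Values from the first calibration row above (assumed to be in seconds)
modelled_s = 97977600
observed_s = 96422400

# Difference between modelled and observed failure times, in days
diff_days = abs(modelled_s - observed_s) / 86400
print(diff_days)  # 18.0, within a 25-day failinterval
```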

Validation .csv file

(see table below for example). Contains the validated parameter values for the validated pixels (location given by row,col) as well as the modelled time of failure, the factor of safety, the depth of failure and the observed failure time.

alpha,D_0,K_sat,d,Iz_over_K_steady,friction_angle,cohesion,weight_of_water,weight_of_soil,time_of_failure,factor_of_safety,min_depth,S,Z,row,col,observed_failtime,
0.050537445,4.64E-06,2.21E-08,3.236842105,0.241673545,0.200190446,12116.30719,9800.851942,16740.39976,100224000,-0.557540894,0.100000001,0.050537445,652.6312256,4,690,16588800
0.058242787,4.64E-06,2.21E-08,3.236842105,0.241673545,0.200190446,12116.30719,9800.851942,16740.39976,100224000,-0.467391968,0.100000001,0.058242787,655.8518066,5,687,87091200
0.034425307,1.45E-05,8.70E-08,3.236842105,0.136849459,0.291813387,17356.14574,9800,18178.94782,75254400,-2.77532959,0.100000001,0.034425307,684.3273315,14,770,91756800

Calibration shapefile

Multipolygon shapefile with the calibration points from the .csv file transformed into Voronoi polygons (using Voronoi tessellation). The attributes of each polygon are the time of failure, the factor of safety and the depth of failure. Coordinate system: EPSG:4326.

Validation shapefile

Multipolygon shapefile with the validation points from the .csv file transformed into Voronoi polygons (using Voronoi tessellation). The attributes of each polygon are the time of failure, the factor of safety and the depth of failure. Coordinate system: EPSG:4326.

Analyses

The analyses can be grouped into five main steps (see sections below): processing the input data, calibrating the model, validating it, visualising the results and converting the output .csv files into polygon shapefiles.

The analysis scripts are found in the python_foresee folder.

For a full analysis one would proceed through these steps in the sequence presented below.

ALLDATA_PROCESSING:

These scripts pre-process all the input data (piezometer, precipitation, Sentinel and Cosmo-SkyMed interferometry data) for ingestion into the calibration and validation steps.

PIEZOMETERS

Data input:
  • Piezometer data: piezometer data and coordinates in .csv format.
Data output:
  • .shp with location and data of the piezometer
Instructions

Modify file_paths_piezometer.json to include paths to input and output directories. Then run the command:

python Make_shapefiles.py

PRECIPITATION

Data input:
  • Area of interest
Data output:
  • .csv file with the duration of precipitation events and the precipitation intensity.
Instructions

Please refer to the README.md file within the Precipitation folder for details of how to obtain the precipitation data. Below is an example of the command to be used:

python PPT_CMD_RUN.py --ProdTP GPM_30min --StartDate 2018-01-01 --EndDate 2018-12-31 --ProcessDir ~./mydirectory --SptSlc ~./boundary.shp --OP

COMBINED SENTINEL COSMO

Data input:
  • CosmoSkyMed InSAR data: Ascending, Descending, Vertical and EW.
  • Sentinel-1 InSAR data.
  • Topographic slope file.
Data output:
  • .bil file with the failing pixels and the time of failure from the combination of the AD and the EWV components of Cosmo SkyMed and Sentinel-1 data.
Instructions

Modify file_paths_combined_sentinel_cosmo.json to include paths to input and output directories. Then run the command:

python Process_combo.py

Important note on ground motion data

NOTE: If the user does not require the Sentinel and Cosmo-SkyMed data to be combined (as in COMBINED SENTINEL COSMO) and only one of the two data sources is to be included in the calibration and validation process, follow the InSAR_SENTINEL or InSAR_CSK instructions accordingly. That output will substitute the corresponding data inputs wherever InSAR data is required in the Calibration, Validation or Visualisation processes.

InSAR_SENTINEL

Data input:
  • Sentinel-1 InSAR data
  • DEM file
Data output:
  • .bil file with the failing pixels and the time of failure.
Instructions

Modify file_paths_insar_sentinel.json to include paths to input and output directories. Then run the command:

python Process_sentinel.py

InSAR_CSK

Data input:
  • CosmoSkyMed InSAR data: Ascending (A), Descending (D), Vertical (V) and East-West (EW).
  • DEM file of the area of interest.
Data output:
  • .bil file with the failing pixels and the time of failure from the combination of the AD and the EWV components.
Instructions

Modify file_paths_insar_csk.json to include paths to input and output directories. Then run the command:

python Process_insar_AD.py
python Process_insar_EWV.py
python Combine_insar.py

CALIBRATION

Data input

  • Calibration parameters
  • Monte Carlo parameters
  • Ground Motion InSAR Failure data (.bil format). This is the output from Process_combo.py.
  • DEM
  • Topographic slope
  • Road file
  • Piezometer data
  • Rainfall Intensity data

Data output:

  • .csv file with the observed and the modelled time of failure, the pixel positions, the factor of safety and the depth of failure. It also includes the chosen parameter values for each point.

Instructions

Modify file_paths_calibration.json to include paths to input and output directories. Then run the command:

python Run_calibration.py

VALIDATION

Data input:

  • Calibration parameters
  • Monte Carlo parameters
  • Ground Motion InSAR Failure data (.bil format). This is the output from Process_combo.py.
  • DEM
  • Topographic slope: .bil file of the slope values in the area of interest with EPSG:32633
  • Road file
  • Piezometer
  • Calibrated points (.csv file). This is the output from Run_calibration.py.
  • Rainfall Intensity

Data output:

  • .csv file with the observed and the modelled time of failure, the pixel positions, the factor of safety and the depth of failure. It also includes the chosen parameter values for each point.

Instructions

Modify file_paths_validation.json to include paths to input and output directories. Then run the command:

python Run_validation.py

VISUALISATION

Data input:

  • Ground Motion InSAR Failure data (.bil format). This is the output from Process_combo.py.
  • DEM
  • Topographic slope
  • Road file
  • Calibrated points: .csv file. This is the output from Run_calibration.py
  • Validated points: .csv file. This is the output from Run_validation.py
  • Rainfall Intensity
  • Calibration parameters

Data output:

  • Map of calibrated points
  • Distribution of failure times, showing calibration and validation points as well as the precipitation record.
  • Plots showing the distribution of parameters with respect to height and elevation.
  • Map of the validated and calibrated points. The points indicate where failure happens and whether it was predicted before, after or within a 25-day window of the observed failure.
  • Zoomed-in version of the map detailed above.
  • Map of validated points with a colourbar indicating the exact number of days between the observed failure events and the modelled failure event.
  • Plot of the rainfall data as a function of time.
  • Plot of the rainfall data along with the calibrated failure points as a function of time.
  • Probability density function and histogram of the temporal distribution of failures both for the modelled and the observed failure events.
  • Violin plot with the temporal distribution of modelled failures split into time intervals (violins) with respect to observed failure time distribution.

Instructions

Modify file_paths_visualisation.json to include paths to input and output directories.

Then run the commands:

python Final_outputs_visualisation.py
python map_validation.py
python map_validation_zoom.py
python map_validation_colourbar.py

VORONOI TESSELLATION

Data input:

  • Ground Motion InSAR Failure data (.bil format). This is the output from Process_combo.py.
  • DEM
  • Area of interest (.shp) in EPSG 4326
  • Calibrated points: .csv file. This is the output from Run_calibration.py
  • Validated points: .csv file. This is the output from Run_validation.py

Data output:

  • Shapefile with Voronoi polygons of the points obtained from the calibration and/or the validation phases.

Instructions

To convert the .csv output from the validation (or calibration) step into a point shapefile, each attribute to be included must be processed separately. In our case the attributes are time_of_failure, factor_of_safety and depth, which correspond to particular columns in the .csv file. The following command must therefore be run three times. Each time, update the file_paths_visualisation.json file with the attribute of interest and its column number in the corresponding .csv file (note this is 0-indexed), as well as with the appropriate input file (input_file_voronoi) and data source (either calibration or validation).

python convert_csv_to_shapefile.py

To convert from point shapefiles to a multipolygon shapefile using Voronoi tessellation, run the command below. This assumes that the attributes are time_of_failure, factor_of_safety and depth, and combines the files for all three attributes into a single shapefile.

python voronoi_with_attributes.py
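Conceptually, a Voronoi polygon assigns every location the attributes of its nearest calibrated (or validated) point. A minimal illustration of that rule, with hypothetical points (purely a sketch; the actual scripts build polygon geometries):

```python
# Hypothetical seed points with one attribute each, for illustration only
points = [
    {"x": 0.0, "y": 0.0, "time_of_failure": 100},
    {"x": 10.0, "y": 0.0, "time_of_failure": 200},
]

def nearest_point(x, y, pts):
    """Return the seed point whose Voronoi cell contains (x, y)."""
    # Squared Euclidean distance suffices for choosing the minimum
    return min(pts, key=lambda p: (p["x"] - x) ** 2 + (p["y"] - y) ** 2)

print(nearest_point(2.0, 1.0, points)["time_of_failure"])  # 100
```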

Authors

Simon M. Mudd - University of Edinburgh

Boris Gailleton - University of Edinburgh - GFZ Potsdam

Guillaume "Will" Goodwin - University of Edinburgh - University of Padova

Marina Ruiz Sanchez-Oro - University of Edinburgh

Louis Kinnear - University of Edinburgh
