Home
Welcome to the GeoFabrics wiki!
The GeoFabrics package and associated sub-packages include routines and classes for combining point (e.g. LiDAR), vector (e.g. catchment of interest, infrastructure) and raster (e.g. reference DEM) data to generate a hydrologically conditioned raster. This work was initiated under the Endeavour-funded Mā te haumaru ō te wai (Increasing flood resilience across Aotearoa) project.
Support has been added for distributed computing environments using Dask, so that DEM generation can be spread across many processors.
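The idea behind this can be illustrated with a tiny, generic Dask fragment (this is not GeoFabrics code, just a sketch of the chunked-parallelism pattern): operations on a chunked array build a lazy task graph, which is only executed, spread across the available workers, when `compute()` is called.

```python
# A minimal, generic illustration of Dask's chunked parallelism (not
# GeoFabrics internals): a gridded array is split into chunks so each
# chunk can be processed on a separate core.
import dask.array as da

# A hypothetical 1000 x 1000 elevation grid in 250 x 250 chunks.
elevation = da.ones((1000, 1000), chunks=(250, 250))

# Operations build a lazy task graph; compute() runs it in parallel.
print(elevation.mean().compute())  # 1.0
```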
GeoFabrics is a Python library. Black is used to ensure strict adherence to PEP8 standards and a line length of 88 characters.
GeoFabrics utilises the geoapis Python package for downloading publicly available geospatial data. geoapis is currently maintained by the same people as GeoFabrics and was split out of the GeoFabrics repository.
Sphinx combined with GitHub Pages is used to create web-hosted documentation from the docstrings embedded in the source code: https://rosepearson.github.io/GeoFabrics/
Documentation specific to general usage, installation, testing and contributing to GeoFabrics can be found in the wiki pages. See the sidebar for a listing of all pages.
The following diagram shows the package module and class structure. Inheritance is marked with coloured connections, and composition (classes included in other classes) is indicated with arrows. The `processor` module contains pipelines that generate DEMs or other outputs based on the contents of instruction files. The `geometry`, `dem` and `bathymetry_estimation` modules contain classes that support these generation pipelines.
The core DEM generation processing chain in GeoFabrics is contained in the `processor` module. The code flow is controlled within the `run` routines of the various pipeline classes (all inherited from the same abstract `BaseProcessor` class) based on the contents of a JSON instruction file passed in at construction. The instruction file (see the Instruction file wiki page) specifies the data sources to use during DEM generation as well as other code-flow logic. Dask is used to parallelise the generation of a hydrologically conditioned DEM from LiDAR across many CPU cores; more details can be found under Performance and benchmarking.
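The pipeline pattern described above can be sketched as follows. Note that the class names, instruction keys and `run` body here are illustrative assumptions, not GeoFabrics' actual API:

```python
# Minimal sketch of the pipeline pattern: each pipeline class inherits
# from an abstract base class and implements a run() routine driven by
# a JSON instruction file parsed at construction. Names are hypothetical.
import abc
import json


class BaseProcessor(abc.ABC):
    """Shared construction logic: parse the instruction file once."""

    def __init__(self, instruction_path: str):
        with open(instruction_path) as handle:
            self.instructions = json.load(handle)

    @abc.abstractmethod
    def run(self) -> None:
        """Each pipeline implements its own generation logic."""


class RawDemPipeline(BaseProcessor):
    def run(self) -> None:
        # Code flow is controlled by the instruction contents.
        resolution = self.instructions["output"]["resolution"]
        print(f"Generating raw DEM at {resolution} m resolution")
```

A caller would write an instruction file and then invoke, for example, `RawDemPipeline("instructions.json").run()`.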
The `main.py` script is used as an entry point to the library. It determines which processor class(es) to run based on the instruction file contents. See the Instruction file page for details on how `main.py` selects which processor classes to run, and the Basic usage page for details on how to run `main.py`.
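The dispatch idea described above can be sketched in a few lines. The instruction keys and the mapping below are hypothetical, not GeoFabrics' actual selection logic:

```python
# Illustrative sketch of entry-point dispatch: inspect the top-level
# sections of the instruction contents and decide which processor
# classes to run. Keys and names are assumptions for illustration.
def select_pipelines(instructions: dict) -> list:
    available = {
        "raw": "RawLidarDemGenerator",
        "rivers": "RiverBathymetryGenerator",
        "drains": "DrainBathymetryGenerator",
        "dem": "HydrologicDemGenerator",
        "roughness": "RoughnessGenerator",
    }
    return [name for key, name in available.items() if key in instructions]


print(select_pipelines({"raw": {}, "dem": {}}))
# ['RawLidarDemGenerator', 'HydrologicDemGenerator']
```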
Each pipeline class supports slightly different functionality, as described below.
- The `RawLidarDemGenerator` class - Constructs a raw DEM from LiDAR tiles, filtering the LiDAR based on the specified ground classifications. This is computationally intensive.
- The `RiverBathymetryGenerator` class - Estimates river bathymetry depths from river flows, slopes, frictions and widths along a main channel, where the channel width and slope are estimated from a DEM generated from LiDAR. More details at River Bathymetry estimation.
- The `DrainBathymetryGenerator` class - Estimates waterway bed elevations from OpenStreetMap (OSM). Drains and streams are considered, while rivers are ignored (these are handled by the `RiverBathymetryGenerator` class). In the case of open waterways (i.e. no `tunnel` tag), elevations are not allowed to increase down slope. In the case of closed waterways (i.e. with a `tunnel` tag), the lowest elevation in the area is taken as the bed elevation.
- The `HydrologicDemGenerator` class - Constructs a hydrologically conditioned DEM from a raw DEM created by the `RawLidarDemGenerator` class and the specified ocean (from LINZ), river (from `RiverBathymetryGenerator`) and drain (from `DrainBathymetryGenerator`) bathymetry information.
- The `RoughnessGenerator` class - Adds a roughness length layer to a hydrologically conditioned DEM, where the roughness length is related to the mean height and standard deviation of the ground and ground-cover LiDAR points in each grid cell. More details at Roughness length estimation.
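The open- versus closed-waterway conditioning rules described above can be sketched with plain NumPy. This is only an illustration of the two rules, the function names are assumptions, and the real implementation lives in GeoFabrics itself:

```python
# Sketch of the two waterway conditioning rules: for an open waterway,
# elevations may not increase moving downstream; for a closed waterway
# (a tunnel), the whole reach takes the lowest elevation in the area.
import numpy as np


def condition_open_waterway(profile: np.ndarray) -> np.ndarray:
    """Enforce non-increasing elevations along a downstream profile."""
    return np.minimum.accumulate(profile)


def condition_closed_waterway(profile: np.ndarray) -> np.ndarray:
    """Assign the minimum elevation to the whole (tunnelled) reach."""
    return np.full_like(profile, profile.min())


downstream = np.array([5.0, 4.0, 4.5, 3.0, 3.2])
print(condition_open_waterway(downstream))    # [5. 4. 4. 3. 3.]
print(condition_closed_waterway(downstream))  # [3. 3. 3. 3. 3.]
```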
Basic instructions for running these processor pipelines are outlined in the Basic Usage Instructions page.