
Welcome to the GeoFabrics wiki!

Introduction

The GeoFabrics package and associated sub-packages include routines and classes for combining point data (e.g. LiDAR), vector data (e.g. catchment of interest, infrastructure) and raster data (e.g. a reference DEM) to generate a hydrologically conditioned raster. This work was initiated under the Endeavour-funded Mā te haumaru ō te wai (Increasing flood resilience across Aotearoa) project.

Support has been added for distributed computing environments using Dask, so that DEM generation can be spread across many processors.
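
As an illustration of how Dask spreads work across processors, the snippet below starts a local Dask cluster with several worker processes. This is generic dask.distributed usage, not GeoFabrics-specific configuration, and the worker counts are arbitrary examples.

```python
# Generic dask.distributed usage (not GeoFabrics-specific configuration):
# start a local cluster so computation can be spread across several processes.
from dask.distributed import Client, LocalCluster

if __name__ == "__main__":
    cluster = LocalCluster(n_workers=4, threads_per_worker=1)  # example sizes
    client = Client(cluster)
    print(client)  # summary of workers, threads and memory
    client.close()
    cluster.close()
```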

GeoFabrics is a Python library. Black is used to ensure strict adherence to PEP8 standards.

[Figure: DEM generation workflow]

Fetching geospatial data from web APIs

GeoFabrics utilises the geoapis Python package for downloading publicly available geospatial data. geoapis is currently maintained by the same people as GeoFabrics and was split out of the GeoFabrics repository into its own package.
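
The sketch below shows roughly how geoapis might be used to fetch LiDAR over a catchment. The class name geoapis.lidar.OpenTopography is taken from the geoapis documentation, but the constructor arguments (cache_path, search_polygon) and the file paths are assumptions and may differ from the current geoapis API.

```python
# A rough sketch of fetching LiDAR with geoapis. The constructor arguments
# (cache_path, search_polygon) and the file paths are assumptions - check the
# geoapis documentation for the current API.
import geopandas
import geoapis.lidar

catchment = geopandas.read_file("catchment_of_interest.geojson")  # hypothetical path
lidar_fetcher = geoapis.lidar.OpenTopography(
    cache_path="lidar_cache", search_polygon=catchment
)
lidar_fetcher.run()  # download the LiDAR tiles intersecting the catchment
```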

Documentation

API Docs

Sphinx, combined with GitHub Pages, is used to create web-hosted documentation from the docstrings embedded in the source code. The documentation is available at https://rosepearson.github.io/GeoFabrics/

Wiki pages

Documentation covering general usage, installation, testing and contributing to GeoFabrics can be found in the Wiki pages.

Package structure

The following diagram shows the package module and class structure. Inheritance is marked with coloured connections, and classes included in other classes are indicated with arrows. The processor module contains pipelines that generate DEMs or other outputs based on the contents of instruction files. The geometry, dem and bathymetry_estimation modules contain classes that support these generation pipelines.

[Figure: package module and class structure diagram]
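
For orientation, the modules named in the diagram all live inside the geofabrics package; assuming the standard package layout, they can be imported as shown below.

```python
# Modules named in the diagram, assuming the standard geofabrics package layout.
from geofabrics import processor              # DEM generation pipelines
from geofabrics import geometry, dem          # supporting geometry and DEM classes
from geofabrics import bathymetry_estimation  # river bathymetry helpers
```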

DEM generation routines - processor module

The core DEM generation processing chain in GeoFabrics is contained in the processor module. The code flow is controlled within the run routines of the various pipeline classes (all inherited from the same abstract BaseProcessor class) based on the contents of a JSON instruction file passed in at construction. The instruction file (described in its own wiki page) specifies the data sources to use during DEM generation as well as other code flow logic. Dask is used to parallelise the generation of a hydrologically conditioned DEM from LiDAR across many CPU cores, with more details under performance and bench-marking.
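
A minimal sketch of driving one of these pipelines is shown below. It assumes the pipeline class is importable as geofabrics.processor.LidarDemGenerator and that a valid instruction file exists at the hypothetical path instruction.json; the keys expected inside it are described in the instruction file wiki page, not here.

```python
# A minimal sketch, assuming geofabrics.processor.LidarDemGenerator is importable
# and that instruction.json (a hypothetical path) holds a valid instruction file.
import json
from geofabrics import processor

with open("instruction.json", "r") as file_pointer:
    instructions = json.load(file_pointer)

runner = processor.LidarDemGenerator(instructions)  # instructions passed at construction
runner.run()  # code flow is controlled inside the run routine
```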

Each pipeline class supports slightly different functionality, as described below.

  • The LidarDemGenerator class - Construct a hydrologically conditioned DEM from LiDAR and bathymetry information. Most of the execution time is generally spent creating the DEM from LiDAR, so a DEM of just the LiDAR region, along with its extents, is also saved in case new bathymetry needs to be added in the future.

  • The BathymetryDemGenerator class - Construct a new hydrologically conditioned DEM from a DEM created by the LidarDemGenerator class and some bathymetry information. This allows additional bathymetry information to be added as it becomes available, without the computational load associated with generating a DEM from LiDAR (see the sketch after this list).

  • The RiverBathymetryGenerator class - Estimate river bathymetry depths from river flows, slopes, frictions and widths along a main channel, where the channel width and slope are estimated from a DEM generated from LiDAR. More details are given on the River Bathymetry estimation page.
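
As referenced above, the sketch below illustrates how newly available bathymetry might be folded into an already-generated LiDAR DEM without repeating the LiDAR processing. The class names follow the list above, while the instruction file paths are hypothetical placeholders; the actual keys linking the saved LiDAR DEM and the new bathymetry are described in the instruction file wiki page.

```python
# Hypothetical two-stage workflow: the LiDAR DEM is generated once, then updated
# later with new bathymetry via BathymetryDemGenerator. Paths are placeholders;
# the instruction files must point at the saved LiDAR DEM and the new bathymetry.
import json
from geofabrics import processor

def run_pipeline(pipeline_class, instruction_path):
    """Load an instruction file and run the given pipeline class over it."""
    with open(instruction_path, "r") as file_pointer:
        instructions = json.load(file_pointer)
    pipeline_class(instructions).run()

# Stage 1 (expensive): generate and save the DEM from LiDAR.
run_pipeline(processor.LidarDemGenerator, "instruction_lidar.json")

# Stage 2 (cheap, repeatable): add newly available bathymetry to the saved DEM.
run_pipeline(processor.BathymetryDemGenerator, "instruction_bathymetry.json")
```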

Basic instructions for running these processor pipelines are outlined in the Basic Usage Instructions page.
