
Merge pull request #88 from blacha/fix/lerc-is-not-lossy
fix: lerc is not lossy unless max_z_error is > 0
zacdezgeo authored Oct 31, 2023
2 parents 27c7a24 + 2876e5e commit c40d3a5
Showing 1 changed file with 1 addition and 1 deletion: cloud-optimized-geotiffs/cogs-details.qmd
@@ -37,7 +37,7 @@ There are a variety of compression codecs supported by GeoTIFF. Compression code

**JPEG** is a **lossy** compression codec useful for true-color GeoTIFFs intended to be used only for visualization. Because it's lossy, it tends to produce smaller file sizes than deflate or LZW. JPEG should only be used with RGB `Byte` data.

- **LERC** (Limited Error Raster Compression) is a lossy but very efficient compression algorithm for floating point data. This compression rounds values to a precision provided by the user and tends to be useful e.g. for elevation data where the source data is known to not have precision beyond a known value. But note, this compression is **not lossless**. Additionally, LERC is a relatively new algorithm and may not be supported everywhere. GDAL needs to be compiled with the LERC driver in order to load a GeoTIFF with LERC compression.
+ **LERC** (Limited Error Raster Compression) is a very efficient compression algorithm for floating point data. This compression rounds values to a precision provided by the user and tends to be useful e.g. for elevation data where the source data is known to not have precision beyond a known value. But note, this compression is **not lossless** when used with a precision greater than 0. Additionally, LERC is a relatively new algorithm and may not be supported everywhere. GDAL needs to be compiled with the LERC driver in order to load a GeoTIFF with LERC compression.

Some other compression algorithms may be preferred depending on the data type and distribution, and if the goal is maximum compression or not. Codecs that produce the smallest file sizes _tend_ to take longer to read into memory. If the network bandwidth to load the file is slow, then a very efficient compression algorithm may be most efficient, even if it takes longer to decompress when loaded. Alternatively, if the network speed is very fast, or if reading from a local disk, a slightly less efficient compression codec that decompresses faster may be preferred.
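For context on the fix, here is a minimal sketch of how the precision in question is passed to GDAL, assuming a GDAL build with LERC support and GDAL's COG output driver; the file names are hypothetical. `MAX_Z_ERROR` is the creation option corresponding to the `max_z_error` mentioned in the commit message: it defaults to 0, which keeps LERC lossless, and any positive value makes the compression lossy.

```python
from osgeo import gdal

# Hypothetical file names. MAX_Z_ERROR=0 (GDAL's default) keeps LERC
# lossless; a positive value allows up to that much absolute error per
# pixel, trading exactness for a smaller file.
gdal.Translate(
    "dem_lerc.tif",   # output COG
    "dem.tif",        # input raster
    format="COG",
    creationOptions=["COMPRESS=LERC", "MAX_Z_ERROR=0"],
)
```

Setting `MAX_Z_ERROR=0.001` instead would round elevations to roughly millimeter precision, which is the lossy mode the original wording described.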

