
Spec: Support geo type #10981

Open · wants to merge 21 commits into main from geo_spec_draft
Conversation

@szehon-ho szehon-ho commented Aug 20, 2024

This is the spec change for #10260.

This is also based closely on the decisions taken in the Parquet proposal for the same feature: apache/parquet-format#240

@github-actions github-actions bot added the Specification (Issues that may introduce spec changes) label Aug 20, 2024
@szehon-ho szehon-ho force-pushed the geo_spec_draft branch 5 times, most recently from a096921 to 19f24a4 on August 20, 2024 20:47
format/spec.md Outdated
XZ2 is based on the paper [XZ-Ordering: A Space-Filling Curve for Objects with Spatial Extensions].

Notes:
1. Resolution must be a positive integer. Defaults to TODO
Collaborator Author

@jiayuasu do you have any suggestion for default here?

Member

12 sounds fine. CC @Kontinuation

Member

GeoMesa uses a high XZ2 resolution when working with key-value stores such as Accumulo and HBase, but it is not appropriate to always use a resolution that high for partitioning data (for instance, GeoMesa on FileSystems).

XZ2 resolution 11~12 works for city-scale data, but will generate too many partitions for country-scale or world-scale data. I'd like to have a smaller default value such as 7 to be safe on various kinds of data.

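For a rough sense of scale: XZ2 subdivides space as a quadtree, so the number of base cells, and hence the ceiling on distinct partition values, grows as 4^resolution. The following is a back-of-the-envelope sketch (true XZ2 element counts also include enlarged cells at coarser levels, so these are approximations):

```python
# Rough upper bound on distinct partition values per XZ2 resolution:
# XZ2 subdivides space as a quadtree, so resolution r yields on the
# order of 4**r base cells (ignoring XZ2's enlarged elements).
for r in (7, 11, 12):
    print(f"resolution {r}: ~{4**r:,} cells")
```

This is why a default around 7 (~16K cells) is much safer for world-scale data than 12 (~16.8M cells).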

format/spec.md Outdated
| **`struct`** | `group` | | |
| **`list`** | `3-level list` | `LIST` | See Parquet docs for 3-level representation. |
| **`map`** | `3-level map` | `MAP` | See Parquet docs for 3-level representation. |
| **`geometry`** | `binary` | `GEOMETRY` | WKB format, see Appendix G. Logical type annotation optional for supported Parquet format versions [1]. |
Collaborator Author

I could add this section later too, once it's implemented (same for ORC below)

Member

[Appendix G](#appendix-g)

format/spec.md Outdated
| _optional_ | _optional_ | **`110 null_value_counts`** | `map<121: int, 122: long>` | Map from column id to number of null values in the column |
| _optional_ | _optional_ | **`137 nan_value_counts`** | `map<138: int, 139: long>` | Map from column id to number of NaN values in the column |
| _optional_ | _optional_ | **`111 distinct_counts`** | `map<123: int, 124: long>` | Map from column id to number of distinct values in the column; distinct counts must be derived using values in the file by counting or using sketches, but not using methods like merging existing distinct counts |
| _optional_ | _optional_ | **`125 lower_bounds`** | `map<126: int, 127: binary>` | Map from column id to lower bound in the column serialized as binary [1]. Each value must be less than or equal to all non-null, non-NaN values in the column for the file [2]. For Geometry type, this is a Point composed of the min value of each dimension in all Points in the Geometry. |
Contributor

How does this work? Does Iceberg need to interpret each WKB to produce this value? Will it be provided by Parquet?

Collaborator Author

@szehon-ho szehon-ho Aug 20, 2024

Yes, once we switch to the Geometry logical type from Parquet, we will get these stats from Parquet.

Contributor

Should we mention that it is the parquet type BoundingBox?

Collaborator Author

yea will add a footnote here

Member

@jiayuasu jiayuasu Aug 21, 2024

@szehon-ho BTW, the reason why we had separate bbox statistics in Havasu was to be compatible with existing Iceberg tables. Since this adds native geometry support, lower_bound and upper_bound are good choices.

Member

@Kontinuation Kontinuation Aug 21, 2024

I thought that the bounds were stored as WKB-encoded points (according to Appendix D and G), and WKB encodes dimensions of geometries in the header. It is more consistent to make bound values the same type/representation as the field data type.

More sophisticated coverings in Parquet statistics cannot be easily mapped to lower_bounds and upper_bounds, so do we simply use the bbox statistics and ignore the coverings for now?


I think it is ok, since these two bounds are optional; in case they are not present, it still follows the spec.

Member

@jiayuasu jiayuasu Aug 21, 2024

Does it support different dimensions like XY, XYZ, XYM, XYZM? If yes, how can we tell if the binary is for XYZ or XYM?

We should say "For Geometry type, this is a WKB-encoded Point composed of the min value of each dimension in all Points in the Geometry." Then we don't have to worry about the Z and M values.

CC @szehon-ho

Collaborator Author

@szehon-ho szehon-ho Sep 11, 2024

We should say For Geometry type, this is a WKB-encoded Point composed of the min value of each dimension in all Points in the Geometry. Then we don't have to worry about the Z and M value.

@jiayuasu @Kontinuation @wgtmac Done, thanks.
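A minimal sketch of the agreed wording, assuming the standard little-endian WKB layout for an XY Point (type code 1); the helper names here are hypothetical, not Iceberg API:

```python
import struct

def wkb_point(x: float, y: float) -> bytes:
    # 0x01 = little-endian byte order, geometry type 1 = Point (XY),
    # followed by the X and Y coordinates as IEEE-754 doubles.
    return struct.pack("<BIdd", 1, 1, x, y)

def geometry_lower_bound(points):
    # Min value of each dimension among all component points,
    # serialized as a WKB-encoded point, per the wording above.
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return wkb_point(min(xs), min(ys))

bound = geometry_lower_bound([(10.0, 48.0), (11.5, 47.2), (9.8, 48.9)])
assert len(bound) == 21  # 1 + 4 + 8 + 8 bytes
```

Because WKB carries the dimensionality in its header, the same scheme extends naturally to XYZ/XYM/XYZM bounds.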

Collaborator Author

@flyrain

Should we mention that it is the parquet type BoundingBox?

Actually, looking again after some time, I'm not sure how to mention this here, as that is file-type specific. This is an optional field, only set if the file type is Parquet and bounding_box is set, but that's an implementation detail.

format/spec.md Outdated Show resolved Hide resolved
format/spec.md Outdated
| **`void`** | Always produces `null` | Any | Source type or `int` |
| Transform name | Description | Source types | Result type |
|-------------------|--------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------|----------------------|
| **`identity`** | Source value, unmodified | Any | Source type |
Contributor

Except for geometry?

Collaborator Author

@rdblue @flyrain hmm, is there a specific reason? I think it could work technically, as Geometry can be compared, unless I'm missing something.

Contributor

Maybe that's fine if it is comparable, but practically people will always use xz2, right? I'm not sure, but I'm wondering if there are some implications, e.g., too expensive or super high cardinality, such that we wouldn't recommend users use the original geo value in the partition spec.

Collaborator Author

Yea, I think it's possible to do (it's just the WKB value after all); you are right, not sure if there's any good use case. We have to get the WKB in any case; I am not sure if it's that expensive, but I can check. But I guess the cardinality is the same consideration as for any other type (uuid for example), and we let the user choose?

format/spec.md Outdated Show resolved Hide resolved
format/spec.md Outdated
| Transform name | Description | Source types | Result type |
|-------------------|--------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------|----------------------|
| **`identity`** | Source value, unmodified | Any | Source type |
| **`bucket[N]`** | Hash of value, mod `N` (see below) | `int`, `long`, `decimal`, `date`, `time`, `timestamp`, `timestamptz`, `timestamp_ns`, `timestamptz_ns`, `string`, `uuid`, `fixed`, `binary` | `int` |
Contributor

Are we going to support bucketing on GEO?

Collaborator Author

@szehon-ho szehon-ho Aug 21, 2024

I think it's possible, though again not sure of the utility. Geo boils down to just WKB bytes.

Contributor

I feel that the argument for identity can apply here as well. In that case, we can support it, but it's the users' call to use it or not.

Contributor

We may want to change this to be like identity, using Any except [...].

I would not include geo as a source column for bucketing because there is not a clear definition of equality for geo. The hash would depend on the structure of the object and weird things happen when two objects are "equal" (for some definition) but have different hash values.
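The equality concern can be made concrete: the same line segment written with its vertices in opposite order is spatially identical but has different WKB bytes, so any byte-based bucket hash would differ. A sketch assuming the standard little-endian WKB LineString layout (type code 2):

```python
import struct

def wkb_linestring(points):
    # 0x01 little-endian, type 2 = LineString, point count,
    # then each vertex as a pair of IEEE-754 doubles.
    header = struct.pack("<BII", 1, 2, len(points))
    return header + b"".join(struct.pack("<dd", x, y) for x, y in points)

a = wkb_linestring([(0.0, 0.0), (1.0, 1.0)])
b = wkb_linestring([(1.0, 1.0), (0.0, 0.0)])  # same segment, reversed
assert a != b  # identical geometry, different bytes -> different hash
```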

format/spec.md Outdated
@@ -198,6 +199,9 @@ Notes:
- Timestamp values _with time zone_ represent a point in time: values are stored as UTC and do not retain a source time zone (`2017-11-16 17:10:34 PST` is stored/retrieved as `2017-11-17 01:10:34 UTC` and these values are considered identical).
- Timestamp values _without time zone_ represent a date and time of day regardless of zone: the time value is independent of zone adjustments (`2017-11-16 17:10:34` is always retrieved as `2017-11-16 17:10:34`).
3. Character strings must be stored as UTF-8 encoded byte arrays.
4. Coordinate Reference System, i.e. mapping of how coordinates refer to precise locations on earth. Defaults to "OGC:CRS84". Fixed and cannot be changed by schema evolution.
Member

  1. When we say OGC:CRS84, the value you put in this field should be the following PROJJSON string (see the GeoParquet spec):
{
    "$schema": "https://proj.org/schemas/v0.5/projjson.schema.json",
    "type": "GeographicCRS",
    "name": "WGS 84 longitude-latitude",
    "datum": {
        "type": "GeodeticReferenceFrame",
        "name": "World Geodetic System 1984",
        "ellipsoid": {
            "name": "WGS 84",
            "semi_major_axis": 6378137,
            "inverse_flattening": 298.257223563
        }
    },
    "coordinate_system": {
        "subtype": "ellipsoidal",
        "axis": [
        {
            "name": "Geodetic longitude",
            "abbreviation": "Lon",
            "direction": "east",
            "unit": "degree"
        },
        {
            "name": "Geodetic latitude",
            "abbreviation": "Lat",
            "direction": "north",
            "unit": "degree"
        }
        ]
    },
    "id": {
        "authority": "OGC",
        "code": "CRS84"
    }
}
  2. Both the crs and crs_kind fields are optional, but when the crs field is present, the crs_kind field must be present. In this case, since we hard-code this crs field in this phase, we need to set the crs_kind field (string) to PROJJSON.
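For illustration, the CRS identity can be read back out of such a PROJJSON value with a plain JSON parser. This is a trimmed sketch keeping only the `id` member of the full document above:

```python
import json

# Trimmed version of the PROJJSON above; only the fields needed to
# identify the CRS are kept here for brevity.
crs = json.loads("""
{
  "type": "GeographicCRS",
  "name": "WGS 84 longitude-latitude",
  "id": {"authority": "OGC", "code": "CRS84"}
}
""")
assert f'{crs["id"]["authority"]}:{crs["id"]["code"]}' == "OGC:CRS84"
```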

Member

Should we include this example in the parquet spec as well?

Member

BTW, should we advise accepted forms or values for CRS and Edges?

Member

@jiayuasu jiayuasu Aug 21, 2024

@wgtmac We should add this value to the Parquet spec for sure. CC @zhangfengcdt

@szehon-ho There is another situation mentioned in the GeoParquet spec: if the CRS field is present but its value is null, it means the data is in an unknown CRS. This happens sometimes because the writer cannot find or loses the CRS info. Do we want to support this? I think we can use the empty string to cover this case.

Member

accepted forms or values for CRS and Edges

If we borrow the conclusion from the Parquet Geometry proposal, then the C, T, E fields are as follows:

C is a string. Based on what I understand from this PR, @szehon-ho made this field a required field, which is fine.
T is optional and a string. Currently, it only allows this value PROJJSON. When it is not provided, it defaults to PROJJSON too.
E is a string. The only allowed value is PLANAR in this phase. Based on what I understand from this PR, @szehon-ho made this field a required field, which is fine. @szehon-ho According to our meeting with Snowflake, I think maybe we can allow SPHERICAL too? We can add in the spec that it is currently unsafe to perform partition transforms / bounding box filtering when E = SPHERICAL, because they are built based on PLANAR edges. It is the reader's responsibility to decide if they want to use partition transforms / bounding box filtering.

Member

@jiayuasu jiayuasu Sep 13, 2024

@szehon-ho

  1. Does it make sense to include the following CRS84 example from the Parquet Geometry PR?
  /**
   * Coordinate Reference System, i.e. mapping of how coordinates refer to
   * precise locations on earth. Writers are not required to set this field.
   * Once crs is set, crs_encoding field below MUST be set together.
   * For example, "OGC:CRS84" can be set in the form of PROJJSON as below:
   * {
   *     "$schema": "https://proj.org/schemas/v0.5/projjson.schema.json",
   *     "type": "GeographicCRS",
   *     "name": "WGS 84 longitude-latitude",
   *     "datum": {
   *         "type": "GeodeticReferenceFrame",
   *         "name": "World Geodetic System 1984",
   *         "ellipsoid": {
   *             "name": "WGS 84",
   *             "semi_major_axis": 6378137,
   *             "inverse_flattening": 298.257223563
   *         }
   *     },
   *     "coordinate_system": {
   *         "subtype": "ellipsoidal",
   *         "axis": [
   *         {
   *             "name": "Geodetic longitude",
   *             "abbreviation": "Lon",
   *             "direction": "east",
   *             "unit": "degree"
   *         },
   *         {
   *             "name": "Geodetic latitude",
   *             "abbreviation": "Lat",
   *             "direction": "north",
   *             "unit": "degree"
   *         }
   *         ]
   *     },
   *     "id": {
   *         "authority": "OGC",
   *         "code": "CRS84"
   *     }
   * }
   */
  2. It is ok to have them all fixed to default values for this phase.

Collaborator Author

@jiayuasu I put it in the example (if you render the page). Let me know if it's not what you meant.


Note that this example is non-standard. A CRS JSON encoding is in progress at OGC, but we don't know yet what it will look like. The only standards at this time are GML, WKT, or simply a reference to the EPSG database or OGC registry (e.g. https://www.opengis.net/def/crs/OGC/0/CRS84 or "urn:ogc:def:crs:OGC::CRS84").


Coordinate Reference System, i.e. mapping of how coordinates refer to precise locations on earth. Defaults to "OGC:CRS84".

Maybe remove the word "precise"? Not all CRSs are precise. The proposed default, OGC:CRS84, has an uncertainty of about 2 meters (unrelated to floating-point precision). Maybe it would be worth changing the sentence to: Defaults to "OGC:CRS84", which provides an accuracy of about 2 meters.

Another reason for removing the "precise" word is that the CRS alone is not sufficient for really precise locations. We also need the epoch, which is a topic that has not been covered at all in this PR.

Collaborator Author

Removed precise

format/spec.md Outdated
@@ -190,6 +190,7 @@ Supported primitive types are defined in the table below. Primitive types added
| | **`uuid`** | Universally unique identifiers | Should use 16-byte fixed |
| | **`fixed(L)`** | Fixed-length byte array of length L | |
| | **`binary`** | Arbitrary-length byte array | |
| [v3](#version-3) | **`geometry(C, T, E)`** | An object of the simple feature geometry model as defined by Appendix G; This may be any of the Geometry subclasses defined therein; coordinate reference system C [4], coordinate reference system type T [5], edges E [6] | C, T, E are fixed. Encoded as WKB, see Appendix G. |
Member

What syntax to use for an engine to create the geometry type? Does it require C/T/E to appear in the type?

Collaborator Author

Related to the above comment, I think these will all be optional (taking a default value if not specified).

format/spec.md Outdated Show resolved Hide resolved
format/spec.md Outdated Show resolved Hide resolved
format/spec.md Outdated Show resolved Hide resolved
Collaborator Author

szehon-ho commented Aug 24, 2024

Hi all, FYI I have unfortunately encountered some problems while remote and probably can't update this; I will come back to it after I get back home in two weeks.

@szehon-ho szehon-ho force-pushed the geo_spec_draft branch 2 times, most recently from 75326dc to 0591f68 on September 11, 2024 21:37
Collaborator Author

@jiayuasu @Kontinuation @wgtmac @flyrain @rdblue sorry for the delay, as I only got access now. Updated the PR.

format/spec.md Outdated

Notes:

1. Timestamp values _without time zone_ represent a date and time of day regardless of zone: the time value is independent of zone adjustments (`2017-11-16 17:10:34` is always retrieved as `2017-11-16 17:10:34`).
2. Timestamp values _with time zone_ represent a point in time: values are stored as UTC and do not retain a source time zone (`2017-11-16 17:10:34 PST` is stored/retrieved as `2017-11-17 01:10:34 UTC` and these values are considered identical).
3. Character strings must be stored as UTF-8 encoded byte arrays.
4. Crs (coordinate reference system) is a mapping of how coordinates refer to precise locations on earth. Defaults to "OGC:CRS84". Fixed and cannot be changed by schema evolution.
5. Crs-encoding (coordinate reference system encoding) is the type of crs field. Must be set if crs is set. Defaults to "PROJJSON". Fixed and cannot be changed by schema evolution.
6. Edges is the interpretation for non-point geometries in geometry object, i.e. whether an edge between points represent a straight cartesian line or the shortest line on the sphere. Defaults to "planar". Fixed and cannot be changed by schema evolution.


Can we maybe explicitly mention here that both "planar" and "spherical" are supported as edge type enum values?

Collaborator Author

@szehon-ho szehon-ho Sep 18, 2024

@dmitrykoval I was debating this.

I guess we talked about it before, but in the Java reference implementation we cannot easily do pruning (file level, row level, or partition level) because the JTS library and XZ2 only support non-spherical edges. We would need new metric types, new Java libraries, and new partition transform proposals if we wanted to support it in the Java reference implementation.

But if we want to support it, I'm ok with listing it here and having checks to just skip pruning for spherical geometry columns.
@flyrain @jiayuasu @Kontinuation does it make sense?

@dmitrykoval dmitrykoval Sep 18, 2024

I see. I think if "planar" is the default edge type, then there shouldn't be many changes to the planar geometry code path, except for additional checks to skip some partitioning/pruning cases, right?

Regarding the reference implementation of the "spherical" type, do we need to fully support it from day one, or can we maybe mark it as optional in the initial version of the spec? For example, it would work if the engine supports it, but by default, we would fall back to the planar edge type?

Member

We could list spherical as an allowed edge type here. Maybe just note that it is not safe to perform partition transforms or lower_bound/upper_bound filtering when the edge type is spherical. We did the same in the Parquet Geometry PR.

Collaborator Author

Yea, I forgot to mention explicitly that in Iceberg, pruning is always an optional feature for reads, so no issue.
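Since pruning is optional, an implementation could simply gate bounds-based pruning on the edge type and fall back to a full scan otherwise. A hypothetical sketch, not actual Iceberg API:

```python
def can_use_bounds_pruning(edges: str) -> bool:
    # lower_bounds/upper_bounds and the XZ2 transform are defined over
    # planar edges, so bounds-based pruning must be skipped for
    # spherical columns; readers then fall back to scanning, which is
    # always correct because pruning is optional for reads.
    return edges.lower() == "planar"

assert can_use_bounds_pruning("planar")
assert not can_use_bounds_pruning("spherical")
```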


Crs-encoding (coordinate reference system encoding) is the type of crs field.

  • Please change "is the type" to "is the encoding". The CRS type is something completely different from the information put in this field.
  • (minor detail) "crs" should be upper-case "CRS", as it is an acronym for "Coordinate Reference System".

Defaults to "PROJJSON".

I suggest to remove this specification. PROJJSON is non-standard and may never be (we don't know yet what will be OGC's decision). I don't think that we want to tie Iceberg forever to a format specific to one project.

format/spec.md Outdated
| _optional_ | _optional_ | **`110 null_value_counts`** | `map<121: int, 122: long>` | Map from column id to number of null values in the column |
| _optional_ | _optional_ | **`137 nan_value_counts`** | `map<138: int, 139: long>` | Map from column id to number of NaN values in the column |
| _optional_ | _optional_ | **`111 distinct_counts`** | `map<123: int, 124: long>` | Map from column id to number of distinct values in the column; distinct counts must be derived using values in the file by counting or using sketches, but not using methods like merging existing distinct counts |
| _optional_ | _optional_ | **`125 lower_bounds`** | `map<126: int, 127: binary>` | Map from column id to lower bound in the column serialized as binary [1]. Each value must be less than or equal to all non-null, non-NaN values in the column for the file [2] For geometry type, this is a WKB-encoded point composed of the min value of each dimension among all component points of all geometry objects for the file. |
Member

For geometry type, this is a WKB-encoded point composed of the min value of each dimension among all component points of all geometry objects for the file.

As we are finishing the PoC on the Parquet side, the remaining issue is what value to write to the min_value/max_value fields of statistics and the page index. To give some context, Parquet requires the min_value/max_value fields to be set for the page index, and statistics are used to generate the page index. The C++ PoC is omitting min_value/max_value, and the Java PoC is pretending geometry values are plain binary values while collecting the stats. Should we do similar things here? Then the Iceberg code can directly consume min_value/max_value from statistics instead of issuing another call to get the specialized GeometryStatistics, which is designed for advanced purposes.

@Kontinuation @zhangfengcdt @paleolimbot @jiayuasu

Member

@wgtmac do you mean that Iceberg uses the Parquet Geometry GeometryStatistics or Parquet Geometry uses the min_value/max_value idea from Iceberg?

Member

I mean the latter. The ColumnOrder of the new geometry type is undefined as specified at https://github.com/apache/parquet-format/pull/240/files#diff-834c5a8d91719350b20995ad99d1cb6d8d68332b9ac35694f40e375bdb2d3e7cR1144. It means that the min_value/max_value fields are meaningless and should not be used. I'm not sure if it is a good idea to set min_value/max_value fields in the same way as lower_bounds/upper_bounds of Iceberg.

Member

I suggest defining the sort order of geometry columns as WKB-encoded points in the Parquet format spec. This is the simplest yet most useful way of defining the min and max bounds for geometry columns, and it is better for the sort order to be well-defined rather than left undefined.

Member

I agree that it is better to explicitly define the column order than being undefined. If we go with this approach, the format PR and two PoC impls need to reflect this change, which might get more complicated.

Collaborator Author

@szehon-ho szehon-ho Sep 19, 2024

Is there anything we need to change for this specific line? As long as we get it from Parquet some way, we are ok here, but is the format of the lower/upper bound still ok?

Member

No, I was thinking if Parquet could do better by doing similar things in the future.

format/spec.md Outdated
@@ -200,12 +200,16 @@ Supported primitive types are defined in the table below. Primitive types added
| | **`uuid`** | Universally unique identifiers | Should use 16-byte fixed |
| | **`fixed(L)`** | Fixed-length byte array of length L | |
| | **`binary`** | Arbitrary-length byte array | |
| [v3](#version-3) | **`geometry(C, CE, E)`** | An object of the simple feature geometry model as defined by Appendix G; This may be any of the geometry subclasses defined therein; crs C [4], crs-encoding CE [5], edges E [6] | C, CE, E are fixed, and if unset will take default values. |
Member

@RussellSpitzer RussellSpitzer Sep 19, 2024

I think maybe we should just link out for the requirements here since it's a bit complicated.

The description as well could be

Simple feature geometry Appendix G, Parameterized by ....

I also don't think we should allow it to be unset ... can we just require that a subclass is always picked? We could recommend a set of defaults for engines to set on field creation but I'm not sure we need to be that opinionated here.

Member

@paleolimbot paleolimbot left a comment

Collecting all of our comments here is truly heroic work...thank you!

I can only speak to the non-Iceberg technical details here, but this PR as written contains metadata that will not cause problems when converted to other geospatial types (i.e., GeoPandas, GeoArrow, GeoParquet), and contains an unambiguous and implementable mechanism for pushing down a basic spatial filter.

format/spec.md Outdated
@@ -205,13 +205,18 @@ Supported primitive types are defined in the table below. Primitive types added
| | **`uuid`** | Universally unique identifiers | Should use 16-byte fixed |
| | **`fixed(L)`** | Fixed-length byte array of length L | |
| | **`binary`** | Arbitrary-length byte array | |
| [v3](#version-3) | **`geometry(C)`** | Geometry features from [OGC – Simple feature access](https://portal.ogc.org/files/?artifact_id=25355). Edges interpolation is always linear/planar. See [Appendix G](#appendix-g-geospatial-notes). Parameterized by crs C [4]. If not specified, C is `OGC:CRS84`. | |
| [v3](#version-3) | **`geography(C, A)`** | Geometry features from [OGC – Simple feature access](https://portal.ogc.org/files/?artifact_id=25355). See [Appendix G](#appendix-g-geospatial-notes). Parameterized by crs C[5] and edge-interpolation algoritm A [6]. If not specified, C is `OGC:CRS84`. | |
Contributor

@rdblue rdblue Jan 8, 2025

Nits:

  • Duplicated URL can be moved into a reference using [OGC - Simple feature access][label]
  • Missing a space between C and [5]
  • It would be good to add the list of algorithms after the table and before notes, rather than in the appendix. Those algorithm definitions are part of the spec, right?
  • "crs" (lower case) is used here, but "CRS" is used in the note that defines CRS. I think it should be upper case everywhere

@szehon-ho (Collaborator Author) replied:

Thanks, took these suggestions.

@@ -449,7 +454,7 @@ Partition field IDs must be reused if an existing partition spec contains an equ

| Transform name | Description | Source types | Result type |
|-------------------|--------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------|-------------|
| **`identity`** | Source value, unmodified | Any | Source type |
| **`identity`** | Source value, unmodified | Any except `geometry` and `geography` | Source type |
Contributor commented:

I think we were also supposed to include Variant in the exceptions. @aihuaxu can you take a look?

format/spec.md Outdated (resolved)
format/spec.md Outdated

The `partition` struct stores the tuple of partition values for each file. Its type is derived from the partition fields of the partition spec used to write the manifest file. In v2, the partition struct's field ids must match the ids from the partition spec.
7. `geometry` and `geography`: this is a point: X, Y, Z, and M are the lower / upper bound of all objects in the file. For the X and Y values only, the lower_bound's values (xmin/ymin) may be greater than the upper_bound's value (xmax/ymax). In this X case, an object in the file may match if it contains an X such that `x >= xmin` OR `x <= xmax`, and in this Y case if `y >= ymin` OR `y <= ymax`. In geographic terminology, the concepts of `xmin`, `xmax`, `ymin`, and `ymax` are also known as `westernmost`, `easternmost`, `northernmost` and `southernmost`.
8. `geography` further restricts these points to the canonical ranges of [-180 180] for X and [-90 90] for Y.
Contributor commented:

I would probably not make these notes. They don't clarify the existing definition or add minor details, this is the full definition for geo types. I would just make these into a separate paragraph.

@szehon-ho (Collaborator Author) replied:

Added a separate paragraph.
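The wraparound bound semantics quoted in the diff above can be sketched as a small predicate. This is a hypothetical helper, not part of the spec: when `xmin > xmax` the X bound wraps around the antimeridian, so a value may match if it lies on either side of the wrap.

```python
def x_may_match(x: float, xmin: float, xmax: float) -> bool:
    # Ordinary interval test when the bound does not wrap.
    if xmin <= xmax:
        return xmin <= x <= xmax
    # Wrapped (antimeridian-crossing) bound: match either side.
    return x >= xmin or x <= xmax

# Ordinary bound [10, 20]:
assert x_may_match(15, 10, 20)
# Wrapped bound xmin=170, xmax=-170 (crosses the antimeridian):
assert x_may_match(175, 170, -170)
assert x_may_match(-175, 170, -170)
assert not x_may_match(0, 170, -170)
```

The same logic applies independently to the Y bound in the wrapped case described in the diff.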

format/spec.md Outdated

Notes:

1. Timestamp values _without time zone_ represent a date and time of day regardless of zone: the time value is independent of zone adjustments (`2017-11-16 17:10:34` is always retrieved as `2017-11-16 17:10:34`).
2. Timestamp values _with time zone_ represent a point in time: values are stored as UTC and do not retain a source time zone (`2017-11-16 17:10:34 PST` is stored/retrieved as `2017-11-17 01:10:34 UTC` and these values are considered identical).
3. Character strings must be stored as UTF-8 encoded byte arrays.

4. CRS (coordinate reference system) is a mapping of how coordinates refer to locations on Earth. See [Appendix G](#appendix-g-geospatial-notes) for specifying custom CRS. If this field is null (no custom crs provided), CRS defaults to `OGC:CRS84`, which means the data must be stored in longitude, latitude based on the WGS84 datum. Fixed and cannot be changed by schema evolution.
Contributor commented:

The text about schema evolution should be in the evolution section rather than here.

The default value is already in the table so it doesn't need to be in the note.

@szehon-ho (Collaborator Author) replied:

Got it, removed.

Took a look and didn't modify the schema evolution section, as it already lists the supported type promotions, so the lack of any mention of geography and geometry implies they're not supported.

format/spec.md Outdated

Notes:

1. Timestamp values _without time zone_ represent a date and time of day regardless of zone: the time value is independent of zone adjustments (`2017-11-16 17:10:34` is always retrieved as `2017-11-16 17:10:34`).
2. Timestamp values _with time zone_ represent a point in time: values are stored as UTC and do not retain a source time zone (`2017-11-16 17:10:34 PST` is stored/retrieved as `2017-11-17 01:10:34 UTC` and these values are considered identical).
3. Character strings must be stored as UTF-8 encoded byte arrays.

4. CRS (coordinate reference system) is a mapping of how coordinates refer to locations on Earth. See [Appendix G](#appendix-g-geospatial-notes) for specifying custom CRS. If this field is null (no custom crs provided), CRS defaults to `OGC:CRS84`, which means the data must be stored in longitude, latitude based on the WGS84 datum. Fixed and cannot be changed by schema evolution.
5. See [4]. This must be a geographic CRS, where longitudes are bound by [-180, 180] and latitudes are bound by [-90, 90].
Contributor commented:

Rather than referring to a note in a note, I would just add both to the description. [4] clarifies what a CRS is and [5] puts restrictions on the ones that are valid for geography.

@szehon-ho (Collaborator Author) replied:

Looks like the same point as above? I put it in a separate paragraph.

format/spec.md Outdated
@@ -603,8 +608,9 @@ Notes:
4. Position delete metadata can use `referenced_data_file` when all deletes tracked by the entry are in a single data file. Setting the referenced file is required for deletion vectors.
5. The `content_offset` and `content_size_in_bytes` fields are used to reference a specific blob for direct access to a deletion vector. For deletion vectors, these values are required and must exactly match the `offset` and `length` stored in the Puffin footer for the deletion vector blob.
6. The following field ids are reserved on `data_file`: 141.

The `partition` struct stores the tuple of partition values for each file. Its type is derived from the partition fields of the partition spec used to write the manifest file. In v2, the partition struct's field ids must match the ids from the partition spec.
7. `geometry` and `geography`: this is a point: X, Y, Z, and M are the lower / upper bound of all objects in the file. For the X and Y values only, the lower_bound's values (xmin/ymin) may be greater than the upper_bound's value (xmax/ymax). In this X case, an object in the file may match if it contains an X such that `x >= xmin` OR `x <= xmax`, and in this Y case if `y >= ymin` OR `y <= ymax`. In geographic terminology, the concepts of `xmin`, `xmax`, `ymin`, and `ymax` are also known as `westernmost`, `easternmost`, `northernmost` and `southernmost`.
Contributor commented:

Should this also specify where to see the encoding for the point?

@szehon-ho (Collaborator Author) commented on Jan 10, 2025:

I think its specified a little above in point 1,

1. Single-value serialization for lower and upper bounds is detailed in Appendix D.

is it sufficient?

format/spec.md Outdated (three resolved threads)
@@ -1480,6 +1494,9 @@ This serialization scheme is for storing single values as individual binary valu
| **`struct`** | Not supported |
| **`list`** | Not supported |
| **`map`** | Not supported |
| **`geometry`** | A single point, encoded as a {x, y, optional z, optional m} concatenation of its 8-byte little-endian IEEE 754 coordinate values, with the optional coordinates encoded as NaN if unset. |
Contributor commented:

I don't think this table is the right place to define this encoding because this covers any value and the encoding for arbitrary geometry and geography values is WKB.

For the lower and upper bounds, I suggest adding a new section and documenting the way we store points specifically for both types.

Member commented:

We should probably change the text above this table, it specifies this is for lower and upper bounds.

@szehon-ho (Collaborator Author) replied:
The reason will be displayed to describe this comment to others. Learn more.

Yea I was also thrown off by the above text as specified by Russell. Changed the text and added a subsection, let me know if it is what you mean.
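The bound-point encoding quoted in the diff above can be sketched in Python. `encode_bound_point` is a hypothetical helper; the draft text leaves it ambiguous whether unset z/m are written as NaN or omitted entirely, so this sketch assumes a fixed four-double layout with NaN for unset coordinates.

```python
import math
import struct

def encode_bound_point(x, y, z=None, m=None):
    # {x, y, z, m} concatenation of 8-byte little-endian IEEE 754
    # doubles, with unset optional coordinates encoded as NaN.
    coords = [x, y,
              z if z is not None else math.nan,
              m if m is not None else math.nan]
    return struct.pack('<4d', *coords)

buf = encode_bound_point(30.0, 10.0)
assert len(buf) == 32  # 4 coordinates x 8 bytes each
x, y, z, m = struct.unpack('<4d', buf)
assert (x, y) == (30.0, 10.0)
assert math.isnan(z) and math.isnan(m)
```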

@@ -1506,6 +1523,8 @@ This serialization scheme is for storing single values as individual binary valu
| **`struct`** | **`JSON object by field ID`** | `{"1": 1, "2": "bar"}` | Stores struct fields using the field ID as the JSON field name; field values are stored using this JSON single-value format |
| **`list`** | **`JSON array of values`** | `[1, 2, 3]` | Stores a JSON array of values that are serialized using this JSON single-value format |
| **`map`** | **`JSON object of key and value arrays`** | `{ "keys": ["a", "b"], "values": [1, 2] }` | Stores arrays of keys and values; individual keys and values are serialized using this JSON single-value format |
| **`geometry`** | **`JSON string`** | `POINT (30 10)` | Stored using WKT representation, see [Appendix G](#appendix-g-geospatial-notes) |
| **`geography`** | **`JSON string`** | `POINT (30 10)` | Stored using WKT representation, see [Appendix G](#appendix-g-geospatial-notes) |
Contributor commented:

Like above, this table covers arbitrary geometry and geography objects. I think it is fine to use WKT here and not have a separate way to store points, but we should be aware that the implication is that WKT is required for any JSON representation.
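Per the table in the diff above, the JSON single-value form of a geometry or geography is just its WKT text carried as a plain JSON string, e.g.:

```python
import json

# WKT text for the value, stored as an ordinary JSON string.
value = "POINT (30 10)"
encoded = json.dumps(value)
assert encoded == '"POINT (30 10)"'
# Round-trips without any geometry-specific handling.
assert json.loads(encoded) == value
```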

format/spec.md Outdated

The version of the OGC standard first used here is 1.2.1, but future versions may also be used if the WKB representation remains wire-compatible.

Coordinate axis order is always (x, y) where x is easting or longitude, and y is northing or latitude. This ordering explicitly overrides the axis order specified in the CRS.
Contributor commented:

I find it surprising that we would specify this. I think it will cause problems because people would easily not know about it and could easily have mixed WKB data values (some conforming to the CRS and some explicitly following this and not the CRS).

What is the purpose of this?

  • If it is intended only for the points that we use for lower and upper bounds, then I think we should be strict in the binary representation we use for those points and not make a general statement.
  • If this is intended to apply to all WKB stored in Iceberg tables, then I think a better option is to disallow CRS definitions that use a different axis order. However, I don't see a reason why we would restrict the axis order because WKB is already opaque to Iceberg.

@mkaravel, what do you think?

Member commented:

Just a note that it's fine either way for me, because srid is opaque (i.e., the engine is free to map 4326 however it wants, even though EPSG:4326 is defined with the axis order reversed relative to how it is represented in almost all databases/file formats), and because projjson is capable of representing flipped axes based on their names if needed. I do think omitting it will cause considerable confusion (a lot of unexpectedly sideways maps); however, it is not strictly required.

Member commented:

@rdblue

It was added by me and it is inspired by the GeoParquet spec. Two reasons:

  1. Defining the axis order explicitly in lon/lat (x = lon, y = lat) is a common practice in mapping tools (QGIS, Google Maps, KeplerGL, ...). This will reduce confusion when visualizing spatial data.
  2. This made it possible that an engine can read and process Geometry type data without understanding the CRS field. Many ST functions require explicit lon/lat order (ST_DistanceSpheroid, H3, S2, ...).

I don't have a strong opinion here. @szehon-ho please feel free to remove it.

@szehon-ho (Collaborator Author) replied:

Removed, thanks.

format/spec.md Outdated (three resolved threads)
format/spec.md Outdated
The edge-interpolation algorithm is specified as a parameter (A) for `geography` types.

Supported values are:
* `spherical`: edges are interpolated as geodesics on a sphere. The radius of the underlying sphere is the mean radius of the spheroid defined by the CRS, defined as ((2 * major_axis_length + minor_axis_length) / 3).
Contributor commented:

@szehon-ho, @mkaravel, did we determine if we need to specify the radius? If we do not, then I would remove it here so that we don't create confusion or over-specify the algorithm.

Member commented:

@paleolimbot Dewey, can you also look into this?

A reviewer replied:

@paleolimbot mentioned that we do not really need the radius to determine the maximum and minimum latitudes for the bounding box of a geographic edge under the spherical model, and this is true. I think we can remove the requirement about the radius (the second sentence here).
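For reference, with the standard WGS84 axis lengths (in metres), the (2 * major_axis_length + minor_axis_length) / 3 mean-radius formula under discussion works out as follows; this is a worked example, not part of the spec text:

```python
# Standard WGS84 spheroid axis lengths, in metres.
a = 6378137.0          # semi-major axis
b = 6356752.314245     # semi-minor axis

# Mean radius per the formula discussed above.
mean_radius = (2 * a + b) / 3
assert abs(mean_radius - 6371008.7714) < 0.001  # ~6371.009 km
```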

@szehon-ho force-pushed the geo_spec_draft branch 2 times, most recently from 7a0d825 to 7d06e1f on January 10, 2025 07:02
Labels: spark; Specification (Issues that may introduce spec changes)