
PARQUET-2471: Add geometry logical type #240

Open · wants to merge 31 commits into base: master

Conversation

@wgtmac (Member) commented May 10, 2024

Apache Iceberg is adding geospatial support: https://docs.google.com/document/d/1iVFbrRNEzZl8tDcZC81GFt01QJkLJsI9E2NBOt21IRI. It would be good if Apache Parquet could support a geometry type natively.

@jiayuasu (Member)

@wgtmac Thanks for the work. On the other hand, I'd like to highlight that GeoParquet (https://github.com/opengeospatial/geoparquet/tree/main) has been around for a while, and many geospatial tools have started to support reading and writing it.

Is the ultimate goal of this PR to merge the GeoParquet spec into Parquet completely, or might it end up creating a new spec that is not compatible with GeoParquet?

@jiayuasu (Member)

Geo Iceberg does not need to conform to GeoParquet, because people should not directly use a Parquet reader to read Iceberg Parquet files anyway. So that's a separate story.

@wgtmac (Member, Author) commented May 11, 2024

Is the ultimate goal of this PR to merge the GeoParquet spec into Parquet completely, or might it end up creating a new spec that is not compatible with GeoParquet?

@jiayuasu That's why I asked about the possibility of direct compliance with the GeoParquet spec in the Iceberg design doc. I don't intend to create a new spec. Instead, it would be good if the proposal here could meet the requirements of both Iceberg and GeoParquet, or share enough common ground to make conversion between Iceberg Parquet and GeoParquet lightweight. We do need advice from the GeoParquet community to make that possible.

@szehon-ho left a comment

From the Iceberg side, I am excited about this. I think defining the type formally in parquet-format will make geospatial interop easier in the long run, and it also unlocks row-group filtering, for example for Iceberg's add_file for Parquet files. Perhaps there can be conversion utils for GeoParquet if we go ahead with this, and I'd definitely like to see what they think.

I'm new on the Parquet side, so I had some questions.

@wgtmac wgtmac marked this pull request as ready for review May 11, 2024 16:13
@wgtmac wgtmac changed the title WIP: Add geometry logical type PARQUET-2471: Add geometry logical type May 11, 2024
@pitrou (Member) commented May 15, 2024

@paleolimbot is quite knowledgeable on the topic and could probably give useful feedback.

@pitrou (Member) commented May 15, 2024

I wonder if purely informative metadata really needs to be represented as Thrift types. When we define canonical extension types in Arrow, metadata is generally serialized as a standalone JSON string.

Doing so in Parquet as well would lighten the maintenance workload on the serialization format, and would also allow easier evolution of geometry metadata to support additional information.

Edit: this seems to be the approach adopted by GeoParquet as well.

@paleolimbot (Member) left a comment

I wonder if purely informative metadata really needs to be represented as Thrift types. When we define canonical extension types in Arrow, metadata is generally serialized as a standalone JSON string.

In reading this I do wonder if there should just be an extension mechanism here instead of attempting to enumerate all possible encodings in this repo. The people that are engaged and working on implementations are the right people to engage here, which is why GeoParquet and GeoArrow have been successful (we've engaged the people who care about this, and they are generally not paying attention to apache/parquet-format nor apache/arrow).

There are a few things that this PR solves in a way that might not be possible using EXTENSION, chiefly column statistics. It would be nice to have some geo-specific things there (although maybe that can also be part of the extension mechanism). Another thing that comes up frequently is where to put a spatial index (R-tree); I don't think there's any good place for that at the moment.

It would be nice to allow the format to be extended in a way that does not depend on schema-level metadata: the things we do in the GeoParquet standard (store bounding boxes, refer to columns by name) become stale with the ways that schema metadata are typically propagated through projections and concatenations.

@wgtmac (Member, Author) commented May 17, 2024

I wonder if purely informative metadata really needs to be represented as Thrift types. When we define canonical extension types in Arrow, metadata is generally serialized as a standalone JSON string.

Doing so in Parquet as well would lighten the maintenance workload on the serialization format, and would also allow easier evolution of geometry metadata to support additional information.

Edit: this seems to be the approach adopted by GeoParquet as well.

@pitrou Yes, that might be an option. Then perhaps we can use the same JSON string defined in the Iceberg doc. @jiayuasu @szehon-ho WDYT?

EDIT: I think we can remove informative attributes like subtype, orientation, and edges. Perhaps encoding can be removed as well if we only support WKB. dimension is something we must be aware of, because we need to build the bbox, which depends on whether coordinates are represented as xy, xyz, xym, or xyzm.
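For illustration, here is a minimal sketch (assuming ISO WKB, where the type code adds 1000 for Z, 2000 for M, and 3000 for ZM; this is not part of the proposal) of how a reader could derive the dimension from a WKB value:

```python
import struct

def wkb_dimensions(buf: bytes) -> str:
    """Infer xy/xyz/xym/xyzm from an ISO WKB geometry type code."""
    byte_order = "<" if buf[0] == 1 else ">"  # first byte: 1 = little-endian
    (type_code,) = struct.unpack_from(byte_order + "I", buf, 1)
    return {0: "xy", 1: "xyz", 2: "xym", 3: "xyzm"}[type_code // 1000]

# Little-endian WKB for POINT Z (1 2 3): base type 1 (Point) + 1000 for Z.
wkb = struct.pack("<BIddd", 1, 1001, 1.0, 2.0, 3.0)
assert wkb_dimensions(wkb) == "xyz"
```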

@wgtmac (Member, Author) commented May 17, 2024

Another thing that comes up frequently is where to put a spatial index (R-tree)

I think this could be something similar to the page index or bloom filter in Parquet, which are stored between row groups or before the footer. It could be at the row-group level or the file level as well.

It would be nice to allow the format to be extended in a way that does not depend on schema-level metadata.

I think we really need your advice here. If you were to rethink the design of GeoParquet, how could it do better if the Parquet format had some geospatial knowledge? @paleolimbot @jiayuasu

@paleolimbot (Member)

If you were to rethink the design of GeoParquet, how could it do better if the Parquet format had some geospatial knowledge?

The main reason the schema-level metadata had to exist is that there was no way to put anything custom at the column level to give geometry-aware readers extra metadata about the column (CRS being the main one) and global column statistics (bbox). Bounding boxes at the feature level (worked around as a separate column) are the second somewhat ugly thing; they give reasonable row group statistics for many things people might want to store. It seems like this PR would solve most of that.

I am not sure that a new logical type will catch on to the extent that GeoParquet will, although I'm new to this community and I may be very wrong. The GeoParquet working group is enthusiastic and encodings/strategies for storing/querying geospatial datasets in a data lake context are evolving rapidly. Even though it is a tiny bit of a hack, using extra columns and schema-level metadata to encode these things is very flexible and lets implementations be built on top of a number of underlying readers/underlying versions of the Parquet format.

@wgtmac (Member, Author) commented May 18, 2024

@paleolimbot I'm happy to see the fast evolution of GeoParquet specs. I don't think the addition of geometry type aims to replace or deprecate something from GeoParquet. Instead, GeoParquet can simply ignore the new type as of now, or leverage the built-in bbox if beneficial. For the additional (informative) attributes of the geometry type: if some of them are stable and it makes sense to store them natively in Parquet column metadata, perhaps we can work together to make that happen? I think the main goal of this addition is to enhance the interoperability of geospatial data across systems while keeping conversion to GeoParquet lightweight.

@Kontinuation (Member)

Another thing that comes up frequently is where to put a spatial index (R-tree)

I think this could be something similar to the page index or bloom filter in Parquet, which are stored between row groups or before the footer. It could be at the row-group level or the file level as well.

The bounding-box-based sort order defined for the geometry logical type is already good enough for performing row-level and page-level data skipping. A spatial index such as an R-tree may not be suitable for Parquet. I am aware that FlatGeobuf has an optional static packed Hilbert R-tree index, but for that index to be effective, FlatGeobuf supports random access to records and does not use compression. The minimal granularity of reading data in Parquet files is the data page, and pages are usually compressed, so it is impossible to randomly access records within a page.
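As a hypothetical illustration of that kind of skipping (the stats layout and names here are invented, not from this PR), a reader holding per-page bounding boxes only decodes pages whose bbox intersects the query window:

```python
def bbox_intersects(a, b):
    """Boxes are (xmin, ymin, xmax, ymax); planar edges assumed."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

query_window = (0.0, 0.0, 10.0, 10.0)
page_bboxes = [(-20.0, -20.0, -5.0, -5.0),  # page 0: far away, skip it
               (5.0, 5.0, 15.0, 15.0)]      # page 1: overlaps, read it
pages_to_read = [i for i, box in enumerate(page_bboxes)
                 if bbox_intersects(box, query_window)]
assert pages_to_read == [1]
```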

@paleolimbot (Member)

I'm happy to see the fast evolution of GeoParquet specs. I don't think the addition of geometry type aims to replace or deprecate something from GeoParquet.

I agree! I think first-class geometry support is great and I'm happy to help wherever I can. I see GeoParquet as a way for existing spatial libraries to leverage Parquet; it is not well-suited to Parquet-native things like Iceberg (although others working on GeoParquet may have a different view).

Extension mechanisms are nice because they allow an external community to hash out the discipline-specific details that evolve at a rate orthogonal to that of the format (e.g., GeoParquet), which generally results in buy-in. I'm not familiar with the speed at which the changes proposed here can evolve (or how long it generally takes readers to implement them), but if @pitrou's suggestion of encoding the type information or statistics in serialized form makes it easier for this to evolve, it could provide some of that benefit.

A spatial index such as an R-tree may not be suitable for Parquet

I also agree here (but it did come up a lot of times in the discussions around GeoParquet). I think developers of Parquet-native workflows are well aware that there are better formats for random access.

@paleolimbot (Member)

I think we really need your advice here. If you were to rethink the design of GeoParquet, how could it do better if the Parquet format had some geospatial knowledge?

I opened opengeospatial/geoparquet#222 to collect some thoughts on this. We discussed it at our community call, and I think we mostly just never considered that the Parquet standard would be interested in supporting a first-class data type. I've put my thoughts there, but I'll let others add their own opinions.

@jorisvandenbossche (Member)

Just to ensure my understanding is correct:

  • This is proposing to add a new logical type annotating the BYTE_ARRAY physical type. For readers that expect just such a BYTE_ARRAY column (e.g. existing GeoParquet implementations), is that compatible if the column starts having a logical type as well? (Although I assume this might depend on how the specific Parquet reader implementation deals with an unknown logical type, i.e. whether it errors or automatically falls back to the physical type.)
  • For such "legacy" readers (just reading the WKB values from a binary column), the only thing that actually changes (apart from the logical type annotation) is the values of the statistics? Now, I assume that right now no GeoParquet reader uses the statistics of the binary column, because the physical statistics for BYTE_ARRAY ("unsigned byte-wise comparison") are essentially useless when those binary blobs represent WKB geometries. So again that should probably not give any compatibility issues?
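To make the last point concrete, here is a small illustrative sketch: Python's bytes comparison is exactly the unsigned byte-wise order Parquet uses for BYTE_ARRAY, and on WKB blobs it reflects the raw IEEE-754 bytes of the coordinates, not their numeric or spatial order:

```python
import struct

def wkb_point(x: float, y: float) -> bytes:
    """Little-endian ISO WKB for POINT(x y)."""
    return struct.pack("<BIdd", 1, 1, x, y)

a = wkb_point(100.0, 100.0)
b = wkb_point(-100.0, -100.0)
# Byte-wise, a < b (the sign bit of -100.0 makes its last x-byte 0xC0 > 0x40),
# even though numerically 100.0 > -100.0: the order is spatially meaningless.
assert a < b
```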

@jorisvandenbossche (Member)

although I assume this might depend on how the specific Parquet reader implementation deals with an unknown logical type, i.e. whether it errors or automatically falls back to the physical type

To answer this part myself: at least for the Parquet C++ implementation, it seems an error is raised for unknown logical types, and it doesn't fall back to the physical type. So that does complicate the compatibility story.

@wgtmac (Member, Author) commented May 21, 2024

@jorisvandenbossche I think your concern makes sense. It should be considered a bug if parquet-cpp fails due to an unknown logical type, and we need to fix that. I also have concerns about a new ColumnOrder and need to do some testing. Adding a new logical type should not break anything for legacy readers.

@jornfranke commented May 21, 2024

Apache Iceberg is adding geospatial support: https://docs.google.com/document/d/1iVFbrRNEzZl8tDcZC81GFt01QJkLJsI9E2NBOt21IRI. It would be good if Apache Parquet could support a geometry type natively.

No one has really worked on the geo integration into Iceberg for some time: apache/iceberg#2586

@szehon-ho
No one has really worked on the geo integration into Iceberg for some time: apache/iceberg#2586

Yes, there is now a concrete proposal (apache/iceberg#10260), and the current plan is to bring it up in the next community sync.

@cholmes commented May 23, 2024

Thanks for doing this @wgtmac - it's awesome to see this proposal! I helped initiate GeoParquet, and hope we can fully support your effort.

@paleolimbot I'm happy to see the fast evolution of GeoParquet specs. I don't think the addition of geometry type aims to replace or deprecate something from GeoParquet. Instead, GeoParquet can simply ignore the new type as of now, or leverage the built-in bbox if beneficial.

That makes sense, but I think we're also happy to have GeoParquet replaced, as long as it can 'scale up' to meet all the crazy things that hard-core geospatial people need while also being accessible to everyone else. If Parquet had had geospatial types from the start, we wouldn't have started GeoParquet. We spent a lot of time and effort trying to strike the right balance between making it easy to implement for those who don't care about the complexity of geospatial (edges, coordinate reference systems, epochs, winding) while also having the right options for those who do. My hope has been that the decisions we made there will make it easier to add geospatial support to any new format; for example, a 'geo-ORC' could use the same fields and options that we added.

For the additional (informative) attributes of the geometry type: if some of them are stable and it makes sense to store them natively in Parquet column metadata, perhaps we can work together to make that happen? I think the main goal of this addition is to enhance the interoperability of geospatial data across systems while keeping conversion to GeoParquet lightweight.

Sounds great! Happy to have GeoParquet be a place to 'try out' things. But ideally the surface area of 'GeoParquet' would be very minimal or not even exist, and Parquet would just be the ideal format to store geospatial data in. And I think if we can align well between this proposal and GeoParquet, that should be possible.

wgtmac and others added 13 commits October 13, 2024 11:37
* Add the new suggestion according to the meeting with Snowflake

* Refine the description according to the suggestion
Co-authored-by: Dewey Dunnington <dewey@dunnington.ca>
Co-authored-by: Dewey Dunnington <dewey@dunnington.ca>
Co-authored-by: Dewey Dunnington <dewey@dunnington.ca>
Co-authored-by: Dewey Dunnington <dewey@dunnington.ca>
* Update the spec according to the new feedback

* Fix typo
Co-authored-by: emkornfield <emkornfield@gmail.com>
Co-authored-by: emkornfield <emkornfield@gmail.com>
@wgtmac (Member, Author) commented Oct 13, 2024

Update:

  1. Added a section for the GEOMETRY type in LogicalTypes.md, as suggested by @gszadovszky. This makes LogicalTypes.md the single source of truth and greatly simplifies the addition to the thrift file.
  2. Removed GeometryStatistics from ColumnIndex for now, as we don't have any benchmark data to justify it. We can add it in a follow-up if necessary. cc @rdblue @emkornfield
  3. Added an explanation for X/Y/Z/M values in LogicalTypes.md. cc @paleolimbot @emkornfield

@wgtmac (Member, Author) commented Oct 13, 2024

Created two pull requests:

@desruisseaux Thanks for opening two PRs to clarify them! FYI, I have just moved the spec from parquet.thrift to LogicalTypes.md in this PR. I'm not sure if you have time to rework the two PRs to reflect this change. Or would the GeoParquet community be a better place to discuss these topics?

* GeometryEncoding and Edges are required. CRS is optional.
*
* Once CRS is set, it MUST be a key to an entry in the `key_value_metadata`
* field of `FileMetaData`.
A Contributor commented:

Why is it required that the CRS is embedded in file metadata? Isn't it clear if the CRS is a well-known one like OGC:CRS84? It seems to me that this resolution should be out of scope. Parquet can encourage that the CRS is documented in file metadata, but other systems could store the definition in a different location. For example, Iceberg could store this in a table property instead of in each data file.

I would prefer to define this string property as a "Coordinate reference system identifier" and not specify how to exchange the PROJJSON or other format definition. I would also add a note that people are encouraged to store it in a location along with the file or table metadata.

@paleolimbot (Member) commented Oct 15, 2024

A string property of "Coordinate reference system identifier" (with a convention, either within this spec or outside it, of where in the file to look for the full definition) would allow for enough detail for GeoSpatial libraries to leverage Parquet.

The need for embedding a full CRS description somewhere that is programmatically accessible by a Parquet implementation is to ensure a producer's intent can be faithfully transported to the consumer. In the C++ implementation we can attach this as extension type metadata that can pass through a pipeline to a consumer that does not have access to the original context (e.g., constructing a GeoPandas GeoDataFrame from a Parquet file that was read and filtered using a non-spatial tool like pyarrow). If that needs to be an external convention (e.g., one that we define in GeoParquet) to get consensus here, that is OK (even though I think it would result in less misinterpreted data to have that convention be in the Parquet specification itself).

Alternatively, would removing any conventions or requirements around the string crs be acceptable? (I.e., the producer puts whatever it needs there to ensure that the coordinates in this column are not misinterpreted by the consumer, which may be an identifier or a full CRS definition, according to the requirements of the producer.)

A Member commented:

@rdblue Although the GeoParquet community would really appreciate the possibility of embedding a full CRS description somewhere in Parquet, we understand that compromises need to be made sometimes.

Like @paleolimbot said, would it be acceptable if we removed any conventions or requirements around the string crs and only allowed this single value in the column metadata?

This means the writer can put whatever they want there, but they will need to communicate it to the reader via another channel.

@wgtmac (Member, Author) commented Oct 16, 2024

The need for embedding a full CRS description somewhere that is programmatically accessible by a Parquet implementation is to ensure a producer's intent can be faithfully transported to the consumer.

To achieve this, is it possible to reserve some crs values, or at least some prefixes? For example, Iceberg may store iceberg.xxx in crs, where xxx is an arbitrary CRS identifier defined in its table metadata. Similarly, GeoParquet may set crs to geoparquet.xxx, where the key must exist in the Parquet file metadata and its associated value is the full CRS.

This still causes fragmentation, but it looks better than strong enforcement. WDYT? @rdblue @jiayuasu @paleolimbot @szehon-ho

A Member commented:

Maybe iceberg.geo.xxx in table properties and parquet.geo.xxx in Parquet file metadata? Not sure if this is allowed though.

A Member commented:

Why is it required that the CRS is embedded in file metadata?

Note that it is not actually required to embed the CRS in the file metadata. This is an optional field, so as a producer of Parquet files with geospatial data you are not required to fill it. For example, Iceberg, assuming it is already tracking the CRS elsewhere in Iceberg-specific metadata or manifest files, could just leave this field blank in the Parquet files themselves.

Of course, that makes those files less interoperable (though my understanding is that Parquet files contained in an Iceberg table are generally not meant to be read by non-Iceberg-aware tools?).
But putting something like iceberg.xxx as the crs value would also not be great for interoperability.
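For concreteness, here is a sketch of the key-referencing convention under discussion; the key name and trimmed PROJJSON value are hypothetical, and pyarrow's schema metadata is just one way to populate the FileMetaData key_value_metadata:

```python
import pyarrow as pa
import pyarrow.parquet as pq

crs_key = "geoparquet.crs.ogc_crs84"  # hypothetical key the crs field would hold
projjson = '{"type": "GeographicCRS", "name": "WGS 84 (CRS84)"}'  # trimmed

# Writer: stash the full CRS under that key in the file key_value_metadata.
schema = pa.schema([pa.field("geom", pa.binary())]).with_metadata({crs_key: projjson})
wkb_point = b"\x01\x01\x00\x00\x00" + b"\x00" * 16  # POINT(0 0), little-endian WKB
table = pa.table({"geom": [wkb_point]}, schema=schema)
pq.write_table(table, "example.parquet")

# Reader: resolve the column's crs value (the key) against the file metadata.
kv = pq.read_metadata("example.parquet").metadata
assert kv[crs_key.encode()] == projjson.encode()
```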

@@ -286,6 +313,9 @@ struct Statistics {
7: optional bool is_max_value_exact;
/** If true, min_value is the actual minimum value for a column */
8: optional bool is_min_value_exact;

/** statistics specific to geometry logical type */
9: optional GeometryStatistics geometry_stats;
@rdblue (Contributor) commented Oct 14, 2024

I don't think this is the right place to include GeometryStatistics. There are a couple of reasons:

  1. This is unnecessary nesting to get to the geo stats, making them harder to find
  2. Nesting within Statistics includes geo stats in places where simple Statistics make sense, but geo stats do not. For example, this would be included in page headers in addition to ColumnMetaData (and the page index already removed these)
  3. This doesn't match the approach used for SizeStatistics, which was included directly in ColumnMetaData

I think that this should match the addition of SizeStatistics and should be included as a field in ColumnMetaData.

@wgtmac (Member, Author):

I agree. It actually felt a little awkward in my PoC implementation to nest the geo stats into the common stats. I have moved them to ColumnMetaData now.

wgtmac and others added 2 commits October 17, 2024 09:15
Co-authored-by: Jia Yu <jiayu@wherobots.com>
@jorisvandenbossche (Member)

Related to the last major round of updates (summarized at #240 (comment)), and considering the change for where to put the CRS description:

Offloaded the CRS representation to Parquet file metadata fields such that multiple geometry columns can refer to the same CRS. This also makes sure that the Parquet spec does not rely on another spec for CRS.

This seems like a very strange way to parameterize the CRS to me that doesn't simplify the specification

I agree with @paleolimbot that this indirection does not simplify the specification, even if it does let another discussion point (how should the CRS be described: PROJJSON or something else?) go away.
Could you provide some more rationale for why the spec would choose to put the CRS descriptions in the FileMetaData key_value_metadata and then, per column, refer to a key in that metadata?
I understand that if you have many geometry columns with the same CRS this saves some space, but are there other reasons for this design? (My feeling is that the many-geometry-columns use case is not worth the added complexity.)

Comment on lines +843 to +848
CRS (coordinate reference system) is a mapping of how coordinates refer to
precise locations on earth. A CRS is specified by a key-value entry in the
`key_value_metadata` field of `FileMetaData` whose key is a short name of
the CRS and value is the CRS representation. An additional entry in the
`key_value_metadata` field with the suffix ".type" is required to describe
the encoding of this CRS representation.
A Contributor commented:

If we are adding this to the Parquet spec proper, we should avoid specifying what goes in key_value_metadata. I suggest this go in its own field.

@wgtmac (Member, Author):

This is a trade-off and still under debate. I don't think this is the final version.


#### Geometry Statistics

`GeometryStatistics` is a struct to store geometry statistics of a column chunk
A Contributor commented:

Why can't the existing statistics fit the purpose? What's special about this type?

@wgtmac (Member, Author) commented Nov 5, 2024

Existing statistics require encoding geometry data into a binary representation, which is non-trivial. We might add more kinds of geometry statistics, so it is worth a dedicated type.

Comment on lines +933 to +936
In addition, the following rules are applied:
- A list of multiple values indicates that multiple geometry types are present (e.g. `[0003, 0006]`).
- An empty array explicitly signals that the geometry types are not known.
- The geometry types in the list must be unique (e.g. `[0001, 0001]` is not valid).
A Contributor commented:

What's the purpose of this list if it is so unspecified? Any conforming writer implementation can choose to never populate it.

@wgtmac (Member, Author):

IIUC, it should be empty if we have any invalid geometry data in the column. ST_IsValid deals with this case: https://postgis.net/docs/ST_IsValid.html.
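As a sketch of what a conforming writer could do under those rules (illustrative only; this is a plain function, not the spec field):

```python
import struct

def collect_geometry_types(column):
    """Return sorted unique WKB type codes, or [] when any value is unparseable."""
    codes = set()
    for wkb in column:
        try:
            byte_order = "<" if wkb[0] == 1 else ">"
            (code,) = struct.unpack_from(byte_order + "I", wkb, 1)
        except (IndexError, struct.error):
            return []  # invalid data: explicitly signal "not known" with an empty list
        codes.add(code)
    return sorted(codes)

# Two little-endian point values -> a single unique entry, matching the rules above.
points = [struct.pack("<BIdd", 1, 1, 0.0, 0.0), struct.pack("<BIdd", 1, 1, 1.0, 1.0)]
assert collect_geometry_types(points) == [1]
```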

}

/** Statistics specific to GEOMETRY logical type */
struct GeometryStatistics {
A Contributor commented:

Can't normal statistics encode a point in WKB in the binary min/max fields that would define a bounding box?

@wgtmac (Member, Author):

Yes, we can. I think the proposal in Iceberg is doing this. But we have more geometry statistics to support in the future, and it is much easier to use if the bbox is explicitly defined. BTW, I don't think min/max values are suitable, because geometry values should not be compared directly, and min/max stats require a SortOrder for the logical type.
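To illustrate the difference, a minimal sketch (assuming a column of little-endian WKB points only; names are hypothetical) of computing an explicit bounding box rather than encoding two corner points back into WKB min/max fields:

```python
import struct

def compute_bbox(column):
    """Explicit bbox over little-endian WKB POINT values (x, y start at offset 5)."""
    xs, ys = [], []
    for wkb in column:
        x, y = struct.unpack_from("<dd", wkb, 5)  # skip byte-order flag + type code
        xs.append(x)
        ys.append(y)
    return {"xmin": min(xs), "ymin": min(ys), "xmax": max(xs), "ymax": max(ys)}

points = [struct.pack("<BIdd", 1, 1, x, y) for x, y in [(0.0, 5.0), (10.0, -3.0)]]
assert compute_bbox(points) == {"xmin": 0.0, "ymin": -3.0, "xmax": 10.0, "ymax": 5.0}
```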

* coordinates from each axis.
*/
struct BoundingBox {
/** Min X value when edges = PLANAR, westmost value if edges = SPHERICAL */
A Contributor commented:

nit: Instead of when/if can we stick to one? Either when/when or if/if would be best.

Comment on lines +831 to +832
This value applies to all non-point geometry objects and is independent of the
[Coordinate Reference System](#coordinate-reference-system).


(I work with the Salesforce Data Cloud team, and am evaluating geospatial support in Iceberg.)
I am new to the geospatial world, and wondering what it means for edges to be independent of the underlying CRS. Can the edges be planar while the CRS is based on elliptic geometry?


Can the edges be planar while the CRS is based on elliptic geometry?

In principle, no. First, talking about "planar edges" or "spherical edges" makes no sense; it was a confusion of terms in the initial draft of this specification (the group reached an agreement to fix that in recent talks; I hope it will be done before release). An edge can be a straight line, a curve, a geodesic, etc., but cannot be a plane or a sphere (wrong number of dimensions).

What the initial draft intended to say with "planar edges" (sic) is "edges computed as if they were in a planar (two-dimensional Cartesian) coordinate system" (the thing that is planar is the coordinate system, not the edges). This is not really correct for a geographic CRS, so you are right to say that they are not really independent. However, while it would be more exact to say that lines on a geographic CRS are geodesics, loxodromes, etc., it often happens that software ignores that physical reality and just performs linear interpolation of latitude and longitude values. The line on the ellipsoid surface obtained that way has no interesting properties; it is just easy to compute. We do not recommend doing that, but the use of the word "planar" in this context was an acknowledgement that it happens in practice and an attempt to describe it.


Thank you for the response. I do not understand what this parameter is used for in Parquet. If how to treat edges is up to the engine, how does this value help? An engine capable of interpreting edges as geodesics should do so if the CRS reference indicates that the underlying geometry column belongs to an ellipsoidal datum. Does this edge property force the engine to treat the values in a planar coordinate system?

In other words, is there something intrinsic to the data stored in the Parquet file itself where the edges parameter makes a difference?

A Member commented:

@redblackcoder

The Geo and Iceberg communities are discussing the best way to describe this field. It is very likely that we will want to rename the edges property to something else, because this is not what we wanted to describe initially. We will post updates in a few days.

@mentin commented Nov 7, 2024

An engine capable of interpreting edges as geodesics should do so if the CRS reference indicates that the underlying geometry column belongs to an ellipsoidal datum.

Consider the most common case, SRID 4326. It is a geographic coordinate system (GEOGCS) rather than a projected one: https://www.esri.com/arcgis-blog/products/arcgis-pro/mapping/gcs_vs_pcs/

So the linestring from A to B should follow the geodesic line. But most systems treat 4326 as a planar map. E.g., with the Geometry type in PostGIS or MS SQL Server, they treat it as a projected coordinate system, and linestrings follow straight lines on a flat surface. If you use recent MySQL or the Geography type in PostGIS or MS SQL Server, linestrings in 4326 follow geodesic lines on a sphere. So there is ambiguity about what exactly a linestring or polygon in 4326 describes. Is point(30 21) inside polygon((10 10, 50 10, 50 20, 10 20, 10 10))?

With geometry, PostGIS returns false:

select st_intersects(
  st_geomfromtext('polygon((10 10, 50 10, 50 20, 10 20, 10 10))', 4326), 
  st_geomfromtext('point(30 21)', 4326));

The same with geography (4326 is presumed) returns true:

select st_intersects(
  st_geographyfromtext('srid=4326;polygon((10 10, 50 10, 50 20, 10 20, 10 10))'),
  st_geographyfromtext('srid=4326;point(30 21)'));

Unfortunately, there is no accepted way to describe the difference between geometry and geography in the WKB format; you can encounter SRID=4326 with both interpretations. The edges attribute describes the difference between geometry and geography, and tells the user how to interpret the data in a way consistent with the system that produced it.
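The same ambiguity is easy to reproduce outside a database. For example, a planar geometry library such as Shapely (assumed installed; this is illustrative, not part of the spec) gives the flat-surface answer:

```python
from shapely.geometry import Point, Polygon

# Planar interpretation (PostGIS "geometry"): the top edge is the straight
# segment y = 20, so a point at y = 21 falls outside.
poly = Polygon([(10, 10), (50, 10), (50, 20), (10, 20)])
print(poly.contains(Point(30, 21)))  # False

# Under spherical edges (PostGIS "geography"), the edge from (10 20) to
# (50 20) follows a great circle that passes north of latitude 21 near its
# midpoint, so the same point tests as inside; hence the need for the
# edges field to record which interpretation the writer intended.
```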
