Apparently we don't have a test for this in core, but python-mapnik has a failing test exhibiting it (select from empty table, for which mapnik cannot guess the extent).
Here's what's going on:
ERROR: "179769313486231600000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000" is out of range for type double precision
SELECT 1.797693134862316e+308::double precision;
Postgres parses `1.797693134862316e+308` as a `numeric` (arbitrary precision), and when it tries to pass that to a function accepting `double precision`, the number falls on the wrong side of `[-DBL_MAX, DBL_MAX]`.
This started being an issue after #3942. That "increase" in precision was ill-informed: back then I didn't realize PostGIS does bbox intersections in single precision (which makes total sense; storing double-precision bboxes in the index would be a waste).
The obvious and simple fix would be to not put the intersection test in the SQL when the query extent is unbounded.
However, I'll need to emit a bbox even when unbounded for an upcoming feature, so what I suggest is some magic in constructing the SQL:
1. clamp bbox coordinates to the single-precision range `[-FLT_MAX, FLT_MAX]`;
2. convert to single precision;
3. nudge with `nextafter` any values that fell inside the original bbox (i.e. where the conversion made the box smaller);
4. output either with default precision (6) or as an exact hex-float (which Postgres supports by virtue of using `strtof`); either way, force a cast to `::real`.