Deprecate validate_* and normalize_* #10248

Closed
wants to merge 3 commits into from
12 changes: 0 additions & 12 deletions docs/reference/mapping/types/geo-point-type.asciidoc
@@ -143,21 +143,9 @@ longitude (default is `true`). *Note*: Validation only works when
normalization has been disabled. This option will be deprecated and removed
in upcoming releases.

|`validate_lat` |Set to `false` to accept geo points with an invalid
latitude (default is `true`). This option will be deprecated and removed
in upcoming releases.

|`validate_lon` |Set to `false` to accept geo points with an invalid
longitude (default is `true`). This option will be deprecated and removed
in upcoming releases.

|`normalize` |Set to `true` to normalize latitude and longitude (default
is `true`).

|`normalize_lat` |Set to `true` to normalize latitude.

|`normalize_lon` |Set to `true` to normalize longitude.

|`precision_step` |The precision step (influences the number of terms
generated for each number value) for `.lat` and `.lon` fields
if `lat_lon` is set to `true`.
27 changes: 27 additions & 0 deletions docs/reference/query-dsl/filters/geo-bounding-box-filter.asciidoc
@@ -44,6 +44,33 @@ Then the following simple query can be executed with a
}
--------------------------------------------------

[float]
==== Filter Options

[cols="<,<",options="header",]
|=======================================================================
|Option |Description
|`_name` |Optional name field to identify the filter

|`_cache` |Set to `true` to cache the *result* of the filter.
Defaults to `false`. See <<Caching,Caching>> below for further details.

|`_cache_key` |Associate a unique custom cache key to use instead of
the actual filter.

|`validate` |Set to `false` to accept geo points with invalid latitude or
longitude (default is `true`). *Note*: Validation only works when
normalization has been disabled. This option will be deprecated and removed
in upcoming releases.

|`normalize` |Set to `true` to normalize latitude and longitude (default
is `true`).

|`type` |Set to one of `indexed` or `memory` to define whether this filter will
be executed in memory or against the index. See <<Type,Type>> below for further details.
Default is `memory`.
|=======================================================================
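
For orientation, here is a minimal sketch of a request body combining the options
documented above (Python; the field name `pin.location` and the coordinate values
are illustrative placeholders, not taken from this PR):

```python
# Sketch of a geo_bounding_box filter body using the options documented
# above. Field name and coordinates are hypothetical examples.
import json

filter_body = {
    "filtered": {
        "query": {"match_all": {}},
        "filter": {
            "geo_bounding_box": {
                "pin.location": {
                    "top_left": {"lat": 40.73, "lon": -74.1},
                    "bottom_right": {"lat": 40.01, "lon": -71.12},
                },
                "type": "memory",    # or "indexed"
                "normalize": True,   # wrap out-of-range coordinates
                "_cache": False,     # result caching is off by default
            }
        },
    }
}

print(json.dumps(filter_body, indent=2))
```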

[float]
==== Accepted Formats

23 changes: 23 additions & 0 deletions docs/reference/query-dsl/filters/geo-distance-filter.asciidoc
@@ -158,6 +158,29 @@ The following are options allowed on the filter:
sure the `geo_point` type index lat lon in this case), or `none` which
disables bounding box optimization.

`_name`::

Optional name field to identify the filter

`_cache`::

Set to `true` to cache the *result* of the filter.
Defaults to `false`. See <<Caching,Caching>> below for further details.

`_cache_key`::

Associate a unique custom cache key to use instead of the actual filter.

`validate`::

Set to `false` to accept geo points with invalid latitude or
longitude (default is `true`). *Note*: Validation only works when
normalization has been disabled. This option will be deprecated and removed
in upcoming releases.

`normalize`::

Set to `true` to normalize latitude and longitude. Defaults to `true`.

[float]
==== geo_point Type
@@ -24,7 +24,7 @@ Filters documents that exist within a range from a specific point:
}
--------------------------------------------------

Supports the same point location parameter as the
Supports the same point location parameter and filter options as the
<<query-dsl-geo-distance-filter,geo_distance>>
filter. It also supports the common range parameters (`lt`, `lte`, `gt`,
`gte`, `from`, `to`, `include_upper` and `include_lower`).
24 changes: 24 additions & 0 deletions docs/reference/query-dsl/filters/geo-polygon-filter.asciidoc
@@ -26,6 +26,30 @@ points. Here is an example:
}
--------------------------------------------------

[float]
==== Filter Options

[cols="<,<",options="header",]
|=======================================================================
|Option |Description
|`_name` |Optional name field to identify the filter

|`_cache` |Set to `true` to cache the *result* of the filter.
Defaults to `false`. See <<Caching,Caching>> below.

|`_cache_key` |Associate a unique custom cache key to use instead of
the actual filter.

|`validate` |Set to `false` to accept geo points with invalid latitude or
longitude (default is `true`). *Note*: Validation only works when
normalization has been disabled. This option will be deprecated and removed
in upcoming releases.

|`normalize` |Set to `false` to prevent latitude and longitude from being
automatically normalized to proper lat/lon bounds.
Default is `true`.
|=======================================================================

Review comment (Member): "will be deprecated"? Should this already be marked
deprecated? I don't think we need to explicitly say "removed in upcoming
release" because this is implied by deprecation?

Reply (Contributor, Author): In this PR `validate_lat` and `validate_lon` are
deprecated in favor of `validate`. As a separate issue I'm looking at
deprecating `validate` altogether.

[float]
==== Allowed Formats

@@ -105,10 +105,13 @@ public static class Defaults {
public static final boolean ENABLE_GEOHASH = false;
public static final boolean ENABLE_GEOHASH_PREFIX = false;
public static final int GEO_HASH_PRECISION = GeoHashUtils.PRECISION;
public static final boolean BACK_COMPAT = true;
public static final boolean NORMALIZE_LAT = true;
public static final boolean NORMALIZE_LON = true;
public static final boolean VALIDATE_LAT = true;
public static final boolean VALIDATE_LON = true;
public static final boolean NORMALIZE = true;
public static final boolean VALIDATE = true;

public static final FieldType FIELD_TYPE = new FieldType(StringFieldMapper.Defaults.FIELD_TYPE);

@@ -134,10 +137,13 @@ public static class Builder extends AbstractFieldMapper.Builder<Builder, GeoPoin

private int geoHashPrecision = Defaults.GEO_HASH_PRECISION;

boolean backCompat = Defaults.BACK_COMPAT;
boolean validateLat = Defaults.VALIDATE_LAT;
boolean validateLon = Defaults.VALIDATE_LON;
boolean normalizeLat = Defaults.NORMALIZE_LAT;
boolean normalizeLon = Defaults.NORMALIZE_LON;
boolean validate = Defaults.VALIDATE;
boolean normalize = Defaults.NORMALIZE;

public Builder(String name) {
super(name, new FieldType(Defaults.FIELD_TYPE));
@@ -214,15 +220,16 @@ public GeoPointFieldMapper build(BuilderContext context) {

return new GeoPointFieldMapper(buildNames(context), fieldType, docValues, indexAnalyzer, searchAnalyzer,
similarity, fieldDataSettings, context.indexSettings(), origPathType, enableLatLon, enableGeoHash, enableGeohashPrefix, precisionStep,
geoHashPrecision, latMapper, lonMapper, geohashMapper, validateLon, validateLat, normalizeLon, normalizeLat
, multiFieldsBuilder.build(this, context));
geoHashPrecision, latMapper, lonMapper, geohashMapper, backCompat, validateLat, validateLon, validate, normalizeLat,
normalizeLon, normalize, multiFieldsBuilder.build(this, context));
}
}

public static class TypeParser implements Mapper.TypeParser {
@Override
public Mapper.Builder<?, ?> parse(String name, Map<String, Object> node, ParserContext parserContext) throws MapperParsingException {
Builder builder = geoPointField(name);
builder.backCompat = parserContext.indexVersionCreated().before(Version.V_1_6_0);
parseField(builder, name, node, parserContext);
for (Iterator<Map.Entry<String, Object>> iterator = node.entrySet().iterator(); iterator.hasNext();) {
Map.Entry<String, Object> entry = iterator.next();
@@ -254,23 +261,29 @@ public static class TypeParser {
}
iterator.remove();
} else if (fieldName.equals("validate")) {
builder.validateLat = XContentMapValues.nodeBooleanValue(fieldNode);
builder.validateLon = XContentMapValues.nodeBooleanValue(fieldNode);
builder.validate = XContentMapValues.nodeBooleanValue(fieldNode);
if (builder.backCompat) {
builder.validateLat = builder.validate;
builder.validateLon = builder.validate;
}
iterator.remove();
} else if (fieldName.equals("validate_lon")) {
} else if (builder.backCompat && fieldName.equals("validate_lon")) {
builder.validate = false;
builder.validateLon = XContentMapValues.nodeBooleanValue(fieldNode);
iterator.remove();
} else if (fieldName.equals("validate_lat")) {
} else if (builder.backCompat && fieldName.equals("validate_lat")) {
builder.validate = false;
builder.validateLat = XContentMapValues.nodeBooleanValue(fieldNode);
iterator.remove();
} else if (fieldName.equals("normalize")) {
builder.normalizeLat = XContentMapValues.nodeBooleanValue(fieldNode);
builder.normalizeLon = XContentMapValues.nodeBooleanValue(fieldNode);
builder.normalize = XContentMapValues.nodeBooleanValue(fieldNode);
iterator.remove();
} else if (fieldName.equals("normalize_lat")) {
} else if (builder.backCompat && fieldName.equals("normalize_lat")) {
builder.normalize = false;
builder.normalizeLat = XContentMapValues.nodeBooleanValue(fieldNode);
iterator.remove();
} else if (fieldName.equals("normalize_lon")) {
} else if (builder.backCompat && fieldName.equals("normalize_lon")) {
builder.normalize = false;
builder.normalizeLon = XContentMapValues.nodeBooleanValue(fieldNode);
iterator.remove();
} else if (parseMultiField(builder, name, parserContext, fieldName, fieldNode)) {
@@ -424,19 +437,21 @@ public GeoPoint decode(long latBits, long lonBits, GeoPoint out) {

private final StringFieldMapper geohashMapper;

private boolean validateLon;
private boolean backCompat;
private boolean validateLat;

private final boolean normalizeLon;
private final boolean normalizeLat;
private boolean validateLon;
private boolean normalizeLat;
private boolean normalizeLon;
private boolean validate;
private boolean normalize;

public GeoPointFieldMapper(FieldMapper.Names names, FieldType fieldType, Boolean docValues,
NamedAnalyzer indexAnalyzer, NamedAnalyzer searchAnalyzer,
SimilarityProvider similarity, @Nullable Settings fieldDataSettings, Settings indexSettings,
ContentPath.Type pathType, boolean enableLatLon, boolean enableGeoHash, boolean enableGeohashPrefix, Integer precisionStep, int geoHashPrecision,
DoubleFieldMapper latMapper, DoubleFieldMapper lonMapper, StringFieldMapper geohashMapper,
boolean validateLon, boolean validateLat,
boolean normalizeLon, boolean normalizeLat, MultiFields multiFields) {
DoubleFieldMapper latMapper, DoubleFieldMapper lonMapper, StringFieldMapper geohashMapper, boolean backCompat,
boolean validateLat, boolean validateLon, boolean validate, boolean normalizeLat, boolean normalizeLon, boolean normalize,
MultiFields multiFields) {
super(names, 1f, fieldType, docValues, null, indexAnalyzer, similarity, null, fieldDataSettings, indexSettings, multiFields, null);
this.pathType = pathType;
this.enableLatLon = enableLatLon;
@@ -449,11 +464,15 @@ public GeoPointFieldMapper(FieldMapper.Names names, FieldType fieldType, Boolean
this.lonMapper = lonMapper;
this.geohashMapper = geohashMapper;

this.backCompat = backCompat;
this.validateLat = validateLat;
this.validateLon = validateLon;

this.normalizeLat = normalizeLat;
this.normalizeLon = normalizeLon;

this.validate = validate;
this.normalize = normalize;
}

@Override
@@ -587,17 +606,14 @@ private void parsePointFromString(ParseContext context, GeoPoint sparse, String
}

private void parse(ParseContext context, GeoPoint point, String geohash) throws IOException {
if (normalizeLat || normalizeLon) {
GeoUtils.normalizePoint(point, normalizeLat, normalizeLon);
}

if (validateLat) {
if (point.lat() > 90.0 || point.lat() < -90.0) {
// if the coordinate is normalized there is no need for a validation check
if (backCompat || normalize) {
GeoUtils.normalizePoint(point, (backCompat) ? normalizeLat : normalize, (backCompat) ? normalizeLon : normalize);
} else if (backCompat || validate) {
if ((validate || (backCompat && validateLat)) && (point.lat() > 90.0 || point.lat() < -90.0)) {
throw new ElasticsearchIllegalArgumentException("illegal latitude value [" + point.lat() + "] for " + name());
}
}
if (validateLon) {
if (point.lon() > 180.0 || point.lon() < -180) {
if ((validate || (backCompat && validateLon)) && (point.lon() > 180.0 || point.lon() < -180)) {
throw new ElasticsearchIllegalArgumentException("illegal longitude value [" + point.lon() + "] for " + name());
}
}
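
As a rough illustration of the precedence implemented above — normalization runs
first, and validation only applies when normalization is disabled, with the
per-axis legacy flags honored only on back-compat indices — here is a hedged
Python sketch (not the actual Java implementation; the latitude handling is
simplified relative to `GeoUtils.normalizePoint`):

```python
def check_point(lat, lon, back_compat=False, normalize=True, validate=True,
                normalize_lat=True, normalize_lon=True,
                validate_lat=True, validate_lon=True):
    """Sketch of the parse() precedence: normalize wins; validate only
    applies when normalization is disabled. Latitude normalization is
    simplified to a clamp here (the real code wraps across the poles)."""
    if back_compat or normalize:
        do_lat = normalize_lat if back_compat else normalize
        do_lon = normalize_lon if back_compat else normalize
        if do_lon:
            lon = ((lon + 180.0) % 360.0) - 180.0   # wrap into [-180, 180)
        if do_lat:
            lat = max(-90.0, min(90.0, lat))        # simplified clamp
    elif back_compat or validate:
        if (validate or (back_compat and validate_lat)) and not -90.0 <= lat <= 90.0:
            raise ValueError("illegal latitude value [%s]" % lat)
        if (validate or (back_compat and validate_lon)) and not -180.0 <= lon <= 180.0:
            raise ValueError("illegal longitude value [%s]" % lon)
    return lat, lon
```

Note that with both `normalize` and `validate` disabled, out-of-range points
pass through untouched, matching the new single-flag semantics.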
@@ -662,20 +678,14 @@ public void merge(Mapper mergeWith, MergeContext mergeContext) throws MergeMappi
if (this.enableGeohashPrefix != fieldMergeWith.enableGeohashPrefix) {
mergeContext.addConflict("mapper [" + names.fullName() + "] has different geohash_prefix");
}
if (this.normalizeLat != fieldMergeWith.normalizeLat) {
mergeContext.addConflict("mapper [" + names.fullName() + "] has different normalize_lat");
}
if (this.normalizeLon != fieldMergeWith.normalizeLon) {
mergeContext.addConflict("mapper [" + names.fullName() + "] has different normalize_lon");
if (this.normalize != fieldMergeWith.normalize) {
mergeContext.addConflict("mapper [" + names.fullName() + "] has different normalize");
}
if (!Objects.equal(this.precisionStep, fieldMergeWith.precisionStep)) {
mergeContext.addConflict("mapper [" + names.fullName() + "] has different precision_step");
}
if (this.validateLat != fieldMergeWith.validateLat) {
mergeContext.addConflict("mapper [" + names.fullName() + "] has different validate_lat");
}
if (this.validateLon != fieldMergeWith.validateLon) {
mergeContext.addConflict("mapper [" + names.fullName() + "] has different validate_lon");
if (this.validate != fieldMergeWith.validate) {
mergeContext.addConflict("mapper [" + names.fullName() + "] has different validate");
}
}

@@ -716,33 +726,11 @@ protected void doXContentBody(XContentBuilder builder, boolean includeDefaults,
if (includeDefaults || precisionStep != null) {
builder.field("precision_step", precisionStep);
}
if (includeDefaults || validateLat != Defaults.VALIDATE_LAT || validateLon != Defaults.VALIDATE_LON) {
if (validateLat && validateLon) {
builder.field("validate", true);
} else if (!validateLat && !validateLon) {
builder.field("validate", false);
} else {
if (includeDefaults || validateLat != Defaults.VALIDATE_LAT) {
builder.field("validate_lat", validateLat);
}
if (includeDefaults || validateLon != Defaults.VALIDATE_LON) {
builder.field("validate_lon", validateLon);
}
}
if (includeDefaults || validate != Defaults.VALIDATE) {
builder.field("validate", validate);
}
if (includeDefaults || normalizeLat != Defaults.NORMALIZE_LAT || normalizeLon != Defaults.NORMALIZE_LON) {
if (normalizeLat && normalizeLon) {
builder.field("normalize", true);
} else if (!normalizeLat && !normalizeLon) {
builder.field("normalize", false);
} else {
if (includeDefaults || normalizeLat != Defaults.NORMALIZE_LAT) {
builder.field("normalize_lat", normalizeLat);
}
if (includeDefaults || normalizeLon != Defaults.NORMALIZE_LON) {
builder.field("normalize_lon", normalizeLat);
}
}
if (includeDefaults || normalize != Defaults.NORMALIZE) {
builder.field("normalize", normalize);
}
}
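
The serialization change above collapses the old per-axis emission into a single
flag written only when it differs from the default. A small sketch of that logic
(hypothetical helper, not the actual `doXContentBody`):

```python
def to_mapping_fields(validate, normalize, include_defaults=False,
                      default_validate=True, default_normalize=True):
    """Sketch: emit the consolidated flags only when they differ from
    the defaults, or when defaults are explicitly requested."""
    fields = {}
    if include_defaults or validate != default_validate:
        fields["validate"] = validate
    if include_defaults or normalize != default_normalize:
        fields["normalize"] = normalize
    return fields
```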

@@ -779,5 +767,4 @@ public BytesRef binaryValue() {
return new BytesRef(bytes);
}
}

}
@@ -84,6 +84,7 @@ public Filter parse(QueryParseContext parseContext) throws IOException, QueryPar
String filterName = null;
String currentFieldName = null;
XContentParser.Token token;
boolean validate = true;
boolean normalize = true;

GeoPoint sparse = new GeoPoint();
@@ -142,6 +143,8 @@ public Filter parse(QueryParseContext parseContext) throws IOException, QueryPar
cache = parseContext.parseFilterCachePolicy();
} else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
cacheKey = new HashedBytesRef(parser.text());
} else if ("validate".equals(currentFieldName)) {
validate = parser.booleanValue();
} else if ("normalize".equals(currentFieldName)) {
normalize = parser.booleanValue();
} else if ("type".equals(currentFieldName)) {
@@ -156,15 +159,30 @@ public Filter parse(QueryParseContext parseContext) throws IOException, QueryPar
final GeoPoint bottomRight = new GeoPoint(bottom, right);

if (normalize) {
// Special case: if the difference bettween the left and right is 360 and the right is greater than the left, we are asking for
// the complete longitude range so need to set longitude to the complete longditude range
// Special case: if the difference between the left and right is 360 and the right is greater than the left, we are asking for
// the complete longitude range so need to set longitude to the complete longitude range
boolean completeLonRange = ((right - left) % 360 == 0 && right > left);
GeoUtils.normalizePoint(topLeft, true, !completeLonRange);
GeoUtils.normalizePoint(bottomRight, true, !completeLonRange);
if (completeLonRange) {
topLeft.resetLon(-180);
bottomRight.resetLon(180);
}
} else if (validate) {
if (topLeft.lat() > 90.0 || topLeft.lat() < -90.0) {
throw new QueryParsingException(parseContext.index(), "illegal latitude value [" + topLeft.lat() + "] for " + filterName);
}
if (topLeft.lon() > 180.0 || topLeft.lon() < -180) {
throw new QueryParsingException(parseContext.index(), "illegal longitude value [" + topLeft.lon() + "] for " + filterName);
}
if (bottomRight.lat() > 90.0 || bottomRight.lat() < -90.0) {
throw new QueryParsingException(parseContext.index(), "illegal latitude value [" + bottomRight.lat() + "] for " +
filterName);
}
if (bottomRight.lon() > 180.0 || bottomRight.lon() < -180) {
throw new QueryParsingException(parseContext.index(), "illegal longitude value [" + bottomRight.lon() + "] for " +
filterName);
}
}
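
The special case handled above — a box whose right edge sits exactly 360° east
of its left edge denotes the full longitude range — can be illustrated with a
Python sketch (simplified; the real code delegates per-point wrapping to
`GeoUtils.normalizePoint`, and latitude is clamped here rather than wrapped):

```python
def normalize_bbox(top_left, bottom_right):
    """Sketch of the bounding-box normalization special case: if the box
    spans exactly 360 degrees of longitude, pin the edges to [-180, 180]
    instead of wrapping them (wrapping would collapse the box)."""
    left, right = top_left["lon"], bottom_right["lon"]
    complete_lon_range = (right - left) % 360 == 0 and right > left
    for corner in (top_left, bottom_right):
        if not complete_lon_range:
            corner["lon"] = ((corner["lon"] + 180.0) % 360.0) - 180.0
        corner["lat"] = max(-90.0, min(90.0, corner["lat"]))  # simplified
    if complete_lon_range:
        top_left["lon"] = -180.0
        bottom_right["lon"] = 180.0
    return top_left, bottom_right
```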

MapperService.SmartNameFieldMappers smartMappers = parseContext.smartFieldMappers(fieldName);