This repository has been archived by the owner on Aug 2, 2022. It is now read-only.

Merge branch 'develop' into staging-workflows
peterzhuamazon committed Feb 1, 2021
2 parents ab089e4 + 1dbf244 commit 3616e97
Showing 99 changed files with 2,158 additions and 334 deletions.
3 changes: 3 additions & 0 deletions .github/workflows/sql-cli-release-workflow.yml
Original file line number Diff line number Diff line change
@@ -47,6 +47,9 @@ jobs:
tarball=`ls ./dist/*.tar.gz`
wheel=`ls ./dist/*.whl`
renamed_wheel=`echo $wheel | sed 's/_/-/g'`
mv "$wheel" "$renamed_wheel"
# Inject the build number before the suffix
tarball_outfile=`basename ${tarball%.tar.gz}-build-${GITHUB_RUN_NUMBER}.tar.gz`
wheel_outfile=`basename ${wheel%.whl}-build-${GITHUB_RUN_NUMBER}.whl`
20 changes: 16 additions & 4 deletions .github/workflows/sql-odbc-release-workflow.yml
@@ -11,6 +11,8 @@ env:
ODBC_BIN_PATH: "./build/odbc/bin"
ODBC_BUILD_PATH: "./build/odbc/build"
AWS_SDK_INSTALL_PATH: "./build/aws-sdk/install"
PLUGIN_NAME: opendistro-sql-odbc
OD_VERSION: 1.12.0.0

jobs:
build-mac:
@@ -79,7 +81,10 @@ jobs:
if: success()
run: |
cd installer
pkg=`ls -1t *.pkg | grep "macos" | head -1`
pkg=`ls -1t *.pkg | grep "Open Distro for Elasticsearch SQL ODBC Driver" | head -1`
mv "$pkg" "${{ env.PLUGIN_NAME }}-${{ env.OD_VERSION }}-macos-x64.pkg"
pkg=`ls -1t *.pkg | grep "${{ env.PLUGIN_NAME }}-${{ env.OD_VERSION }}-macos-x64.pkg" | head -1`
# Inject the build number before the suffix
pkg_outfile=`basename ${pkg%.pkg}-build-${GITHUB_RUN_NUMBER}.pkg`
@@ -134,7 +139,10 @@ jobs:
shell: bash
run: |
cd ci-output/installer
msi=`ls -1t *.msi | grep "x86" | head -1`
msi=`ls -1t *.msi | grep "Open Distro for Elasticsearch SQL ODBC Driver" | head -1`
mv "$msi" "${{ env.PLUGIN_NAME }}-${{ env.OD_VERSION }}-windows-x86.msi"
msi=`ls -1t *.msi | grep "${{ env.PLUGIN_NAME }}-${{ env.OD_VERSION }}-windows-x86.msi" | head -1`
# Inject the build number before the suffix
msi_outfile=`basename ${msi%.msi}-build-${GITHUB_RUN_NUMBER}.msi`
@@ -189,12 +197,16 @@ jobs:
shell: bash
run: |
cd ci-output/installer
msi=`ls -1t *.msi | grep "x64" | head -1`
msi=`ls -1t *.msi | grep "Open Distro for Elasticsearch SQL ODBC Driver" | head -1`
mv "$msi" "${{ env.PLUGIN_NAME }}-${{ env.OD_VERSION }}-windows-x64.msi"
msi=`ls -1t *.msi | grep "${{ env.PLUGIN_NAME }}-${{ env.OD_VERSION }}-windows-x64.msi" | head -1`
# Inject the build number before the suffix
msi_outfile=`basename ${msi%.msi}-build-${GITHUB_RUN_NUMBER}.msi`
s3_prefix="s3://staging.artifacts.opendistroforelasticsearch.amazon.com/snapshots/elasticsearch-clients/sql-odbc/"
echo "Copying ${msi} to ${s3_prefix}${msi_outfile}"
aws s3 cp --quiet $msi ${s3_prefix}${msi_outfile}
18 changes: 1 addition & 17 deletions README.md
@@ -28,23 +28,7 @@ Please refer to the [SQL Language Reference Manual](./docs/user/index.rst), [Pip

## Experimental

Recently we have been actively improving our query engine, primarily for better correctness and extensibility. The new enhanced query engine already supports the newly released Piped Processing Language behind the scenes, and integration with the SQL language is under way. To try out the power of the new query engine with SQL, simply enable it via the [plugin setting](https://github.com/opendistro-for-elasticsearch/sql/blob/develop/docs/user/admin/settings.rst#opendistro-sql-engine-new-enabled). In a future release it will be enabled by default, with nothing required on your side. Please stay tuned for updates on our progress and its exciting new features.

Here is a list of documentation for features available only in the improved SQL query engine. Please follow the instructions above to enable it before trying the example queries in these docs:

* [Identifiers](./docs/user/general/identifiers.rst): support for identifier names with special characters
* [Data types](./docs/user/general/datatypes.rst): new data types such as date time and interval
* [Expressions](./docs/user/dql/expressions.rst): new expression system that can represent and evaluate complex expressions
* [SQL functions](./docs/user/dql/functions.rst): many more string and date functions added
* [Basic queries](./docs/user/dql/basics.rst)
  * The Ordering by Aggregate Functions section
  * NULLS FIRST/LAST in the Specifying Order for Null section
* [Aggregations](./docs/user/dql/aggregations.rst): aggregation over expressions and other new features
* [Complex queries](./docs/user/dql/complex.rst)
  * Improvements to subqueries in the FROM clause
* [Window functions](./docs/user/dql/window.rst): ranking and aggregate window function support

Normally you won't see any difference in query responses. To check whether and why your query falls back to the old SQL engine, explain your query and look in the Elasticsearch log for "Request is falling back to old SQL engine due to ...".
Recently we have been actively improving our query engine, primarily for better correctness and extensibility. Behind the scenes, the enhanced engine already powers the newly released Piped Processing Language, but it was experimental and disabled by default for SQL query processing. With the most important features and full testing complete, we are now ready to promote it to our default SQL query engine. Please find more details in [An Introduction to the New SQL Query Engine](/docs/dev/NewSQLEngine.md).


## Setup
@@ -259,9 +259,9 @@ private Expression visitIdentifier(String ident, AnalysisContext context) {
return ref;
}

// Array type is not supported yet.
private boolean isTypeNotSupported(ExprType type) {
return "struct".equalsIgnoreCase(type.typeName())
|| "array".equalsIgnoreCase(type.typeName());
return "array".equalsIgnoreCase(type.typeName());
}

}
@@ -51,21 +51,34 @@ public String unqualified(QualifiedName fullName) {
private boolean isQualifierIndexOrAlias(QualifiedName fullName) {
Optional<String> qualifier = fullName.first();
if (qualifier.isPresent()) {
if (isFieldName(qualifier.get())) {
return false;
}
resolveQualifierSymbol(fullName, qualifier.get());
return true;
}
return false;
}

private boolean isFieldName(String qualifier) {
try {
// Resolve the qualifier in Namespace.FIELD_NAME
context.peek().resolve(new Symbol(Namespace.FIELD_NAME, qualifier));
return true;
} catch (SemanticCheckException e2) {
return false;
}
}

private void resolveQualifierSymbol(QualifiedName fullName, String qualifier) {
try {
context.peek().resolve(new Symbol(Namespace.INDEX_NAME, qualifier));
} catch (SemanticCheckException e) {
// Throw syntax check exception intentionally to indicate fallback to the old engine.
// Need to change to a semantic check exception in future.
throw new SyntaxCheckException(String.format(
"The qualifier [%s] of qualified name [%s] must be an index name or its alias",
qualifier, fullName));
"The qualifier [%s] of qualified name [%s] must be a field name, index name or its "
+ "alias", qualifier, fullName));
}
}

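The change above lets a qualifier resolve as a field name first (e.g. a struct column such as `account` in `account.age`), and only if that fails must it resolve as an index name or alias. Below is a minimal, self-contained sketch of that resolution order; the `FIELD_NAMES`/`INDEX_NAMES` sets are hypothetical stand-ins for the engine's `AnalysisContext` namespace lookups, not the real API.

```java
import java.util.Set;

public class QualifierDemo {
    // Hypothetical symbol tables standing in for the engine's
    // Namespace.FIELD_NAME and Namespace.INDEX_NAME resolution.
    static final Set<String> FIELD_NAMES = Set.of("account", "address");
    static final Set<String> INDEX_NAMES = Set.of("accounts", "acc");

    // Mirrors the patched logic: try the qualifier as a field name first;
    // only if that fails must it be an index name or its alias.
    static boolean isQualifierIndexOrAlias(String qualifier, String fullName) {
        if (FIELD_NAMES.contains(qualifier)) {
            return false;  // e.g. account.age, where "account" is a struct field
        }
        if (INDEX_NAMES.contains(qualifier)) {
            return true;   // e.g. accounts.age, where "accounts" is the index
        }
        throw new IllegalArgumentException(String.format(
            "The qualifier [%s] of qualified name [%s] must be a field name, "
                + "index name or its alias", qualifier, fullName));
    }

    public static void main(String[] args) {
        System.out.println(isQualifierIndexOrAlias("account", "account.age"));   // false
        System.out.println(isQualifierIndexOrAlias("accounts", "accounts.age")); // true
    }
}
```

An unknown qualifier still raises the error with the updated message, which is why the string in the diff now mentions field names as well.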
@@ -116,7 +116,7 @@ public UnresolvedPlan values(List<Literal>... values) {
return new Values(Arrays.asList(values));
}

public static UnresolvedExpression qualifiedName(String... parts) {
public static QualifiedName qualifiedName(String... parts) {
return new QualifiedName(Arrays.asList(parts));
}

@@ -178,7 +178,7 @@ public static Literal nullLiteral() {
}

public static Map map(String origin, String target) {
return new Map(new Field(origin), new Field(target));
return new Map(field(origin), field(target));
}

public static Map map(UnresolvedExpression origin, UnresolvedExpression target) {
@@ -281,27 +281,27 @@ public AllFields allFields() {
}

public Field field(UnresolvedExpression field) {
return new Field((QualifiedName) field);
}

public Field field(String field) {
return new Field(field);
}

public Field field(UnresolvedExpression field, Argument... fieldArgs) {
return new Field(field, Arrays.asList(fieldArgs));
return field(field, Arrays.asList(fieldArgs));
}

public Field field(String field) {
return field(qualifiedName(field));
}

public Field field(String field, Argument... fieldArgs) {
return new Field(field, Arrays.asList(fieldArgs));
return field(field, Arrays.asList(fieldArgs));
}

public Field field(UnresolvedExpression field, List<Argument> fieldArgs) {
return new Field(field, fieldArgs);
}

public Field field(String field, List<Argument> fieldArgs) {
return new Field(field, fieldArgs);
return field(qualifiedName(field), fieldArgs);
}

public Alias alias(String name, UnresolvedExpression expr) {
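The DSL refactor above replaces scattered `new Field(...)`/`new QualifiedName(...)` calls with delegation, so every `field(...)` overload funnels into one canonical factory. A toy sketch of the same pattern, using hypothetical `QName` and `FieldNode` records in place of the real `QualifiedName` and `Field` classes:

```java
import java.util.Collections;
import java.util.List;

public class DelegationDemo {
    // Hypothetical stand-ins for QualifiedName and Field.
    record QName(List<String> parts) {}
    record FieldNode(QName field, List<String> fieldArgs) {}

    static QName qualifiedName(String... parts) {
        return new QName(List.of(parts));
    }

    // The single canonical factory; every other overload delegates here,
    // mirroring the diff's field(qualifiedName(field), fieldArgs) calls.
    static FieldNode field(QName name, List<String> args) {
        return new FieldNode(name, args);
    }

    static FieldNode field(String name) {
        return field(qualifiedName(name), Collections.emptyList());
    }

    public static void main(String[] args) {
        // The String overload produces the same node as the canonical form.
        System.out.println(field("age"));
    }
}
```

Funneling overloads this way means `Field` itself no longer needs to know how to build a `QualifiedName`, which is exactly what the `Field` class change later in this commit removes.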
@@ -26,7 +26,7 @@
@RequiredArgsConstructor
public enum DataType {
TYPE_ERROR(ExprCoreType.UNKNOWN),
NULL(ExprCoreType.UNKNOWN),
NULL(ExprCoreType.UNDEFINED),

INTEGER(ExprCoreType.INTEGER),
LONG(ExprCoreType.LONG),
@@ -27,21 +27,24 @@
@Getter
@ToString
@EqualsAndHashCode(callSuper = false)
@AllArgsConstructor
public class Field extends UnresolvedExpression {
private UnresolvedExpression field;
private List<Argument> fieldArgs = Collections.emptyList();

public Field(QualifiedName field) {
this.field = field;
}
private final UnresolvedExpression field;

public Field(String field) {
this.field = new QualifiedName(field);
private final List<Argument> fieldArgs;

/**
* Constructor of Field.
*/
public Field(UnresolvedExpression field) {
this(field, Collections.emptyList());
}

public Field(String field, List<Argument> fieldArgs) {
this.field = new QualifiedName(field);
/**
* Constructor of Field.
*/
public Field(UnresolvedExpression field, List<Argument> fieldArgs) {
this.field = field;
this.fieldArgs = fieldArgs;
}

@@ -26,22 +26,37 @@
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeFormatterBuilder;
import java.time.format.DateTimeParseException;
import java.time.temporal.ChronoField;
import java.time.temporal.ChronoUnit;
import lombok.RequiredArgsConstructor;

@RequiredArgsConstructor
public class ExprDatetimeValue extends AbstractExprValue {
private static final DateTimeFormatter formatter = DateTimeFormatter
.ofPattern("yyyy-MM-dd HH:mm:ss[.SSSSSS]");
private final LocalDateTime datetime;

private static final DateTimeFormatter FORMATTER_VARIABLE_MICROS;
private static final int MIN_FRACTION_SECONDS = 0;
private static final int MAX_FRACTION_SECONDS = 6;

static {
FORMATTER_VARIABLE_MICROS = new DateTimeFormatterBuilder()
.appendPattern("yyyy-MM-dd HH:mm:ss")
.appendFraction(
ChronoField.MICRO_OF_SECOND,
MIN_FRACTION_SECONDS,
MAX_FRACTION_SECONDS,
true)
.toFormatter();
}

/**
* Constructor with datetime string as input.
*/
public ExprDatetimeValue(String datetime) {
try {
this.datetime = LocalDateTime.parse(datetime, formatter);
this.datetime = LocalDateTime.parse(datetime, FORMATTER_VARIABLE_MICROS);
} catch (DateTimeParseException e) {
throw new SemanticCheckException(String.format("datetime:%s in unsupported format, please "
+ "use yyyy-MM-dd HH:mm:ss[.SSSSSS]", datetime));
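The old pattern `yyyy-MM-dd HH:mm:ss[.SSSSSS]` requires exactly six fractional digits whenever a fraction is present, so an input like `2021-02-01 10:20:30.123` fails to parse. `DateTimeFormatterBuilder.appendFraction` with a 0–6 digit range accepts any precision, which is the point of the change above. A small self-contained demo of a formatter built the same way (the class and constant names here are illustrative):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeFormatterBuilder;
import java.time.temporal.ChronoField;

public class FractionParseDemo {
    // Same shape as the formatter in the diff: an optional "." followed by
    // 0 to 6 fractional digits, interpreted as micro-of-second.
    static final DateTimeFormatter FORMATTER = new DateTimeFormatterBuilder()
        .appendPattern("yyyy-MM-dd HH:mm:ss")
        .appendFraction(ChronoField.MICRO_OF_SECOND, 0, 6, true)
        .toFormatter();

    public static void main(String[] args) {
        // No fraction at all is accepted (minimum width is 0)...
        System.out.println(LocalDateTime.parse("2021-02-01 10:20:30", FORMATTER));
        // ...and so is any precision up to six digits.
        System.out.println(LocalDateTime.parse("2021-02-01 10:20:30.123", FORMATTER));
    }
}
```

The `true` argument makes `appendFraction` expect a decimal point before the digits; because the minimum width is zero, both the point and the fraction may be omitted entirely.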
@@ -17,7 +17,6 @@

import com.amazon.opendistroforelasticsearch.sql.data.type.ExprCoreType;
import com.amazon.opendistroforelasticsearch.sql.data.type.ExprType;
import com.amazon.opendistroforelasticsearch.sql.exception.ExpressionEvaluationException;
import java.util.Objects;

/**
@@ -40,7 +39,7 @@ public Object value() {

@Override
public ExprType type() {
return ExprCoreType.UNKNOWN;
return ExprCoreType.UNDEFINED;
}

@Override
@@ -49,7 +49,7 @@ public Object value() {

@Override
public ExprType type() {
return ExprCoreType.UNKNOWN;
return ExprCoreType.UNDEFINED;
}

@Override
@@ -21,11 +21,11 @@
import com.amazon.opendistroforelasticsearch.sql.data.type.ExprType;
import com.amazon.opendistroforelasticsearch.sql.exception.SemanticCheckException;
import java.time.LocalTime;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeFormatterBuilder;
import java.time.format.DateTimeParseException;
import java.time.temporal.ChronoField;
import java.util.Objects;
import lombok.EqualsAndHashCode;
import lombok.RequiredArgsConstructor;

/**
@@ -35,15 +35,30 @@
public class ExprTimeValue extends AbstractExprValue {
private final LocalTime time;

private static final DateTimeFormatter FORMATTER_VARIABLE_MICROS;
private static final int MIN_FRACTION_SECONDS = 0;
private static final int MAX_FRACTION_SECONDS = 6;

static {
FORMATTER_VARIABLE_MICROS = new DateTimeFormatterBuilder()
.appendPattern("HH:mm:ss")
.appendFraction(
ChronoField.MICRO_OF_SECOND,
MIN_FRACTION_SECONDS,
MAX_FRACTION_SECONDS,
true)
.toFormatter();
}

/**
* Constructor.
*/
public ExprTimeValue(String time) {
try {
this.time = LocalTime.parse(time);
this.time = LocalTime.parse(time, FORMATTER_VARIABLE_MICROS);
} catch (DateTimeParseException e) {
throw new SemanticCheckException(String.format("time:%s in unsupported format, please use "
+ "HH:mm:ss", time));
+ "HH:mm:ss[.SSSSSS]", time));
}
}

