
sql support for dynamic parameters #6974

Merged 45 commits on Feb 19, 2020
0cc2c66  sql support for dynamic parameters (clintropolis, Jan 31, 2019)
3c98981  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Feb 3, 2019)
f668777  fixup (clintropolis, Feb 3, 2019)
bcce2be  javadocs (clintropolis, Feb 4, 2019)
33c1894  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Feb 15, 2019)
568364d  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Feb 27, 2019)
be84ac8  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Mar 11, 2019)
3bedbd9  fixup from merge (clintropolis, Mar 11, 2019)
17d32aa  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Mar 19, 2019)
a1038bc  formatting (clintropolis, Mar 19, 2019)
37668fc  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Apr 5, 2019)
62ac051  fixes (clintropolis, Apr 5, 2019)
ed23a8c  fix it (clintropolis, Apr 5, 2019)
c6306ef  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Apr 22, 2019)
88f6084  doc fix (clintropolis, Apr 22, 2019)
2c00301  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, May 10, 2019)
57b1342  remove druid fallback self-join parameterized test (clintropolis, May 10, 2019)
c7968d8  unused imports (clintropolis, May 10, 2019)
27cebc6  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, May 31, 2019)
01fcf4e  ignore test for now (clintropolis, May 31, 2019)
a7075be  fix imports (clintropolis, May 31, 2019)
3c50d9b  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Jun 24, 2019)
5ea94c4  fixup (clintropolis, Jun 25, 2019)
7b554c0  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Jul 9, 2019)
ed4a40f  fix merge (clintropolis, Jul 9, 2019)
e018a55  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Jul 12, 2019)
6b4175f  merge fixup (clintropolis, Jul 12, 2019)
53b155f  fix test that cannot vectorize (clintropolis, Jul 13, 2019)
059320f  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Jan 23, 2020)
9ff9667  fixup and more better (clintropolis, Jan 23, 2020)
97ea9e5  dependency thingo (clintropolis, Jan 24, 2020)
ee5b46d  fix docs (clintropolis, Jan 24, 2020)
01cbeae  tweaks (clintropolis, Jan 24, 2020)
6e670d2  fix docs (clintropolis, Jan 24, 2020)
c773e21  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Jan 29, 2020)
aacae42  spelling (clintropolis, Jan 29, 2020)
d069f83  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Feb 5, 2020)
682833c  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Feb 5, 2020)
8987d1b  unused imports after merge (clintropolis, Feb 6, 2020)
2819431  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Feb 16, 2020)
e1dceef  Merge remote-tracking branch 'upstream/master' into sql-parameters (clintropolis, Feb 17, 2020)
8bdc691  review stuffs (clintropolis, Feb 18, 2020)
d967e12  add comment (clintropolis, Feb 18, 2020)
8f669ee  add ignore text (clintropolis, Feb 18, 2020)
6279270  review stuffs (clintropolis, Feb 19, 2020)
4 changes: 2 additions & 2 deletions docs/querying/sql.md
@@ -641,7 +641,7 @@ Properties connectionProperties = new Properties();
try (Connection connection = DriverManager.getConnection(url, connectionProperties)) {
try (
final Statement statement = connection.createStatement();
final ResultSet resultSet = statement.executeQuery(query);
final ResultSet resultSet = statement.executeQuery(query)
) {
while (resultSet.next()) {
// Do something
@@ -671,7 +671,7 @@ Parameterized queries are supported with JDBC:
PreparedStatement statement = connection.prepareStatement("SELECT COUNT(*) AS cnt FROM druid.foo WHERE dim1 = ? OR dim1 = ?");
statement.setString(1, "abc");
statement.setString(2, "def");
final ResultSet resultSet = statement.executeQuery(query);
final ResultSet resultSet = statement.executeQuery();
```

### Connection context
@@ -56,6 +56,7 @@
import org.apache.druid.sql.calcite.util.CalciteTests;
import org.apache.druid.sql.calcite.util.QueryLogHook;
import org.apache.druid.sql.http.SqlParameter;
import org.junit.Ignore;
import org.junit.Rule;
import org.junit.Test;

@@ -275,6 +276,7 @@ public void testBloomFilters() throws Exception
);
}

@Ignore
Contributor: Can you add an explanation of why it's ignored (the annotation takes a value, which can be used for the comment).

@Test
Contributor: Per #6974 (comment), it'd be good to ignore this slow test

Member Author: Marked both as ignored
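For reference, the reviewer is pointing at JUnit 4's `@Ignore` value element, which lets the skip reason travel with the annotation so runners can report it. A sketch only; the reason string below is a placeholder, not the one the PR actually added:

```java
// JUnit 4's @Ignore accepts an optional reason string, which test
// runners display when they skip the test
@Ignore("test is too slow to run regularly")
@Test
public void testBloomFilterBigNoParam() throws Exception
{
  // ...
}
```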

public void testBloomFilterBigNoParam() throws Exception
{
@@ -302,6 +304,7 @@ public void testBloomFilterBigNoParam() throws Exception
);
}

@Ignore
Contributor: Same comment about adding explanation

@Test
public void testBloomFilterBigParameter() throws Exception
{
@@ -330,6 +333,34 @@ public void testBloomFilterBigParameter() throws Exception
);
}

@Test
public void testBloomFilterNullParameter() throws Exception
{
BloomKFilter filter = new BloomKFilter(1500);
filter.addBytes(null, 0, 0);
byte[] bytes = BloomFilterSerializersModule.bloomKFilterToBytes(filter);
String base64 = StringUtils.encodeBase64String(bytes);

// bloom filter expression is evaluated and optimized out at planning time since parameter is null and null matches
// the supplied filter of the other parameter
testQuery(
"SELECT COUNT(*) FROM druid.foo WHERE bloom_filter_test(?, ?)",
ImmutableList.of(
Druids.newTimeseriesQueryBuilder()
.dataSource(CalciteTests.DATASOURCE1)
.intervals(querySegmentSpec(Filtration.eternity()))
.granularity(Granularities.ALL)
.aggregators(aggregators(new CountAggregatorFactory("a0")))
.context(TIMESERIES_CONTEXT_DEFAULT)
.build()
),
ImmutableList.of(
new Object[]{6L}
),
// there are no empty strings in the druid expression language since empty is coerced into a null when parsed
ImmutableList.of(new SqlParameter(SqlType.VARCHAR, NullHandling.defaultStringValue()), new SqlParameter(SqlType.VARCHAR, base64))
);
}

@Override
public List<Object[]> getResults(
5 changes: 5 additions & 0 deletions sql/pom.xml
@@ -221,6 +221,11 @@
<artifactId>hamcrest-core</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>nl.jqno.equalsverifier</groupId>
<artifactId>equalsverifier</artifactId>
<scope>test</scope>
</dependency>
</dependencies>

<build>
@@ -170,7 +170,7 @@ public DruidStatement prepare(
PrepareResult prepareResult = sqlLifecycle.prepare(authenticationResult);
this.maxRowCount = maxRowCount;
this.query = query;
ArrayList<AvaticaParameter> params = new ArrayList<>();
List<AvaticaParameter> params = new ArrayList<>();
final RelDataType parameterRowType = prepareResult.getParameterRowType();
for (RelDataTypeField field : parameterRowType.getFieldList()) {
RelDataType type = field.getType();
@@ -405,6 +405,7 @@ public static int collapseFetch(int innerFetch, int outerFetch, int outerOffset)

public static Class<?> sqlTypeNameJdbcToJavaClass(SqlTypeName typeName)
{
// reference: https://docs.oracle.com/javase/1.5.0/docs/guide/jdbc/getstart/mapping.html
JDBCType jdbcType = JDBCType.valueOf(typeName.getJdbcOrdinal());
switch (jdbcType) {
case CHAR:
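The comment added above links the standard JDBC type-mapping guide: `JDBCType.valueOf(int)` from the JDK resolves a `java.sql.Types` ordinal to its enum constant, which the switch then maps to a Java class. A standalone sketch of that idea follows; the class choices mirror the standard JDBC mapping and are illustrative, not necessarily Druid's exact switch:

```java
import java.sql.JDBCType;
import java.sql.Types;

public class JdbcTypeDemo {
  // Resolve a JDBC ordinal (a java.sql.Types constant) to a Java class,
  // in the style of sqlTypeNameJdbcToJavaClass (simplified sketch).
  static Class<?> jdbcToJavaClass(int jdbcOrdinal) {
    switch (JDBCType.valueOf(jdbcOrdinal)) {
      case CHAR:
      case VARCHAR:
        return String.class;
      case INTEGER:
        return Integer.class;
      case BIGINT:
        return Long.class;
      case DOUBLE:
        return Double.class;
      default:
        // fall back to Object for types not handled here
        return Object.class;
    }
  }

  public static void main(String[] args) {
    System.out.println(jdbcToJavaClass(Types.VARCHAR).getSimpleName()); // prints "String"
    System.out.println(jdbcToJavaClass(Types.BIGINT).getSimpleName());  // prints "Long"
  }
}
```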
@@ -25,7 +25,7 @@
import org.apache.calcite.sql.SqlCall;
import org.apache.calcite.sql.SqlKind;
import org.apache.calcite.sql.SqlOperator;
import org.apache.calcite.sql.fun.OracleSqlOperatorTable;
import org.apache.calcite.sql.fun.SqlLibraryOperators;
import org.apache.calcite.sql.fun.SqlStdOperatorTable;
import org.apache.calcite.sql2rel.SqlRexContext;
import org.apache.calcite.sql2rel.SqlRexConvertlet;
@@ -70,7 +70,7 @@ public class DruidConvertletTable implements SqlRexConvertletTable
.add(SqlStdOperatorTable.UNION_ALL)
.add(SqlStdOperatorTable.NULLIF)
.add(SqlStdOperatorTable.COALESCE)
.add(OracleSqlOperatorTable.NVL)
.add(SqlLibraryOperators.NVL)
.build();

private final Map<SqlOperator, SqlRexConvertlet> table;
@@ -198,6 +198,9 @@ private RexNode bind(RexNode node, RexBuilder builder, RelDataTypeFactory typeFa
// if we have a value for dynamic parameter, replace with a literal, else add to list of unbound parameters
if (plannerContext.getParameters().size() > dynamicParam.getIndex()) {
TypedValue param = plannerContext.getParameters().get(dynamicParam.getIndex());
if (param.value == null) {
return builder.makeNullLiteral(typeFactory.createSqlType(SqlTypeName.NULL));
}
SqlTypeName typeName = SqlTypeName.getNameForJdbcType(param.type.typeId);
return builder.makeLiteral(
param.value,
@@ -51,6 +51,9 @@ public SqlNode visit(SqlDynamicParam param)
try {
if (plannerContext.getParameters().size() > param.getIndex()) {
TypedValue paramBinding = plannerContext.getParameters().get(param.getIndex());
if (paramBinding.value == null) {
return SqlLiteral.createNull(param.getParserPosition());
}
SqlTypeName typeName = SqlTypeName.getNameForJdbcType(paramBinding.type.typeId);
if (SqlTypeName.APPROX_TYPES.contains(typeName)) {
return SqlLiteral.createApproxNumeric(paramBinding.value.toString(), param.getParserPosition());
@@ -62,6 +65,7 @@
param.getParserPosition()
);
}

return typeName.createLiteral(paramBinding.value, param.getParserPosition());
}
}
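The two hunks above implement the same rule in two places: a bound parameter whose value is null becomes a SQL NULL literal rather than a typed literal. The PR does this by rewriting the Calcite AST (`SqlDynamicParam` to `SqlLiteral`); the string-based sketch below only illustrates the effect, is not how Druid does it, and naively ignores `?` characters inside quoted strings:

```java
import java.util.Arrays;
import java.util.List;

public class ParamBinder {
  // Substitute each '?' with a literal: null values become the NULL
  // keyword, strings are quoted (with ' doubled), other values use toString().
  static String bind(String sql, List<Object> params) {
    StringBuilder out = new StringBuilder();
    int p = 0;
    for (char c : sql.toCharArray()) {
      if (c == '?' && p < params.size()) {
        Object v = params.get(p++);
        if (v == null) {
          out.append("NULL");
        } else if (v instanceof String) {
          out.append('\'').append(((String) v).replace("'", "''")).append('\'');
        } else {
          out.append(v);
        }
      } else {
        out.append(c);
      }
    }
    return out.toString();
  }

  public static void main(String[] args) {
    System.out.println(bind(
        "SELECT COUNT(*) FROM foo WHERE dim1 = ? OR dim1 = ?",
        Arrays.asList("abc", null)));
    // prints: SELECT COUNT(*) FROM foo WHERE dim1 = 'abc' OR dim1 = NULL
  }
}
```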
6 changes: 3 additions & 3 deletions sql/src/main/java/org/apache/druid/sql/http/SqlParameter.java
@@ -30,6 +30,7 @@
import org.apache.calcite.util.TimestampString;
import org.apache.druid.java.util.common.DateTimes;

import javax.annotation.Nullable;
import java.sql.Date;
import java.util.Objects;

@@ -41,11 +42,11 @@ public class SqlParameter
@JsonCreator
public SqlParameter(
@JsonProperty("type") SqlType type,
@JsonProperty("value") Object value
@JsonProperty("value") @Nullable Object value
)
{
this.type = Preconditions.checkNotNull(type);
this.value = Preconditions.checkNotNull(value);
this.value = value;
}

@JsonProperty
@@ -63,7 +64,6 @@ public SqlType getType()
@JsonIgnore
public TypedValue getTypedValue()
{

Object adjustedValue = value;

// perhaps there is a better way to do this?
@@ -613,4 +613,82 @@ public void testWrongTypeParameter() throws Exception
ImmutableList.of(new SqlParameter(SqlType.BIGINT, 3L), new SqlParameter(SqlType.VARCHAR, "wat"))
);
}

@Test
public void testNullParameter() throws Exception
{
// contrived example of using null as an sql parameter to at least test the codepath because lots of things dont
// actually work as null and things like 'IS NULL' fail to parse in calcite if expressed as 'IS ?'
cannotVectorize();

// this will optimize out the 3rd argument because 2nd argument will be constant and not null
testQuery(
"SELECT COALESCE(dim2, ?, ?), COUNT(*) FROM druid.foo GROUP BY 1\n",
ImmutableList.of(
GroupByQuery.builder()
.setDataSource(CalciteTests.DATASOURCE1)
.setInterval(querySegmentSpec(Filtration.eternity()))
.setGranularity(Granularities.ALL)
.setVirtualColumns(
expressionVirtualColumn(
"v0",
"case_searched(notnull(\"dim2\"),\"dim2\",'parameter')",
ValueType.STRING
)
)
.setDimensions(dimensions(new DefaultDimensionSpec("v0", "v0", ValueType.STRING)))
.setAggregatorSpecs(aggregators(new CountAggregatorFactory("a0")))
.setContext(QUERY_CONTEXT_DEFAULT)
.build()
),
NullHandling.replaceWithDefault() ?
ImmutableList.of(
new Object[]{"a", 2L},
new Object[]{"abc", 1L},
new Object[]{"parameter", 3L}
) :
ImmutableList.of(
new Object[]{"", 1L},
new Object[]{"a", 2L},
new Object[]{"abc", 1L},
new Object[]{"parameter", 2L}
),
ImmutableList.of(new SqlParameter(SqlType.VARCHAR, "parameter"), new SqlParameter(SqlType.VARCHAR, null))
);

// when converting to rel expression, this will optimize out 2nd argument to coalesce which is null
testQuery(
"SELECT COALESCE(dim2, ?, ?), COUNT(*) FROM druid.foo GROUP BY 1\n",
ImmutableList.of(
GroupByQuery.builder()
.setDataSource(CalciteTests.DATASOURCE1)
.setInterval(querySegmentSpec(Filtration.eternity()))
.setGranularity(Granularities.ALL)
.setVirtualColumns(
expressionVirtualColumn(
"v0",
"case_searched(notnull(\"dim2\"),\"dim2\",'parameter')",
ValueType.STRING
)
)
.setDimensions(dimensions(new DefaultDimensionSpec("v0", "v0", ValueType.STRING)))
.setAggregatorSpecs(aggregators(new CountAggregatorFactory("a0")))
.setContext(QUERY_CONTEXT_DEFAULT)
.build()
),
NullHandling.replaceWithDefault() ?
ImmutableList.of(
new Object[]{"a", 2L},
new Object[]{"abc", 1L},
new Object[]{"parameter", 3L}
) :
ImmutableList.of(
new Object[]{"", 1L},
new Object[]{"a", 2L},
new Object[]{"abc", 1L},
new Object[]{"parameter", 2L}
),
ImmutableList.of(new SqlParameter(SqlType.VARCHAR, null), new SqlParameter(SqlType.VARCHAR, "parameter"))
);
}
}
@@ -22,6 +22,7 @@
import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableMap;
import nl.jqno.equalsverifier.EqualsVerifier;
import org.apache.calcite.avatica.SqlType;
import org.apache.druid.segment.TestHelper;
import org.apache.druid.sql.calcite.util.CalciteTestBase;
@@ -46,4 +47,11 @@ public void testSerde() throws Exception
);
Assert.assertEquals(query, jsonMapper.readValue(jsonMapper.writeValueAsString(query), SqlQuery.class));
}

@Test
public void testEquals()
{
EqualsVerifier.forClass(SqlQuery.class).withNonnullFields("query").usingGetClass().verify();
EqualsVerifier.forClass(SqlParameter.class).withNonnullFields("type").usingGetClass().verify();
}
}
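EqualsVerifier, added as a test dependency in this PR, machine-checks the `equals`/`hashCode` contract (reflexivity, symmetry, null handling, hash consistency) that a hand-written value class must uphold. The hypothetical stand-in below shows the kind of class and contract being verified; it is not the real `SqlParameter` (whose type field is a `SqlType`, not a `String`):

```java
import java.util.Objects;

public class SqlParameterLike {
  // A type tag plus a nullable value, echoing the PR's relaxed null check
  final String type;
  final Object value;

  SqlParameterLike(String type, Object value) {
    this.type = Objects.requireNonNull(type);
    this.value = value; // nullable
  }

  @Override
  public boolean equals(Object o) {
    if (this == o) return true;
    // usingGetClass() in the test above corresponds to this exact-class check
    if (o == null || getClass() != o.getClass()) return false;
    SqlParameterLike that = (SqlParameterLike) o;
    return type.equals(that.type) && Objects.equals(value, that.value);
  }

  @Override
  public int hashCode() {
    return Objects.hash(type, value);
  }

  public static void main(String[] args) {
    SqlParameterLike a = new SqlParameterLike("VARCHAR", "abc");
    SqlParameterLike b = new SqlParameterLike("VARCHAR", "abc");
    // equal objects must have equal hash codes
    System.out.println(a.equals(b) && a.hashCode() == b.hashCode()); // prints "true"
  }
}
```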
@@ -23,6 +23,7 @@
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableMap;
import com.google.common.collect.ImmutableSet;
import com.google.common.collect.Sets;
import com.google.common.util.concurrent.ListenableFuture;
import com.google.inject.Guice;
import com.google.inject.Injector;
@@ -1063,7 +1064,7 @@ private static class FakeDruidNodeDiscovery implements DruidNodeDiscovery

FakeDruidNodeDiscovery(Map<NodeRole, DruidNode> nodes)
{
this.nodes = new HashSet<>(nodes.size());
this.nodes = Sets.newHashSetWithExpectedSize(nodes.size());
nodes.forEach((k, v) -> {
addNode(v, k);
});