feat(spanner): add support for Proto Columns in Connection API (#3123)
* feat: Support for Proto Messages & Enums (#2155)

* feat: Importing Proto Changes

This commit will be reverted once the PROTO changes are publicly available.

* feat: Proto Message Implementation

* feat: Adding support for enum

* feat: Code refactoring

Adding default implementations for the newly added methods.
ByteArray compatibility changes for Proto Messages.

* docs: Adding Java docs for all the newly added methods.

* test: Sample Proto & Generated classes for unit test

* feat: Adding bytes/proto & int64/enum compatibility

Adding an additional check for ChecksumResultSet

* test: Adding unit tests

* test: Adding unit tests for ValueBinder.java

* feat: refactoring to add support for getValue & other minor changes

* feat: Minor refactoring

1. Adding docs and formatting the code.
2. Adding additional methods for enum and message which accepts descriptors.

* feat: Adding bytes/message & int64/enum compatibility in Value

* refactor: Minor refactoring

* feat: Adding Proto Array Implementation

* test: Implementing unit tests for array of protos and enums

* refactor: adding clirr ignores

* feat: Adding support for enum as Primary Key

* feat: Code Review Changes, minor refactoring and adding docs

* feat: Addressing review comments

- Modified docs/comments
- Minor refactoring

* refactor: Using Column instead of column to avoid test failures

* feat: Minor refactoring

- Addressing code review comments
- Adding function docs

* samples: Adding samples for updating & querying Proto messages & enums (#2211)

* samples: Adding samples for updating & querying Proto messages & enums

* style: linting

* style: linting

* docs: Adding function and class doc

* test: Proto Column Integration tests (#2212)

* test: Adding Integration tests for Proto Messages & Enums

* test: Adding additional test for Parameterized Queries, Primary Keys & Invalid Wire type errors.

* style: Formatting

* style: Formatting

* test: Updating instance and db name

* test: Adding inter-compatibility check while writing data

* Configured jitpack.yml to use OpenJDK 11 (#2218)

Co-authored-by: Pavol Juhos <pjuhos@google.com>

* feat: add support for Proto Columns DDL (#2277)

* feat: add code changes and tests for Proto columns DDL support

* feat: add auto generated code

* feat: code changes and tests for Proto columns DDL support

* feat: add descriptors file

* feat: code refactoring

* feat: Integration tests and code refactoring

* feat: code refactoring

* feat: unit tests and clirr differences

* feat: lint changes

* feat: code refactor

* feat: code refactoring

* feat: code refactoring

* feat: code refactoring

* feat: add java docs to new methods

* feat: lint formatting

* feat: lint formatting changes

* feat: lint formatting

* feat: lint formatting

* feat: test exception cases

* feat: code refactoring

* feat: add java docs and refactoring

* feat: add java docs

* feat: java docs refactor

* feat: remove the setProtoDescriptors overload that accepts a file path as input, to avoid unexpected issues

* feat: remove updateDdl method overload to update proto descriptor

* test: update pom file to temporarily run tests against the cloud-devel region to validate the main branch update

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* fix: revert host changes in pom.xml file

* feat: add support for Proto Columns to Connection API

* feat: add client side statement support for proto descriptor

* feat: update existing unit tests to include proto descriptors argument

* feat: code refactor for client side statements

* feat: add unit tests

* feat: code refactoring for file path client statement

* test: add integration test for DDL

* fix: comment refactoring

* feat: move proto descriptor file read logic to ConnectionImpl file to fix autogenerated client side statement tests

* feat: add autogenerated test for new proto columns client side statements

* feat: add unit tests to verify proto descriptor set via filepath

* feat: address review comments

* feat: add client side statement to show proto descriptors file path

* fix: remove proto descriptors file extension validation

* feat: comment refactor

* feat: address review comments

* feat: update tests and revert autogenerated code

* chore: revert autogenerated code

* chore: revert autogenerated code

* chore: revert autogenerated code

* chore: lint format

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* chore: regenerate descriptors file

* chore: update schema

* chore: update base64 value for proto descriptor

* chore: update copyright year

---------

Co-authored-by: Gaurav Purohit <gauravpurohit@google.com>
Co-authored-by: Pavol Juhos <pjuhos@google.com>
Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
4 people committed May 24, 2024
1 parent ae93c57 commit 7e7c814
Showing 23 changed files with 1,587 additions and 49 deletions.
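
For orientation, the sketch below shows one way the proto descriptor support that this commit adds to the Connection API might be used end to end. It is a minimal, hypothetical example: the connection URI, the descriptors.pb file (e.g. produced by protoc --descriptor_set_out), and the proto bundle DDL are illustrative assumptions and are not taken from this change.

import com.google.cloud.spanner.Statement;
import com.google.cloud.spanner.connection.Connection;
import com.google.cloud.spanner.connection.ConnectionOptions;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ProtoColumnsConnectionSample {
  public static void main(String[] args) throws Exception {
    // Assumed descriptor set, e.g. generated with:
    // protoc --descriptor_set_out=descriptors.pb --include_imports singer.proto
    byte[] protoDescriptors = Files.readAllBytes(Paths.get("descriptors.pb"));

    ConnectionOptions options =
        ConnectionOptions.newBuilder()
            .setUri("cloudspanner:/projects/my-project/instances/my-instance/databases/my-database")
            .build();
    try (Connection connection = options.getConnection()) {
      // The descriptors apply to the next DDL statement (single or batch) and are cleared afterwards.
      connection.setProtoDescriptors(protoDescriptors);
      connection.execute(
          Statement.of(
              "CREATE PROTO BUNDLE (examples.spanner.music.SingerInfo, examples.spanner.music.Genre)"));
    }
  }
}
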
28 changes: 26 additions & 2 deletions google-cloud-spanner/clirr-ignored-differences.xml
@@ -337,6 +337,12 @@
<className>com/google/cloud/spanner/spi/v1/GapicSpannerRpc</className>
<method>com.google.cloud.spanner.spi.v1.SpannerRpc$StreamingCall read(com.google.spanner.v1.ReadRequest, com.google.cloud.spanner.spi.v1.SpannerRpc$ResultStreamConsumer, java.util.Map)</method>
</difference>
<difference>
<differenceType>7005</differenceType>
<className>com/google/cloud/spanner/spi/v1/GapicSpannerRpc</className>
<method>com.google.api.gax.longrunning.OperationFuture updateDatabaseDdl(java.lang.String, java.lang.Iterable, java.lang.String)</method>
<to>com.google.api.gax.longrunning.OperationFuture updateDatabaseDdl(com.google.cloud.spanner.Database, java.lang.Iterable, java.lang.String)</to>
</difference>
<difference>
<differenceType>7005</differenceType>
<className>com/google/cloud/spanner/spi/v1/GapicSpannerRpc</className>
@@ -387,6 +393,12 @@
<className>com/google/cloud/spanner/spi/v1/SpannerRpc</className>
<method>com.google.cloud.spanner.spi.v1.SpannerRpc$StreamingCall read(com.google.spanner.v1.ReadRequest, com.google.cloud.spanner.spi.v1.SpannerRpc$ResultStreamConsumer, java.util.Map)</method>
</difference>
<difference>
<differenceType>7005</differenceType>
<className>com/google/cloud/spanner/spi/v1/SpannerRpc</className>
<method>com.google.api.gax.longrunning.OperationFuture updateDatabaseDdl(java.lang.String, java.lang.Iterable, java.lang.String)</method>
<to>com.google.api.gax.longrunning.OperationFuture updateDatabaseDdl(com.google.cloud.spanner.Database, java.lang.Iterable, java.lang.String)</to>
</difference>
<difference>
<differenceType>7005</differenceType>
<className>com/google/cloud/spanner/spi/v1/SpannerRpc</className>
@@ -656,7 +668,7 @@
<className>com/google/cloud/spanner/connection/Connection</className>
<method>com.google.cloud.spanner.Spanner getSpanner()</method>
</difference>

<!-- Add DdlInTransactionMode -->
<difference>
<differenceType>7012</differenceType>
@@ -668,7 +680,7 @@
<className>com/google/cloud/spanner/connection/Connection</className>
<method>com.google.cloud.spanner.connection.DdlInTransactionMode getDdlInTransactionMode()</method>
</difference>

<!-- Added extended tracing option -->
<difference>
<differenceType>7012</differenceType>
@@ -688,4 +700,16 @@
<method>void setExcludeTxnFromChangeStreams(boolean)</method>
</difference>

<!-- Added Proto descriptors for proto columns -->
<difference>
<differenceType>7012</differenceType>
<className>com/google/cloud/spanner/connection/Connection</className>
<method>byte[] getProtoDescriptors()</method>
</difference>
<difference>
<differenceType>7012</differenceType>
<className>com/google/cloud/spanner/connection/Connection</className>
<method>void setProtoDescriptors(byte[])</method>
</difference>

</differences>
@@ -147,7 +147,7 @@ public static Type pgOid() {
/**
* To get the descriptor for the {@code PROTO} type.
*
* @param protoTypeFqn Proto fully qualified name (ex: "spanner.examples.music.SingerInfo").
* @param protoTypeFqn Proto fully qualified name (ex: "examples.spanner.music.SingerInfo").
*/
public static Type proto(String protoTypeFqn) {
return new Type(Code.PROTO, protoTypeFqn);
@@ -156,7 +156,7 @@ public static Type proto(String protoTypeFqn) {
/**
* To get the descriptor for the {@code ENUM} type.
*
* @param protoTypeFqn Proto ENUM fully qualified name (ex: "spanner.examples.music.Genre")
* @param protoTypeFqn Proto ENUM fully qualified name (ex: "examples.spanner.music.Genre")
*/
public static Type protoEnum(String protoTypeFqn) {
return new Type(Code.ENUM, protoTypeFqn);
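
A small usage sketch of the two factory methods above (a fragment, not a full program; the fully qualified names are the sample protos referenced in the updated Javadoc):

Type singerInfoType = Type.proto("examples.spanner.music.SingerInfo");
Type genreType = Type.protoEnum("examples.spanner.music.Genre");
// Both carry the fully qualified proto name together with the new type codes.
assert singerInfoType.getCode() == Type.Code.PROTO;
assert genreType.getCode() == Type.Code.ENUM;
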
@@ -26,10 +26,12 @@
import com.google.cloud.spanner.connection.PgTransactionMode.IsolationLevel;
import com.google.common.base.Function;
import com.google.common.base.Preconditions;
import com.google.common.base.Strings;
import com.google.protobuf.Duration;
import com.google.protobuf.util.Durations;
import com.google.spanner.v1.DirectedReadOptions;
import com.google.spanner.v1.RequestOptions.Priority;
import java.util.Base64;
import java.util.EnumSet;
import java.util.HashMap;
import java.util.Locale;
@@ -563,4 +565,46 @@ public String convert(String value) {
return value.substring(7).trim();
}
}

/** Converter for converting a Base64-encoded string to byte[]. */
static class ProtoDescriptorsConverter implements ClientSideStatementValueConverter<byte[]> {

public ProtoDescriptorsConverter(String allowedValues) {}

@Override
public Class<byte[]> getParameterClass() {
return byte[].class;
}

@Override
public byte[] convert(String value) {
if (value == null || value.length() == 0 || value.equalsIgnoreCase("null")) {
return null;
}
try {
return Base64.getDecoder().decode(value);
} catch (IllegalArgumentException e) {
return null;
}
}
}

/** Converter that takes a file path as input and returns it unchanged as a String. */
static class ProtoDescriptorsFileConverter implements ClientSideStatementValueConverter<String> {

public ProtoDescriptorsFileConverter(String allowedValues) {}

@Override
public Class<String> getParameterClass() {
return String.class;
}

@Override
public String convert(String filePath) {
if (Strings.isNullOrEmpty(filePath)) {
return null;
}
return filePath;
}
}
}
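
ProtoDescriptorsConverter above expects the descriptor set as a Base64 string and returns null for empty or undecodable input. A minimal sketch of producing such a value from a protoc-generated descriptor set file (file name assumed; java.nio.file and java.util.Base64 imports implied):

byte[] descriptorBytes = Files.readAllBytes(Paths.get("descriptors.pb"));
String base64Value = Base64.getEncoder().encodeToString(descriptorBytes);
// base64Value is the kind of string that ProtoDescriptorsConverter.convert() decodes back to byte[].
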
@@ -47,6 +47,7 @@
import java.util.Set;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeUnit;
import javax.annotation.Nonnull;

/**
* Internal connection API for Google Cloud Spanner. This interface may introduce breaking changes
@@ -403,6 +404,25 @@ default boolean isExcludeTxnFromChangeStreams() {
throw new UnsupportedOperationException();
}

/**
* Sets the proto descriptors to use for the next DDL statement (single or batch) that will be
* executed. The proto descriptors are automatically cleared after the statement is executed.
*
* @param protoDescriptors The proto descriptors to use with the next DDL statement (single or
* batch) that will be executed on this connection.
*/
default void setProtoDescriptors(@Nonnull byte[] protoDescriptors) {
throw new UnsupportedOperationException();
}

/**
* @return The proto descriptors that will be used with the next DDL statement (single or batch)
* that is executed on this connection.
*/
default byte[] getProtoDescriptors() {
throw new UnsupportedOperationException();
}

/**
* @return <code>true</code> if this connection will automatically retry read/write transactions
* that abort. This method may only be called when the connection is in read/write
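
The lifecycle documented above can be summarized in a short fragment, reusing the connection and protoDescriptors variables from the earlier sketch (the ALTER PROTO BUNDLE statement is an illustrative assumption):

connection.setProtoDescriptors(protoDescriptors);
connection.execute(Statement.of("ALTER PROTO BUNDLE INSERT (examples.spanner.music.Genre)"));
// Per the Javadoc above, the descriptors are cleared once the DDL statement has executed,
// so a later DDL statement that needs them must set them again.
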
@@ -22,6 +22,7 @@
import com.google.api.core.ApiFuture;
import com.google.api.core.ApiFutures;
import com.google.api.gax.core.GaxProperties;
import com.google.cloud.ByteArray;
import com.google.cloud.Timestamp;
import com.google.cloud.spanner.AsyncResultSet;
import com.google.cloud.spanner.BatchClient;
@@ -65,6 +66,9 @@
import io.opentelemetry.api.common.AttributesBuilder;
import io.opentelemetry.api.trace.Span;
import io.opentelemetry.api.trace.Tracer;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.time.Duration;
import java.util.ArrayList;
import java.util.Arrays;
@@ -81,6 +85,7 @@
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import java.util.stream.Collectors;
import javax.annotation.Nonnull;
import javax.annotation.Nullable;
import org.threeten.bp.Instant;

@@ -113,7 +118,7 @@ private LeakedConnectionException() {
}
}

private volatile LeakedConnectionException leakedException;;
private volatile LeakedConnectionException leakedException;
private final SpannerPool spannerPool;
private AbstractStatementParser statementParser;
/**
@@ -268,10 +273,11 @@ static UnitOfWorkType of(TransactionMode transactionMode) {

private String transactionTag;
private String statementTag;

private boolean excludeTxnFromChangeStreams;

private Duration maxCommitDelay;
private byte[] protoDescriptors;
private String protoDescriptorsFilePath;

/** Create a connection and register it in the SpannerPool. */
ConnectionImpl(ConnectionOptions options) {
@@ -353,6 +359,7 @@ public Spanner getSpanner() {
private DdlClient createDdlClient() {
return DdlClient.newBuilder()
.setDatabaseAdminClient(spanner.getDatabaseAdminClient())
.setProjectId(options.getProjectId())
.setInstanceId(options.getInstanceId())
.setDatabaseName(options.getDatabaseName())
.build();
@@ -763,6 +770,52 @@ public void setExcludeTxnFromChangeStreams(boolean excludeTxnFromChangeStreams)
this.excludeTxnFromChangeStreams = excludeTxnFromChangeStreams;
}

@Override
public byte[] getProtoDescriptors() {
ConnectionPreconditions.checkState(!isClosed(), CLOSED_ERROR_MSG);
if (this.protoDescriptors == null && this.protoDescriptorsFilePath != null) {
// Read from file if filepath is valid
try {
File protoDescriptorsFile = new File(this.protoDescriptorsFilePath);
if (!protoDescriptorsFile.isFile()) {
throw SpannerExceptionFactory.newSpannerException(
ErrorCode.INVALID_ARGUMENT,
String.format(
"File %s is not a valid proto descriptors file", this.protoDescriptorsFilePath));
}
InputStream pdStream = new FileInputStream(protoDescriptorsFile);
this.protoDescriptors = ByteArray.copyFrom(pdStream).toByteArray();
} catch (Exception exception) {
throw SpannerExceptionFactory.newSpannerException(exception);
}
}
return this.protoDescriptors;
}

@Override
public void setProtoDescriptors(@Nonnull byte[] protoDescriptors) {
Preconditions.checkNotNull(protoDescriptors);
ConnectionPreconditions.checkState(!isClosed(), CLOSED_ERROR_MSG);
ConnectionPreconditions.checkState(
!isBatchActive(), "Proto descriptors cannot be set when a batch is active");
this.protoDescriptors = protoDescriptors;
this.protoDescriptorsFilePath = null;
}

void setProtoDescriptorsFilePath(@Nonnull String protoDescriptorsFilePath) {
Preconditions.checkNotNull(protoDescriptorsFilePath);
ConnectionPreconditions.checkState(!isClosed(), CLOSED_ERROR_MSG);
ConnectionPreconditions.checkState(
!isBatchActive(), "Proto descriptors file path cannot be set when a batch is active");
this.protoDescriptorsFilePath = protoDescriptorsFilePath;
this.protoDescriptors = null;
}

String getProtoDescriptorsFilePath() {
ConnectionPreconditions.checkState(!isClosed(), CLOSED_ERROR_MSG);
return this.protoDescriptorsFilePath;
}

/**
* Throws an {@link SpannerException} with code {@link ErrorCode#FAILED_PRECONDITION} if the
* current state of this connection does not allow changing the setting for retryAbortsInternally.
@@ -1806,6 +1859,7 @@ UnitOfWork createNewUnitOfWork(
.setSpan(
createSpanForUnitOfWork(
statementType == StatementType.DDL ? DDL_STATEMENT : SINGLE_USE_TRANSACTION))
.setProtoDescriptors(getProtoDescriptors())
.build();
if (!isInternalMetadataQuery && !forceSingleUse) {
// Reset the transaction options after starting a single-use transaction.
@@ -1862,6 +1916,7 @@ UnitOfWork createNewUnitOfWork(
.setStatementTimeout(statementTimeout)
.withStatementExecutor(statementExecutor)
.setSpan(createSpanForUnitOfWork(DDL_BATCH))
.setProtoDescriptors(getProtoDescriptors())
.build();
default:
}
@@ -1885,7 +1940,11 @@ private void popUnitOfWorkFromTransactionStack() {
}

private ApiFuture<Void> executeDdlAsync(CallType callType, ParsedStatement ddl) {
return getOrStartDdlUnitOfWork().executeDdlAsync(callType, ddl);
ApiFuture<Void> result = getOrStartDdlUnitOfWork().executeDdlAsync(callType, ddl);
// reset proto descriptors after executing a DDL statement
this.protoDescriptors = null;
this.protoDescriptorsFilePath = null;
return result;
}

@Override
@@ -1985,6 +2044,11 @@ public ApiFuture<long[]> runBatchAsync() {
}
return ApiFutures.immediateFuture(new long[0]);
} finally {
if (isDdlBatchActive()) {
// reset proto descriptors after executing a DDL batch
this.protoDescriptors = null;
this.protoDescriptorsFilePath = null;
}
this.batchMode = BatchMode.NONE;
setDefaultTransactionOptions();
}
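
The same reset behavior applies to DDL batches: runBatchAsync() above clears both protoDescriptors and protoDescriptorsFilePath once a DDL batch finishes. A hedged fragment, again reusing connection and protoDescriptors from the earlier sketch (the table and bundle DDL are assumptions):

connection.setProtoDescriptors(protoDescriptors);
connection.startBatchDdl();
connection.execute(Statement.of("CREATE PROTO BUNDLE (examples.spanner.music.SingerInfo)"));
connection.execute(
    Statement.of(
        "CREATE TABLE Singers (SingerId INT64 NOT NULL,"
            + " SingerInfo examples.spanner.music.SingerInfo) PRIMARY KEY (SingerId)"));
connection.runBatch();
// Both the descriptors and any descriptor file path are reset after the batch has run.
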
@@ -138,6 +138,14 @@ StatementResult statementSetPgSessionCharacteristicsTransactionMode(

StatementResult statementShowTransactionIsolationLevel();

StatementResult statementSetProtoDescriptors(byte[] protoDescriptors);

StatementResult statementSetProtoDescriptorsFilePath(String filePath);

StatementResult statementShowProtoDescriptors();

StatementResult statementShowProtoDescriptorsFilePath();

StatementResult statementExplain(String sql);

StatementResult statementShowDataBoostEnabled();
@@ -35,6 +35,8 @@
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SET_MAX_PARTITIONS;
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SET_OPTIMIZER_STATISTICS_PACKAGE;
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SET_OPTIMIZER_VERSION;
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SET_PROTO_DESCRIPTORS;
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SET_PROTO_DESCRIPTORS_FILE_PATH;
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SET_READONLY;
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SET_READ_ONLY_STALENESS;
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SET_RETRY_ABORTS_INTERNALLY;
@@ -59,6 +61,8 @@
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SHOW_MAX_PARTITIONS;
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SHOW_OPTIMIZER_STATISTICS_PACKAGE;
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SHOW_OPTIMIZER_VERSION;
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SHOW_PROTO_DESCRIPTORS;
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SHOW_PROTO_DESCRIPTORS_FILE_PATH;
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SHOW_READONLY;
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SHOW_READ_ONLY_STALENESS;
import static com.google.cloud.spanner.connection.StatementResult.ClientSideStatementType.SHOW_READ_TIMESTAMP;
@@ -637,6 +641,36 @@ public StatementResult statementRunPartitionedQuery(Statement statement) {
ClientSideStatementType.RUN_PARTITIONED_QUERY);
}

@Override
public StatementResult statementSetProtoDescriptors(byte[] protoDescriptors) {
Preconditions.checkNotNull(protoDescriptors);
getConnection().setProtoDescriptors(protoDescriptors);
return noResult(SET_PROTO_DESCRIPTORS);
}

@Override
public StatementResult statementSetProtoDescriptorsFilePath(String filePath) {
Preconditions.checkNotNull(filePath);
getConnection().setProtoDescriptorsFilePath(filePath);
return noResult(SET_PROTO_DESCRIPTORS_FILE_PATH);
}

@Override
public StatementResult statementShowProtoDescriptors() {
return resultSet(
String.format("%sPROTO_DESCRIPTORS", getNamespace(connection.getDialect())),
getConnection().getProtoDescriptors(),
SHOW_PROTO_DESCRIPTORS);
}

@Override
public StatementResult statementShowProtoDescriptorsFilePath() {
return resultSet(
String.format("%sPROTO_DESCRIPTORS_FILE_PATH", getNamespace(connection.getDialect())),
getConnection().getProtoDescriptorsFilePath(),
SHOW_PROTO_DESCRIPTORS_FILE_PATH);
}

private String processQueryPlan(PlanNode planNode) {
StringBuilder planNodeDescription = new StringBuilder(" : { ");
com.google.protobuf.Struct metadata = planNode.getMetadata();
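
Taken together, these methods back four new client-side statements. A hedged sketch of how they might be issued as SQL strings through a connection, assuming the Connection API's usual SET <property> = '<value>' and SHOW VARIABLE <property> GoogleSQL forms (the exact grammar is defined in the generated client-side statements definition, which is not part of this excerpt; base64Value comes from the earlier Base64 sketch):

connection.execute(Statement.of("SET PROTO_DESCRIPTORS = '" + base64Value + "'"));
connection.execute(Statement.of("SHOW VARIABLE PROTO_DESCRIPTORS"));
connection.execute(Statement.of("SET PROTO_DESCRIPTORS_FILE_PATH = '/path/to/descriptors.pb'"));
connection.execute(Statement.of("SHOW VARIABLE PROTO_DESCRIPTORS_FILE_PATH"));
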