Commit ac88c1b: Fix merge conflict

matthew29tang committed Apr 24, 2024
2 parents 3a055e9 + f34094b
Showing 87 changed files with 5,580 additions and 355 deletions.
6 changes: 5 additions & 1 deletion .github/sync-repo-settings.yaml
@@ -21,4 +21,8 @@ branchProtectionRules:
- 'Presubmit - Unit Tests Python 3.11'
- 'Presubmit - Unit Tests Python 3.12'
- 'Presubmit - Unit Tests Ray 2.4.0'
- 'Presubmit - Unit Tests Ray 2.9.3'
- 'Presubmit - Unit Tests LangChain (Python 3.8)'
- 'Presubmit - Unit Tests LangChain (Python 3.9)'
- 'Presubmit - Unit Tests LangChain (Python 3.10)'
- 'Presubmit - Unit Tests LangChain (Python 3.11)'
13 changes: 13 additions & 0 deletions .kokoro/presubmit/unit_langchain_py310.cfg
@@ -0,0 +1,13 @@
# Format: //devtools/kokoro/config/proto/build.proto

# Run unit tests for LangChain on Python 3.10
env_vars: {
key: "NOX_SESSION"
value: "unit_langchain-3.10"
}

# Run unit tests in parallel, splitting up by file
env_vars: {
key: "PYTEST_ADDOPTS"
value: "-n=auto --dist=loadscope"
}
13 changes: 13 additions & 0 deletions .kokoro/presubmit/unit_langchain_py311.cfg
@@ -0,0 +1,13 @@
# Format: //devtools/kokoro/config/proto/build.proto

# Run unit tests for LangChain on Python 3.11
env_vars: {
key: "NOX_SESSION"
value: "unit_langchain-3.11"
}

# Run unit tests in parallel, splitting up by file
env_vars: {
key: "PYTEST_ADDOPTS"
value: "-n=auto --dist=loadscope"
}
13 changes: 13 additions & 0 deletions .kokoro/presubmit/unit_langchain_py38.cfg
@@ -0,0 +1,13 @@
# Format: //devtools/kokoro/config/proto/build.proto

# Run unit tests for LangChain on Python 3.8
env_vars: {
key: "NOX_SESSION"
value: "unit_langchain-3.8"
}

# Run unit tests in parallel, splitting up by file
env_vars: {
key: "PYTEST_ADDOPTS"
value: "-n=auto --dist=loadscope"
}
13 changes: 13 additions & 0 deletions .kokoro/presubmit/unit_langchain_py39.cfg
@@ -0,0 +1,13 @@
# Format: //devtools/kokoro/config/proto/build.proto

# Run unit tests for LangChain on Python 3.9
env_vars: {
key: "NOX_SESSION"
value: "unit_langchain-3.9"
}

# Run unit tests in parallel, splitting up by file
env_vars: {
key: "PYTEST_ADDOPTS"
value: "-n=auto --dist=loadscope"
}
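
Each of these configs selects a nox session via NOX_SESSION and parallelizes the run through PYTEST_ADDOPTS, which pytest reads from the environment automatically. The following is a rough sketch of how the matching sessions could be declared; the session body, dependency list, and test path are assumptions, not the repository's actual noxfile.

```python
# Hypothetical noxfile.py sketch: sessions whose generated names
# (unit_langchain-3.8 ... unit_langchain-3.11) match the NOX_SESSION values above.
import nox


@nox.session(python=["3.8", "3.9", "3.10", "3.11"])
def unit_langchain(session):
    """Run the LangChain unit tests on one Python version."""
    # Dependencies and test path are illustrative assumptions.
    session.install("-e", ".[langchain]")
    session.install("pytest", "pytest-xdist")
    # Kokoro's PYTEST_ADDOPTS="-n=auto --dist=loadscope" is picked up from the
    # environment: -n=auto spawns one worker per CPU, and --dist=loadscope
    # groups tests from the same module onto the same worker.
    session.run("pytest", "tests/unit/vertex_langchain")
```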
2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
{
".": "1.47.0"
".": "1.48.0"
}
27 changes: 27 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,32 @@
# Changelog

## [1.48.0](https://github.com/googleapis/python-aiplatform/compare/v1.47.0...v1.48.0) (2024-04-17)


### Features

* Add support for reading requirements from a file. ([80db7a0](https://github.com/googleapis/python-aiplatform/commit/80db7a0960b80ae0d78182687c1e99db696943f7))
* Adding tpu_topology to Vertex SDK ([423c764](https://github.com/googleapis/python-aiplatform/commit/423c7646185b4df19985fb41f5776557d572dd9f))
* Enable continuous upload for profile logs. ([f05924d](https://github.com/googleapis/python-aiplatform/commit/f05924d6bbd9e609f4ca98cdef7ab5a504672e58))
* GenAI - Added the `GenerationResponse.prompt_feedback` property ([efd5a72](https://github.com/googleapis/python-aiplatform/commit/efd5a72c1856a6767bdbbba9ea83f366518bdac2))
* GenAI - Added the `GenerationResponse.usage_metadata` property ([0654c35](https://github.com/googleapis/python-aiplatform/commit/0654c3504425d9f9bba6e3be919026229b616ec0))
* Support `NOT_EQUAL` for `MatchingEngineIndexEndpoint` `numeric_restricts`. ([aa918e3](https://github.com/googleapis/python-aiplatform/commit/aa918e31fcc40878e9f29affa02a4527d90188aa))
* Support referenced models in SDK. ([c9b6b8b](https://github.com/googleapis/python-aiplatform/commit/c9b6b8b3433854afd95a27065a052393768ceca8))


### Bug Fixes

* Add validation check for extra_packages when creating a reasoning engine. ([255dabc](https://github.com/googleapis/python-aiplatform/commit/255dabc77c647ef3ac33a10b06b3a36db122118a))
* Add validation for langchain tools. ([a821d50](https://github.com/googleapis/python-aiplatform/commit/a821d50724da7136c90abd157a7086d6571f2c30))
* Fixed the vertexai.init partial initialization issues ([636a654](https://github.com/googleapis/python-aiplatform/commit/636a654590919048f84baf343d291711f28eb03e))
* GenAI - Workaround for streaming when content role is missing in service responses ([fa35b91](https://github.com/googleapis/python-aiplatform/commit/fa35b9169677c62a5f0fa746dc9db9a5296f44a3))


### Documentation

* Add Reasoning Engine reference documentation ([496fc4b](https://github.com/googleapis/python-aiplatform/commit/496fc4b96768c872c9e7312bacf9989ea6e979f5))
* GenAI - Add Rapid Evaluation SDK reference documentation ([40b728b](https://github.com/googleapis/python-aiplatform/commit/40b728b28210f2bc57374c6c6d507cf3fa0be038))

## [1.47.0](https://github.com/googleapis/python-aiplatform/compare/v1.46.0...v1.47.0) (2024-04-06)


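Two of the 1.48.0 features above add inspection points to GenAI responses. Below is a minimal sketch of using them; the project, location, model name, and prompt are placeholders.

```python
# Sketch: reading the new GenerationResponse properties added in 1.48.0.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-project", location="us-central1")  # placeholders

model = GenerativeModel("gemini-1.0-pro")
response = model.generate_content("Summarize the 1.48.0 release in one line.")

# prompt_feedback surfaces block reasons and safety signals for the prompt;
# usage_metadata reports token counts for the request and response.
print(response.prompt_feedback)
print(response.usage_metadata)
```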
21 changes: 18 additions & 3 deletions google/cloud/aiplatform/compat/services/__init__.py
@@ -63,12 +63,12 @@
from google.cloud.aiplatform_v1beta1.services.model_service import (
client as model_service_client_v1beta1,
)
from google.cloud.aiplatform_v1beta1.services.pipeline_service import (
client as pipeline_service_client_v1beta1,
)
from google.cloud.aiplatform_v1beta1.services.persistent_resource_service import (
client as persistent_resource_service_client_v1beta1,
)
from google.cloud.aiplatform_v1beta1.services.pipeline_service import (
client as pipeline_service_client_v1beta1,
)
from google.cloud.aiplatform_v1beta1.services.prediction_service import (
client as prediction_service_client_v1beta1,
)
@@ -90,10 +90,20 @@
from google.cloud.aiplatform_v1beta1.services.tensorboard_service import (
client as tensorboard_service_client_v1beta1,
)
from google.cloud.aiplatform_v1beta1.services.vertex_rag_data_service import (
client as vertex_rag_data_service_client_v1beta1,
)
from google.cloud.aiplatform_v1beta1.services.vertex_rag_data_service import (
async_client as vertex_rag_data_service_async_client_v1beta1,
)
from google.cloud.aiplatform_v1beta1.services.vertex_rag_service import (
client as vertex_rag_service_client_v1beta1,
)
from google.cloud.aiplatform_v1beta1.services.vizier_service import (
client as vizier_service_client_v1beta1,
)


from google.cloud.aiplatform_v1.services.dataset_service import (
client as dataset_service_client_v1,
)
@@ -195,9 +205,14 @@
pipeline_service_client_v1beta1,
prediction_service_client_v1beta1,
prediction_service_async_client_v1beta1,
reasoning_engine_execution_service_client_v1beta1,
reasoning_engine_service_client_v1beta1,
schedule_service_client_v1beta1,
specialist_pool_service_client_v1beta1,
metadata_service_client_v1beta1,
tensorboard_service_client_v1beta1,
vertex_rag_service_client_v1beta1,
vertex_rag_data_service_client_v1beta1,
vertex_rag_data_service_async_client_v1beta1,
vizier_service_client_v1beta1,
)
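
With the additions above, the v1beta1 Vertex RAG clients join the compat layer's re-exports. The snippet below is purely an illustration of the new import surface.

```python
# Illustration: the newly re-exported v1beta1 RAG clients are importable
# from the compat layer like the existing service clients.
from google.cloud.aiplatform.compat.services import (
    vertex_rag_service_client_v1beta1,
    vertex_rag_data_service_client_v1beta1,
    vertex_rag_data_service_async_client_v1beta1,
)

# Instantiating a client uses Application Default Credentials.
client = vertex_rag_data_service_client_v1beta1.VertexRagDataServiceClient()
```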
2 changes: 1 addition & 1 deletion google/cloud/aiplatform/gapic_version.py
@@ -13,4 +13,4 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#
__version__ = "1.47.0" # {x-release-please-version}
__version__ = "1.48.0" # {x-release-please-version}
47 changes: 25 additions & 22 deletions google/cloud/aiplatform/initializer.py
@@ -192,23 +192,22 @@ def init(
ValueError:
If experiment_description is provided but experiment is not.
"""

if api_endpoint is not None:
self._api_endpoint = api_endpoint

# This method mutates state, so we need to be careful with the validation
# First, we need to validate all passed values
if api_transport:
VALID_TRANSPORT_TYPES = ["grpc", "rest"]
if api_transport not in VALID_TRANSPORT_TYPES:
raise ValueError(
f"{api_transport} is not a valid transport type. "
+ f"Valid transport types: {VALID_TRANSPORT_TYPES}"
)
if location:
utils.validate_region(location)
if experiment_description and experiment is None:
raise ValueError(
"Experiment needs to be set in `init` in order to add experiment descriptions."
)

if experiment_tensorboard and not isinstance(experiment_tensorboard, bool):
metadata._experiment_tracker.set_tensorboard(
tensorboard=experiment_tensorboard,
project=project,
location=location,
credentials=credentials,
)

# reset metadata_service config if project or location is updated.
if (project and project != self._project) or (
location and location != self._location
@@ -217,10 +216,14 @@
logging.info("project/location updated, reset Experiment config.")
metadata._experiment_tracker.reset()

# Then we change the main state
if api_endpoint is not None:
self._api_endpoint = api_endpoint
if api_transport:
self._api_transport = api_transport
if project:
self._project = project
if location:
utils.validate_region(location)
self._location = location
if staging_bucket:
self._staging_bucket = staging_bucket
@@ -233,22 +236,22 @@
if service_account is not None:
self._service_account = service_account

# Finally, perform secondary state updates
if experiment_tensorboard and not isinstance(experiment_tensorboard, bool):
metadata._experiment_tracker.set_tensorboard(
tensorboard=experiment_tensorboard,
project=project,
location=location,
credentials=credentials,
)

if experiment:
metadata._experiment_tracker.set_experiment(
experiment=experiment,
description=experiment_description,
backing_tensorboard=experiment_tensorboard,
)

if api_transport:
VALID_TRANSPORT_TYPES = ["grpc", "rest"]
if api_transport not in VALID_TRANSPORT_TYPES:
raise ValueError(
f"{api_transport} is not a valid transport type. "
+ f"Valid transport types: {VALID_TRANSPORT_TYPES}"
)
self._api_transport = api_transport

def get_encryption_spec(
self,
encryption_spec_key_name: Optional[str],
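The net effect of this reordering is that init() now validates every argument before mutating any state, so a bad call can no longer leave the SDK half-configured (the partial-initialization fix noted in the changelog). A sketch of the behavior, with placeholder project names:

```python
# Sketch: an invalid argument now raises during the up-front validation
# phase, before any configuration is overwritten.
from google.cloud import aiplatform
from google.cloud.aiplatform import initializer

aiplatform.init(project="my-project", location="us-central1")

try:
    # "websocket" is not a valid transport ("grpc" and "rest" are),
    # so this raises ValueError before any state is touched.
    aiplatform.init(project="other-project", api_transport="websocket")
except ValueError:
    pass

# Validation precedes mutation, so the original configuration survives.
assert initializer.global_config.project == "my-project"
```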
8 changes: 8 additions & 0 deletions google/cloud/aiplatform/jobs.py
@@ -1924,6 +1924,7 @@ def from_local_script(
encryption_spec_key_name: Optional[str] = None,
staging_bucket: Optional[str] = None,
persistent_resource_id: Optional[str] = None,
tpu_topology: Optional[str] = None,
) -> "CustomJob":
"""Configures a custom job from a local script.
@@ -2034,6 +2035,12 @@ def from_local_script(
on-demand, short-lived machines. The network, CMEK, and node pool
configs on the job should be consistent with those on the
PersistentResource; otherwise, the job will be rejected.
tpu_topology (str):
Optional. Specifies the TPU topology to be used for the
TPU training job. This field is required for TPU v5 versions. For
details on TPU topologies, refer to
https://cloud.google.com/tpu/docs/v5e#tpu-v5e-config. The topology
must be a supported value for the TPU machine type.
Raises:
RuntimeError: If staging bucket was not set using aiplatform.init
@@ -2063,6 +2070,7 @@
boot_disk_size_gb=boot_disk_size_gb,
reduction_server_replica_count=reduction_server_replica_count,
reduction_server_machine_type=reduction_server_machine_type,
tpu_topology=tpu_topology,
).pool_specs
)

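Below is a hedged sketch of the new parameter in use; the bucket, script, container image, machine type, and topology are placeholders, and the topology must match the chosen TPU machine type.

```python
# Illustrative sketch: a TPU v5e custom job with an explicit topology.
from google.cloud import aiplatform

aiplatform.init(project="my-project", staging_bucket="gs://my-bucket")

job = aiplatform.CustomJob.from_local_script(
    display_name="tpu-v5e-training",
    script_path="train.py",
    container_uri="us-docker.pkg.dev/my-project/training/trainer:latest",
    machine_type="ct5lp-hightpu-4t",
    tpu_topology="2x2",  # must be a topology the machine type supports
)
job.run()
```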
12 changes: 12 additions & 0 deletions google/cloud/aiplatform/matching_engine/matching_engine_index.py
@@ -434,6 +434,9 @@ def create_tree_ah_index(
encryption_spec_key_name: Optional[str] = None,
create_request_timeout: Optional[float] = None,
shard_size: Optional[str] = None,
feature_norm_type: Optional[
matching_engine_index_config.FeatureNormType
] = None,
) -> "MatchingEngineIndex":
"""Creates a MatchingEngineIndex resource that uses the tree-AH algorithm.
@@ -477,6 +480,8 @@
range 1-100, inclusive. The default value is 10 (meaning 10%) if not set.
distance_measure_type (matching_engine_index_config.DistanceMeasureType):
Optional. The distance measure used in nearest neighbor search.
feature_norm_type (matching_engine_index_config.FeatureNormType):
Optional. The feature norm type used in nearest neighbor search.
description (str):
Optional. The description of the Index.
labels (Dict[str, str]):
@@ -552,6 +557,7 @@
algorithm_config=algorithm_config,
approximate_neighbors_count=approximate_neighbors_count,
distance_measure_type=distance_measure_type,
feature_norm_type=feature_norm_type,
shard_size=shard_size,
)

@@ -580,6 +586,9 @@
distance_measure_type: Optional[
matching_engine_index_config.DistanceMeasureType
] = None,
feature_norm_type: Optional[
matching_engine_index_config.FeatureNormType
] = None,
description: Optional[str] = None,
labels: Optional[Dict[str, str]] = None,
project: Optional[str] = None,
@@ -623,6 +632,8 @@
Required. The number of dimensions of the input vectors.
distance_measure_type (matching_engine_index_config.DistanceMeasureType):
Optional. The distance measure used in nearest neighbor search.
feature_norm_type (matching_engine_index_config.FeatureNormType):
Optional. The feature norm type used in nearest neighbor search.
description (str):
Optional. The description of the Index.
labels (Dict[str, str]):
@@ -695,6 +706,7 @@
dimensions=dimensions,
algorithm_config=algorithm_config,
distance_measure_type=distance_measure_type,
feature_norm_type=feature_norm_type,
shard_size=shard_size,
)

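Both index factories now accept the new option. A sketch for the tree-AH case follows; the dimensions, URI, and enum value are illustrative and assume FeatureNormType exposes a unit-L2 option.

```python
# Illustrative sketch: creating a tree-AH index with feature_norm_type.
from google.cloud import aiplatform
from google.cloud.aiplatform.matching_engine import matching_engine_index_config

index = aiplatform.MatchingEngineIndex.create_tree_ah_index(
    display_name="my-index",  # placeholders throughout
    contents_delta_uri="gs://my-bucket/embeddings",
    dimensions=128,
    approximate_neighbors_count=150,
    distance_measure_type=matching_engine_index_config.DistanceMeasureType.DOT_PRODUCT_DISTANCE,
    feature_norm_type=matching_engine_index_config.FeatureNormType.UNIT_L2_NORM,
)
```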
google/cloud/aiplatform/matching_engine/matching_engine_index_config.py
@@ -125,12 +125,15 @@ class MatchingEngineIndexConfig:
independently.
distance_measure_type (DistanceMeasureType):
Optional. The distance measure used in nearest neighbor search.
feature_norm_type (FeatureNormType):
Optional. The feature norm type used in nearest neighbor search.
"""

dimensions: int
algorithm_config: AlgorithmConfig
approximate_neighbors_count: Optional[int] = None
distance_measure_type: Optional[DistanceMeasureType] = None
feature_norm_type: Optional[FeatureNormType] = None
shard_size: Optional[str] = None

def as_dict(self) -> Dict[str, Any]:
@@ -144,6 +147,7 @@ def as_dict(self) -> Dict[str, Any]:
"algorithmConfig": self.algorithm_config.as_dict(),
"approximateNeighborsCount": self.approximate_neighbors_count,
"distanceMeasureType": self.distance_measure_type,
"featureNormType": self.feature_norm_type,
"shardSize": self.shard_size,
}
return res
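
For completeness, a short sketch of the serialized form; the values are assumptions, and the point is that featureNormType now travels alongside distanceMeasureType.

```python
# Sketch: the new field appears in the camelCase dict sent to the API.
config = matching_engine_index_config.MatchingEngineIndexConfig(
    dimensions=128,
    algorithm_config=matching_engine_index_config.TreeAhConfig(),
    distance_measure_type=matching_engine_index_config.DistanceMeasureType.DOT_PRODUCT_DISTANCE,
    feature_norm_type=matching_engine_index_config.FeatureNormType.UNIT_L2_NORM,
)
print(config.as_dict()["featureNormType"])
```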
