
Conversation

release-please[bot]
Contributor

@release-please release-please bot commented Sep 4, 2025

🤖 I have created a release beep boop

2.19.0 (2025-09-09)

Features

Bug Fixes

  • Fix mishandling of chunked arrays while loading data (#2051) (873d0ee) (repro sketch below)
  • Remove warning for slot_millis_sum (#2047) (425a691)
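For context on #2051: the fix concerns pandas columns backed by multi-chunk pyarrow arrays. Below is a minimal, hypothetical repro sketch, not the project's actual regression test; the column name and data are made up, and it assumes a configured BigQuery project for bigframes.

```python
import pandas as pd
import pyarrow as pa
import bigframes.pandas as bpd

# A pandas column backed by a multi-chunk Arrow array; before the fix,
# columns like this could be mishandled when loaded into BigQuery DataFrames.
chunked = pa.chunked_array([[1, 2, 3], [4, 5, 6]])  # two underlying chunks
pdf = pd.DataFrame({"x": pd.arrays.ArrowExtensionArray(chunked)})

bdf = bpd.read_pandas(pdf)  # should now load the chunked column cleanly
print(bdf.shape)  # expected: (6, 1)
```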

This PR was generated with Release Please. See documentation.

@release-please release-please bot requested review from a team as code owners September 4, 2025 15:25
@product-auto-label product-auto-label bot added the "size: s" label Sep 4, 2025
@trusted-contributions-gcf trusted-contributions-gcf bot added the kokoro:force-run and owlbot:run labels Sep 4, 2025
@product-auto-label product-auto-label bot added the "api: bigquery" label Sep 4, 2025
@bigframes-bot bigframes-bot removed the kokoro:force-run label Sep 4, 2025
@gcf-owl-bot gcf-owl-bot bot removed the owlbot:run label Sep 4, 2025
@release-please release-please bot force-pushed the release-please--branches--main branch from b842e93 to 237b734 Compare September 5, 2025 14:51
@release-please release-please bot changed the title from "chore(main): release 2.18.1" to "chore(main): release 2.19.0" Sep 5, 2025
@release-please release-please bot force-pushed the release-please--branches--main branch from 237b734 to 3620d30 Compare September 5, 2025 19:31
@release-please release-please bot force-pushed the release-please--branches--main branch from 3620d30 to 3fe1d99 Compare September 5, 2025 23:14
@yuvalgimmunai

It would be highly appreciated if you could release it today @TrevorBergeron @tswast

@tswast
Collaborator

tswast commented Sep 9, 2025

Thanks for the reminder, @yuvalgimmunai! I'm hopeful to get #2059 in this morning, too. If everything looks good before @jialuoo, our on-call this week, gets in today, I can gladly get this released.

@release-please release-please bot force-pushed the release-please--branches--main branch from 3fe1d99 to b7b1e5c Compare September 9, 2025 14:49
@tswast
Collaborator

tswast commented Sep 9, 2025

Presubmit failure seems like a flake:

_________________ test_remote_function_direct_no_session_param _________________
[gw1] linux -- Python 3.9.20 /tmpfs/src/github/python-bigquery-dataframes/.nox/system-3-9/bin/python

bigquery_client = <google.cloud.bigquery.client.Client object at 0x14a0860b3ca0>
bigqueryconnection_client = <google.cloud.bigquery_connection_v1.services.connection_service.client.ConnectionServiceClient object at 0x14a0844a06d0>
cloudfunctions_client = <google.cloud.functions_v2.services.function_service.client.FunctionServiceClient object at 0x14a086011190>
resourcemanager_client = <google.cloud.resourcemanager_v3.services.projects.client.ProjectsClient object at 0x14a0842e8dc0>
scalars_dfs = (          bool_col                                          bytes_col  \
rowindex                                    ...     <NA>
7             True  ...     0:00:00.000004
8            False  ...    5 days, 0:00:00

[9 rows x 14 columns])
dataset_id_permanent = 'bigframes-dev.bigframes_testing'
bq_cf_connection = 'bigframes-rf-conn'

    def test_remote_function_direct_no_session_param(
        bigquery_client,
        bigqueryconnection_client,
        cloudfunctions_client,
        resourcemanager_client,
        scalars_dfs,
        dataset_id_permanent,
        bq_cf_connection,
    ):
        def square(x):
            return x * x
    
>       square = bff.remote_function(
            input_types=int,
            output_type=int,
            bigquery_client=bigquery_client,
            bigquery_connection_client=bigqueryconnection_client,
            cloud_functions_client=cloudfunctions_client,
            resource_manager_client=resourcemanager_client,
            dataset=dataset_id_permanent,
            bigquery_connection=bq_cf_connection,
            # See e2e tests for tests that actually deploy the Cloud Function.
            reuse=True,
            name=get_function_name(square),
            cloud_function_service_account="default",
        )(square)

tests/system/small/functions/test_remote_function.py:116:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
bigframes/functions/_function_session.py:617: in wrapper
    ) = remote_function_client.provision_bq_remote_function(
bigframes/functions/_function_client.py:593: in provision_bq_remote_function
    cf_endpoint = self.create_cloud_function(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <bigframes.functions._function_client.FunctionClient object at 0x14a084486b80>
def_ = <function test_remote_function_direct_no_session_param.<locals>.square at 0x14a08448b040>

    def create_cloud_function(
        self,
        def_,
        *,
        random_name,
        input_types: Tuple[str],
        output_type: str,
        package_requirements=None,
        timeout_seconds=600,
        max_instance_count=None,
        is_row_processor=False,
        vpc_connector=None,
        vpc_connector_egress_settings="private-ranges-only",
        memory_mib=1024,
        ingress_settings="internal-only",
    ):
        """Create a cloud function from the given user defined function."""
    
        # Build and deploy folder structure containing cloud function
        with tempfile.TemporaryDirectory() as directory:
            entry_point = self.generate_cloud_function_code(
                def_,
                directory,
                package_requirements=package_requirements,
                input_types=input_types,
                output_type=output_type,
                is_row_processor=is_row_processor,
            )
            archive_path = shutil.make_archive(directory, "zip", directory)
    
            # We are creating cloud function source code from the currently running
            # python version. Use the same version to deploy. This is necessary
            # because cloudpickle serialization done in one python version and
            # deserialization done in another python version doesn't work.
            # TODO(shobs): Figure out how to achieve version compatibility, specially
            # when pickle (internally used by cloudpickle) guarantees that:
            # https://docs.python.org/3/library/pickle.html#:~:text=The%20pickle%20serialization%20format%20is,unique%20breaking%20change%20language%20boundary
            python_version = _utils.get_python_version(is_compat=True)
    
            # Determine an upload URL for user code
            upload_url_request = functions_v2.GenerateUploadUrlRequest(
                kms_key_name=self._cloud_function_kms_key_name
            )
            upload_url_request.parent = self.get_cloud_function_fully_qualified_parent()
            upload_url_response = self._cloud_functions_client.generate_upload_url(
                request=upload_url_request
            )
    
            # Upload the code to GCS
            with open(archive_path, "rb") as f:
                response = requests.put(
                    upload_url_response.upload_url,
                    data=f,
                    headers={"content-type": "application/zip"},
                )
                if response.status_code != 200:
                    raise bf_formatting.create_exception_with_feedback_link(
                        RuntimeError,
                        f"Failed to upload user code. code={response.status_code}, reason={response.reason}, text={response.text}",
                    )
    
            # Deploy Cloud Function
            create_function_request = functions_v2.CreateFunctionRequest()
            create_function_request.parent = (
                self.get_cloud_function_fully_qualified_parent()
            )
            create_function_request.function_id = random_name
            function = functions_v2.Function()
            function.name = self.get_cloud_function_fully_qualified_name(random_name)
            function.build_config = functions_v2.BuildConfig()
            function.build_config.runtime = python_version
            function.build_config.entry_point = entry_point
            function.build_config.source = functions_v2.Source()
            function.build_config.source.storage_source = functions_v2.StorageSource()
            function.build_config.source.storage_source.bucket = (
                upload_url_response.storage_source.bucket
            )
            function.build_config.source.storage_source.object_ = (
                upload_url_response.storage_source.object_
            )
            function.build_config.docker_repository = (
                self._cloud_function_docker_repository
            )
    
            if self._cloud_build_service_account:
                canonical_cloud_build_service_account = (
                    self._cloud_build_service_account
                    if "/" in self._cloud_build_service_account
                    else f"projects/{self._gcp_project_id}/serviceAccounts/{self._cloud_build_service_account}"
                )
                function.build_config.service_account = (
                    canonical_cloud_build_service_account
                )
    
            function.service_config = functions_v2.ServiceConfig()
            if memory_mib is not None:
                function.service_config.available_memory = f"{memory_mib}Mi"
            if timeout_seconds is not None:
                if timeout_seconds > 1200:
                    raise bf_formatting.create_exception_with_feedback_link(
                        ValueError,
                        "BigQuery remote function can wait only up to 20 minutes"
                        ", see for more details "
                        "https://cloud.google.com/bigquery/quotas#remote_function_limits.",
                    )
                function.service_config.timeout_seconds = timeout_seconds
            if max_instance_count is not None:
                function.service_config.max_instance_count = max_instance_count
            if vpc_connector is not None:
                function.service_config.vpc_connector = vpc_connector
                if vpc_connector_egress_settings not in _VPC_EGRESS_SETTINGS_MAP:
                    raise bf_formatting.create_exception_with_feedback_link(
                        ValueError,
                        f"'{vpc_connector_egress_settings}' not one of the supported vpc egress settings values: {list(_VPC_EGRESS_SETTINGS_MAP)}",
                    )
                function.service_config.vpc_connector_egress_settings = cast(
                    functions_v2.ServiceConfig.VpcConnectorEgressSettings,
                    _VPC_EGRESS_SETTINGS_MAP[vpc_connector_egress_settings],
                )
            function.service_config.service_account_email = (
                self._cloud_function_service_account
            )
            if ingress_settings not in _INGRESS_SETTINGS_MAP:
                raise bf_formatting.create_exception_with_feedback_link(
                    ValueError,
                    f"'{ingress_settings}' not one of the supported ingress settings values: {list(_INGRESS_SETTINGS_MAP)}",
                )
            function.service_config.ingress_settings = cast(
                functions_v2.ServiceConfig.IngressSettings,
                _INGRESS_SETTINGS_MAP[ingress_settings],
            )
            function.kms_key_name = self._cloud_function_kms_key_name
            create_function_request.function = function
    
            # Create the cloud function and wait for it to be ready to use
            try:
                operation = self._cloud_functions_client.create_function(
                    request=create_function_request
                )
                operation.result()
    
                # Cleanup
                os.remove(archive_path)
            except google.api_core.exceptions.AlreadyExists:
            # b/437124912: The most likely scenario is that
                # `create_function` had a retry due to a network issue. The
                # retried request then fails because the first call actually
                # succeeded, but we didn't get the successful response back.
                #
                # Since the function name was randomly chosen to avoid
            # conflicts, we know the AlreadyExists can only happen because
                # we created it. This error is safe to ignore.
                pass
    
        # Fetch the endpoint of the just created function
        endpoint = self.get_cloud_function_endpoint(random_name)
        if not endpoint:
>           raise bf_formatting.create_exception_with_feedback_link(
                ValueError, "Couldn't fetch the http endpoint."
            )
E           ValueError: Couldn't fetch the http endpoint. Share your usecase with the BigQuery DataFrames team at the https://bit.ly/bigframes-feedback survey. You are currently running BigFrames version 2.19.0.

bigframes/functions/_function_client.py:532: ValueError
=============================== warnings summary ===============================
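For what it's worth, the flake above boils down to get_cloud_function_endpoint returning nothing immediately after the create operation reports done. A minimal, hypothetical mitigation sketch of how a caller could wait out that propagation gap (poll_for_endpoint and its parameters are made up, not bigframes API):

```python
import time

def poll_for_endpoint(endpoint_fn, attempts=5, delay_seconds=2.0):
    """Hypothetical helper: retry an endpoint lookup that may lag creation.

    endpoint_fn is assumed to return the endpoint URL, or an empty value
    while the newly created Cloud Function is still propagating.
    """
    for attempt in range(attempts):
        endpoint = endpoint_fn()
        if endpoint:
            return endpoint
        time.sleep(delay_seconds * (attempt + 1))  # simple linear backoff
    raise ValueError("Couldn't fetch the http endpoint.")
```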

@tswast tswast enabled auto-merge (squash) September 9, 2025 15:04
@tswast tswast merged commit 7deb6c0 into main Sep 9, 2025
21 of 25 checks passed
@tswast tswast deleted the release-please--branches--main branch September 9, 2025 15:27
Contributor Author

release-please bot commented Sep 9, 2025

🤖 Created releases:

🌻
