2.10.4
Features
Support for tracing
With this release, it is possible to gather telemetry data using an OpenTelemetry endpoint.
Specifying an endpoint in one of the following environment variables will enable the corresponding tracing provider:
METAFLOW_OTEL_ENDPOINT
METAFLOW_ZIPKIN_ENDPOINT
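As a minimal sketch, the endpoint can be supplied by setting the environment variable before the flow starts; the collector URL below is a hypothetical placeholder:

```python
import os

# Hypothetical OTLP collector address; substitute your own endpoint.
# Setting METAFLOW_ZIPKIN_ENDPOINT instead selects the Zipkin provider.
os.environ["METAFLOW_OTEL_ENDPOINT"] = "http://localhost:4317"
```

The same value can of course be exported in the shell environment instead of being set from Python.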
Some additional dependencies are required for the tracing functionality in the execution environment. These can be installed in the base Docker image or supplied through a Conda environment. The relevant packages are
opentelemetry-sdk, opentelemetry-api, opentelemetry-instrumentation and opentelemetry-instrumentation-requests,
and, depending on your endpoint, either opentelemetry-exporter-otlp or opentelemetry-exporter-zipkin.
Custom index support for the pypi decorator
The pypi decorator now supports using a custom index defined in the user's pip configuration under global.index-url.
This enables using private indices, even ones that require authentication.
For example, the following would set up one authenticated index and two extra non-authenticated indices for package resolution:
pip config set global.index-url "https://user:token@example.com"
pip config set global.extra-index-url "https://extra.example.com https://extra2.example.com"
Specify Kubernetes job ephemeral storage size through the resources decorator
It is now possible to specify the ephemeral storage size for Kubernetes jobs when using the resources decorator with the disk= attribute.
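As a minimal sketch (assuming Metaflow 2.10.4+ with a configured Kubernetes deployment; the flow and step names are made up), the disk= attribute sits alongside the other resource attributes:

```python
from metaflow import FlowSpec, resources, step


class DiskDemoFlow(FlowSpec):

    # disk= requests ephemeral storage for the Kubernetes job;
    # like memory=, the value is interpreted in megabytes.
    @resources(disk=10240, memory=4096)
    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass


if __name__ == "__main__":
    DiskDemoFlow()
```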
Introduce argo-workflows status command
Adds a command for easily checking the current status of a workflow on Argo Workflows.
python flow.py argo-workflows status [run-id]
Improvements
Add more randomness to Kubernetes pod names to avoid collisions
Relying solely on the Kubernetes API server to generate random pod names resulted in significant collisions with a sufficiently large number of executions.
This release adds more randomness to the pod names on top of what Kubernetes generates.
Fix issues with the resources decorator in combination with AWS Step Functions
This release fixes an issue where deploying flows on AWS Step Functions was failing in the following cases:
- @resources(shared_memory=) with any value
- combining @resources and @batch(use_tmpfs=True)
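For illustration, a step combining the two decorators (a combination that previously failed to deploy) might look like the following sketch; the flow and step names are hypothetical:

```python
from metaflow import FlowSpec, batch, resources, step


class TmpfsFlow(FlowSpec):

    # Combining @resources with @batch(use_tmpfs=True) previously broke
    # AWS Step Functions deployment; this release fixes that case.
    @resources(memory=8192)
    @batch(use_tmpfs=True)
    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass


if __name__ == "__main__":
    TmpfsFlow()
```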
What's Changed
- Bump postcss from 8.4.24 to 8.4.31 in /metaflow/plugins/cards/ui by @dependabot in #1599
- introduce argo-workflows status by @savingoyal in #1600
- remove dependency on wget for micromamba by @savingoyal in #1601
- fix: resource decorator step functions issue by @saikonen in #1610
- feature: add status grouping to argo-workflows output by @saikonen in #1604
- fix: apply only supported attributes from resources decorator in batch by @saikonen in #1586
- fix: use correct deco for limiting keys. by @saikonen in #1611
- Otel oss integration by @wangchy27 in #1462
- Remove use of distutils from code (not setup.py yet) by @romain-intel in #1585
- Add a uid for the kubernetes job creation by @tylerpotts in #1588
- feature: support custom index config for pypi decorator by @saikonen in #1613
- Adding 'disk' parameter to @resources decorator by @jaypond in #1500
- Bump version to 2.10.4 by @saikonen in #1614
New Contributors
Full Changelog: 2.10.3...2.10.4