Releases: Netflix/metaflow
2.12.9
What's Changed
- [card] bug fix in error card rendering by @valayDave in #1923
- [parallel-fixes] tag-catch test and @Secrets simplification for @parallel by @valayDave in #1917
- [parallel-fixes] core + test changes by @valayDave in #1925
- [refactor-jobsets] refactor to new implementation by @valayDave in #1914
- [argo] support for @parallel by @valayDave in #1927
- [ubf] bug fix when using `merge_artifacts` in UBF joins by @valayDave in #1928
- [Ready for Review] Improve native resume by @darinyu in #1884
- deployer with new injection mechanism by @madhur-ob in #1910
- [flowspec] add artifacts to exclude in `merge_artifacts` by @valayDave in #1929
- new release 2.12.9 by @savingoyal in #1933
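Two of the changes above touch `merge_artifacts`, which combines artifacts from inbound branches at a join step. The sketch below is a simplified illustration of its semantics only, not Metaflow's implementation (in a real flow you call `self.merge_artifacts(inputs, exclude=[...])` inside the join step):

```python
def merge_artifacts(inputs, exclude=()):
    """Simplified sketch of merge-artifact semantics at a join step.

    `inputs` stands in for the inbound branches, each modeled here as a
    plain dict of artifact name -> value. Real Metaflow operates on task
    artifacts and supports more options (e.g. an `include` list).
    """
    merged = {}
    for branch in inputs:
        for name, value in branch.items():
            if name in exclude:
                continue  # excluded artifacts are never merged
            if name in merged and merged[name] != value:
                # conflicting values across branches must be resolved by hand
                raise ValueError(f"artifact {name!r} differs between branches")
            merged[name] = value
    return merged
```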
Full Changelog: 2.12.8...2.12.9
2.12.8
What's Changed
- pin micromamba version by @savingoyal in #1920
- bump to 2.12.8 by @savingoyal in #1921
Full Changelog: 2.12.7...2.12.8
2.12.7
Improvements
Fix Argo Events escaping HTML characters
This release fixes an issue where values passed to flow parameters from an Argo event would unintentionally escape HTML characters. The value in the payload should now be passed as-is to the flow parameter.
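To illustrate the class of bug (using Python's standard-library `html` module, not Metaflow's internals): HTML escaping rewrites characters such as `<`, `&`, and quotes, which corrupts payload values that should be passed through verbatim:

```python
import html

# A raw value from an event payload that happens to contain HTML characters.
raw = 'size < 10 & label = "big"'

# What an escaping code path effectively produces: not the value the flow expects.
escaped = html.escape(raw)
assert escaped == 'size &lt; 10 &amp; label = &quot;big&quot;'

# Passing the value through as-is preserves it; unescaping recovers the original.
assert html.unescape(escaped) == raw
```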
What's Changed
- fix: do not escape html in Argo Events payload by @saikonen in #1911
- Fix issue with current_namespace when the namespace is None by @romain-intel in #1901
- Expose metaflow logger and monitor via singleton by @talsperre in #1794
- remove old py versions for s3 tests by @savingoyal in #1906
- bump version to 2.12.7 by @saikonen in #1912
New Contributors
- @talsperre made their first contribution in #1794
Full Changelog: 2.12.6...2.12.7
2.12.6
Improvements
Fix Argo Workflows issue with long static splits
This release fixes an issue where a join step of a static split would fail on Argo Workflows in rare cases where the length of step names exceeded a threshold.
Argo Events support for parameter names with dashes
Fixes an issue where values from an Argo Event payload did not correctly map to flow Parameters if the parameter name contained dashes.
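A dashed payload key cannot be used verbatim as a Python identifier, so it has to be mapped onto the flow's `Parameter` name. The helper below is hypothetical (not Metaflow's code) and only illustrates the kind of normalization involved:

```python
def normalize_param_name(name: str) -> str:
    """Map a dashed payload key (e.g. from an Argo Event) to a
    Python-friendly parameter name. Hypothetical helper for illustration."""
    return name.replace("-", "_")

# A payload whose keys contain dashes, as an Argo Event might deliver.
payload = {"batch-size": "32", "learning-rate": "0.01"}
params = {normalize_param_name(k): v for k, v in payload.items()}
```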
More specific PyPI errors for package resolving
The errors raised by `@pypi` should now be clearer in cases where it is unable to resolve an environment because a suitable package cannot be found.
What's Changed
- Bump braces from 3.0.2 to 3.0.3 in /metaflow/plugins/cards/ui by @dependabot in #1898
- feature: polish pypi package errors by @saikonen in #1905
- fix: long names with static split fails join step on Argo Workflows by @saikonen in #1907
- feature: Conda env extension hooks by @saikonen in #1902
- fix: support for dashed parameters through argo events by @saikonen in #1908
- bump version to 2.12.6 by @saikonen in #1909
Full Changelog: 2.12.5...2.12.6
2.12.5
What's Changed
- add timestamps to conda debug logs by @savingoyal in #1889
- fix: decorator attributes being modified in get_environment by @saikonen in #1895
- bump version to 2.12.5 by @saikonen in #1896
- [cards] bug fix with error card renders by @valayDave in #1893
Full Changelog: 2.12.4...2.12.5
2.12.4
What's Changed
- Fix escape hatch to make getattr behavior more standard by @romain-intel in #1883
- Fix/remember namespace by @romain-intel in #1873
- Remove global nature of parameters and flow decorators by @romain-intel in #1886
- Install PyPI packages in interpreter's site packages by @savingoyal in #1890
- bump version to 2.12.4 by @saikonen in #1891
Full Changelog: 2.12.3...2.12.4
2.12.3
2.12.2
What's Changed
- Fix issue with setting metadata when using the runner by @romain-intel in #1875
- Bump version to 2.12.2 by @romain-intel in #1876
Full Changelog: 2.12.1...2.12.2
2.12.1
Features
Configurable default decorators
This release adds the ability to configure default decorators that are applied to all steps. This is done by setting the decospecs as a space-separated value for `METAFLOW_DECOSPECS`, either as an environment variable or in a `config.json`.
The following example would add retry and kubernetes decorators with a custom memory value to all steps:
```
export METAFLOW_DECOSPECS="kubernetes:memory=4096 retry"
```
Defining a decorator with the `--with` option overrides the configured defaults. The same applies to decorators added explicitly in the flow file.
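As the example shows, each decospec follows a simple `name[:attr=value,...]` grammar, with multiple decorators separated by spaces. A minimal sketch of a parser for that grammar (illustrative only; Metaflow does its own parsing):

```python
def parse_decospecs(spec: str):
    """Parse a space-separated decospec string, such as a
    METAFLOW_DECOSPECS value, into (decorator, attributes) pairs."""
    parsed = []
    for token in spec.split():
        # Split "kubernetes:memory=4096" into the decorator name and its attrs.
        name, _, attr_str = token.partition(":")
        attrs = {}
        if attr_str:
            for pair in attr_str.split(","):
                key, _, value = pair.partition("=")
                attrs[key] = value
        parsed.append((name, attrs))
    return parsed
```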
Improvements
Correctly clean up Argo Workflow sensors when using @project
This release fixes an issue where `argo-workflows delete` did not correctly remove any sensors associated with the workflow if the workflow used the `@project` decorator.
What's Changed
- Add the possibility of defining default decorators for steps by @romain-intel in #1837
- bugfix: properly deletes Argo Events trigger sensors when `@project` is used by @gabriel-rp in #1871
- S3PubObject was not used properly after #1807 by @romain-intel in #1872
- bump version to 2.12.1 by @saikonen in #1874
New Contributors
- @gabriel-rp made their first contribution in #1871
Full Changelog: 2.12.0...2.12.1
2.12.0
Features
Support running flows in notebooks and through Python scripts
This release introduces a new Runner API that makes it simple to run flows inside notebooks or as part of Python code.
Read the blog post on the feature, or dive straight into the documentation to start using it.
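A minimal usage sketch, assuming Metaflow 2.12.0+ is installed and a flow file exists locally (`helloflow.py` and the `alpha` parameter below are placeholders; see the Runner documentation for the full API):

```python
from metaflow import Runner

# run() blocks until the flow finishes; parameters are passed as keyword
# arguments, and the context manager cleans up temporary resources.
with Runner("helloflow.py").run(alpha=5) as running:
    print(running.status)  # status of the finished run
    print(running.run)     # the corresponding metaflow.Run object
```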
What's Changed
- Remove dead code by @savingoyal in #1853
- initial runner api by @madhur-ob in #1732
- synchronous run and resume functionality + nbrun for runner API by @madhur-ob in #1845
- Tests for runner by @savingoyal in #1859
- minor nbrun fixes by @madhur-ob in #1860
- fix leaked message by @madhur-ob in #1861
- raise exception instead by @madhur-ob in #1862
- allow output to be hidden in nbrun() by @tuulos in #1864
- show_output as True by default by @madhur-ob in #1865
- add explicit cleanup() methods in Runners by @tuulos in #1863
- Runner docstring fixes by @tuulos in #1866
- release 2.12.0 by @savingoyal in #1867
Full Changelog: 2.11.16...2.12.0