Andystaples/add functions support #75
base: main
Conversation
Pull request overview
This draft PR introduces Azure Functions support for the durabletask-python library by creating a new durabletask-azurefunctions package. This allows developers to use Durable Task patterns within Azure Functions using Python decorators and bindings that integrate with the Azure Functions worker.
Key Changes:
- Added a new durabletask-azurefunctions package with decorators, client, and worker implementations for Azure Functions integration
- Modified core durabletask/worker.py to support a new ProtoTaskHubSidecarServiceStub type alongside the existing gRPC stub
- Introduced a base ProtoTaskHubSidecarServiceStub class that can be extended for different communication patterns
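For orientation, here is a minimal usage sketch of the decorator surface this package provides. The import path, DFApp, and the orchestration_trigger / activity_trigger / durable_client_input decorator names are assumptions modeled on the existing Durable Functions Python SDK and the files listed under "Reviewed changes" below, not a confirmed API.

```python
# Hypothetical usage sketch only; DFApp, orchestration_trigger, activity_trigger,
# durable_client_input, and schedule_new_orchestration are assumed names taken from
# the file summary below and the existing Durable Functions Python SDK.
import azure.functions as func
from durabletask.azurefunctions import DFApp  # assumed export

app = DFApp(http_auth_level=func.AuthLevel.ANONYMOUS)

@app.orchestration_trigger(context_name="context")
def hello_orchestrator(context):
    # Orchestrators are generators in durabletask-python; activity calls are yielded.
    result = yield context.call_activity("say_hello", input="world")
    return result

@app.activity_trigger(input_name="name")
def say_hello(name: str) -> str:
    return f"Hello, {name}!"

@app.route(route="start")
@app.durable_client_input(client_name="client")
def http_start(req: func.HttpRequest, client) -> func.HttpResponse:
    # Assumes the injected client exposes durabletask's schedule_new_orchestration.
    instance_id = client.schedule_new_orchestration("hello_orchestrator")
    return func.HttpResponse(f"Started orchestration {instance_id}")
```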
Reviewed changes
Copilot reviewed 12 out of 15 changed files in this pull request and generated 22 comments.
Summary per file:
| File | Description |
|---|---|
| durabletask/worker.py | Added import for new stub type, updated type hints to accept a Union of stub types, added handling for the orchestratorCompleted event |
| durabletask/internal/ProtoTaskHubSidecarServiceStub.py | New base stub class defining the protocol interface with callable attributes for all Task Hub operations |
| durabletask-azurefunctions/pyproject.toml | Package configuration for the new Azure Functions integration package with dependencies |
| durabletask-azurefunctions/durabletask/azurefunctions/worker.py | Worker implementation that extends TaskHubGrpcWorker without an async worker loop, matching the Functions execution model |
| durabletask-azurefunctions/durabletask/azurefunctions/client.py | Client implementation for Azure Functions that parses connection info from JSON and uses custom interceptors |
| durabletask-azurefunctions/durabletask/azurefunctions/internal/azurefunctions_null_stub.py | Null stub implementation that provides no-op lambdas for all stub operations |
| durabletask-azurefunctions/durabletask/azurefunctions/internal/azurefunctions_grpc_interceptor.py | Custom gRPC interceptor that adds Azure Functions-specific headers |
| durabletask-azurefunctions/durabletask/azurefunctions/decorators/metadata.py | Trigger and binding metadata classes for orchestration, activity, entity, and client bindings |
| durabletask-azurefunctions/durabletask/azurefunctions/decorators/durable_app.py | Blueprint and DFApp classes providing decorators for registering Functions with Durable Task patterns |
| durabletask-azurefunctions/durabletask/azurefunctions/decorators/__init__.py | Package exports for the decorator module |
| durabletask-azurefunctions/durabletask/azurefunctions/constants.py | Constants for trigger and binding type names |
| durabletask-azurefunctions/CHANGELOG.md | Initial changelog for the new package |
```python
creationUrls: dict[str, str]
managementUrls: dict[str, str]
```
Copilot (AI) commented on Nov 21, 2025:
[nitpick] Potential compatibility issue with type hint syntax. The use of dict[str, str] (PEP 585 style) requires Python 3.9+. While pyproject.toml specifies requires-python = ">=3.9", consider whether this is the intended minimum version or if Dict[str, str] from typing should be used for broader compatibility.
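For reference, a minimal illustration of the two equivalent spellings (the typing.Dict form also works on Python 3.8 and earlier):

```python
from typing import Dict

# PEP 585 built-in generic: valid as a runtime annotation on Python 3.9+.
creationUrls: dict[str, str] = {}

# typing.Dict spelling: equivalent, and also usable on older Python versions.
managementUrls: Dict[str, str] = {}
```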
```python
if response is None:
    raise Exception("Orchestrator execution did not produce a response.")
# The Python worker returns the input as type "json", so double-encoding is necessary
return '"' + base64.b64encode(response.SerializeToString()).decode('utf-8') + '"'
```
Victoria - Currently, the return value from here is passed on to the host as type "json" so the host attempts to Newtonsoft deserialize it back into an object before handing back to the Durable middleware for final decoding. This breaks, unless I double-encode with quotes as above. Is there a way to communicate to the worker that this is a plain string instead?
Investigating this - will need a little more time to test on my end
This is actually coming from the OrchestrationTriggerConverter done here. The type is hard-coded to json. Changing the type to string works in this scenario, but I'm unsure if there are other cases where json was intended
```python
@classmethod
def encode(cls, obj: typing.Any, *, expected_type: typing.Optional[type]) -> meta.Datum:
    # Durable function context should be a json
    return meta.Datum(type='json', value=obj)
```
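For comparison, a hedged sketch of the two options discussed above: the current 'json' datum, which forces the worker to wrap the base64 payload in extra quotes, versus a plain 'string' datum that the host would not try to JSON-deserialize. Whether other code paths rely on the 'json' type is the open question in this thread.

```python
import base64
from azure.functions import meta

payload = base64.b64encode(b"serialized-orchestrator-response").decode("utf-8")

# Current behavior: the host treats the value as JSON, so the worker must
# double-encode (wrap in quotes) for Newtonsoft deserialization to succeed.
as_json = meta.Datum(type='json', value='"' + payload + '"')

# Alternative mentioned above: declare the value a plain string so no JSON
# round-trip happens on the host. Untested assumption, not part of this PR.
as_string = meta.Datum(type='string', value=payload)
```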
```python
# Obtain user-code and force type annotation on the client-binding parameter to be `str`.
# This ensures a passing type-check of that specific parameter,
# circumventing a limitation of the worker in type-checking rich DF Client objects.
# TODO: Once rich-binding type checking is possible, remove the annotation change.
user_code = fb._function._func
user_code.__annotations__[parameter_name] = str
```
Victoria - this is the same approach taken by the existing Durable Python SDK for the DurableClient binding - we force the annotation to be "str" so the worker takes a path that does not attempt to use the DurableClientConverter input parameter converter, which would throw NotImplementedError
Do you think it is worth moving the client_constructor logic in this PR into the DurableClientConverter in the library, so that we don't have to do this type-hacking stuff?
We'd have to figure out how to detect which underlying provider for the durable_client_input binding is being used to know when to simply return the string for the old SDK vs parse it in the new
The main issue would be that we'd have something different to return based on the durable library.
Are the types going to be the same? (eg DurableClient for both packages) We could look at creating two separate converters - right now it's using the Generic converter, but it would be better to have our own
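As a rough sketch of the "our own converter" idea, something along these lines might replace the __annotations__ workaround, following the converter registration pattern used by the Azure Functions Python library (subclassing meta.InConverter with a binding keyword). The binding name, the build_durable_client factory, and whether re-registering over the library's existing converter is even allowed are all assumptions to verify, not this PR's implementation.

```python
import json
import typing

from azure.functions import meta


def build_durable_client(connection_info: dict) -> object:
    """Placeholder for the client_constructor logic referenced in this PR."""
    ...


# Hedged sketch only: binding name and registration behavior are unverified.
class NewDurableClientConverter(meta.InConverter, binding='durableClient'):

    @classmethod
    def check_input_type_annotation(cls, pytype: type) -> bool:
        # Accept rich client annotations so the __annotations__ = str workaround
        # shown above is no longer needed.
        return True

    @classmethod
    def decode(cls, data: meta.Datum, *, trigger_metadata) -> typing.Any:
        # The host supplies connection info as a JSON string (creation/management
        # URLs, etc.); parse it and hand back a constructed client.
        return build_durable_client(json.loads(data.value))
```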
```toml
requires-python = ">=3.9"
license = {file = "LICENSE"}
readme = "README.md"
dependencies = [
    "durabletask>=0.5.0",
    "azure-identity>=1.19.0",
    "azure-functions>=1.11.0"
]
```
TODO: Update python min version and rev durabletask dependency to 1.0.1/1.1.0
Also rev durabletask versions to the same based on size of changes needed
```toml
[project]
name = "durabletask.azurefunctions"
version = "0.1.0"
```
Victoria - what versioning strategy would you propose if the first version that goes to PyPi would be used for internal testing only?
dev versions work well - eg 1.0.0dev0 or 0.0.1dev0
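For what it's worth, pip/PEP 440 normalizes those spellings to an explicit .dev segment, which can be checked with the packaging library:

```python
from packaging.version import Version

# PEP 440 normalizes "dev" suffixes; both spellings denote the same pre-release
# and sort before the corresponding final release.
print(Version("0.0.1dev0"))                       # -> 0.0.1.dev0
print(Version("1.0.0dev0"))                       # -> 1.0.0.dev0
print(Version("0.0.1.dev0") < Version("0.0.1"))   # -> True
```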
```python
# TODO: Is there a better way to support retrieving the unwrapped user code?
df_client_middleware.client_function = fb._function._func  # type: ignore
```
Victoria - not sure if you remember this context from a while back, but this is also carryover from the previous SDK - I added this line to make retrieving the "unwrapped" user code possible for the unit testing scenario - see
https://learn.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-unit-testing-python#unit-testing-trigger-functions
If possible, I'd like to see a "better" solution for the new SDK. Hate to re-open a can of worms here, though
I vaguely remember context, but we can sync again over specific requirements.
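For context, the unit-testing pattern this enables looks roughly like the sketch below: call the unwrapped user code directly with a mocked client, bypassing the Functions host. The http_start function here is a stand-in defined inline for the sake of a runnable example; in a real test it would be retrieved through the exposed client_function attribute shown above (or whatever "better" accessor the new SDK settles on).

```python
from unittest import mock

# Hypothetical stand-in for the unwrapped user code; in practice it would be
# obtained from the registered function rather than redefined in the test file.
def http_start(req, client):
    return client.schedule_new_orchestration("hello_orchestrator")

def test_http_start_schedules_an_orchestration():
    fake_client = mock.Mock()
    fake_client.schedule_new_orchestration.return_value = "instance-123"

    result = http_start(req=mock.Mock(), client=fake_client)

    fake_client.schedule_new_orchestration.assert_called_once()
    assert result == "instance-123"
```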
```python
stub = AzureFunctionsNullStub()
worker = DurableFunctionsWorker()
response: Optional[OrchestratorResponse] = None

def stub_complete(stub_response):
    nonlocal response
    response = stub_response

stub.CompleteOrchestratorTask = stub_complete
```
All of this is probably optimizable - do we really need to create a new stub and worker for each call? Can they be saved? Will look into this more at some point
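One possible shape for that optimization, sketched under the assumption that the stub and worker hold no per-invocation state other than the completion callback (module-level reuse, with a fresh response capture per call); the import paths are taken from the file list above and are unverified.

```python
# Hedged sketch of reusing one stub/worker across invocations instead of creating
# them per call. Whether the instances are actually safe to share (for example
# under concurrent invocations in one worker process) is the open question above.
from durabletask.azurefunctions.worker import DurableFunctionsWorker            # assumed path
from durabletask.azurefunctions.internal.azurefunctions_null_stub import AzureFunctionsNullStub  # assumed path

_shared_stub = AzureFunctionsNullStub()
_shared_worker = DurableFunctionsWorker()


def capture_orchestrator_response():
    """Swap in a per-invocation completion callback on the shared stub."""
    captured = {}

    def stub_complete(stub_response):
        captured["response"] = stub_response

    _shared_stub.CompleteOrchestratorTask = stub_complete
    return captured
```

If invocations can run concurrently in the same process, the per-call instances in the current code remain the safer choice.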
Can you add an overview / more detail to the PR description?
- Still needs eventSent and eventReceived implementations
- Add new_uuid method to OrchestrationContext for deterministic replay-safe UUIDs
- Fix entity locking behavior for Functions
- Align _RuntimeOrchestrationContext param names with OrchestrationContext
- Remap __init__.py files for new module
- Update version to 0.0.1dev0
- Add docstrings to missing methods
- Move code for executing orchestrators/entities to DurableFunctionsWorker
- Add function metadata to triggers for detection by extension
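On the new_uuid item: other Durable SDKs typically derive replay-safe GUIDs deterministically from orchestration state, for example a name-based (version 5) UUID over the instance ID, the orchestration's current UTC timestamp, and a per-replay counter. A sketch of that idea follows; the method name and exact inputs are assumptions, not this PR's final design.

```python
import uuid
from datetime import datetime

# Hypothetical sketch of a deterministic, replay-safe UUID helper. Every input is
# taken from orchestration state that is identical on replay, so uuid5 yields the
# same sequence of UUIDs each time the history is re-executed.
def new_uuid(instance_id: str, current_utc_datetime: datetime, counter: int) -> uuid.UUID:
    name = f"{instance_id}:{current_utc_datetime.isoformat()}:{counter}"
    return uuid.uuid5(uuid.NAMESPACE_URL, name)
```

The counter would live on the orchestration context and increment on each call, as other deterministic context APIs do.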
```yaml
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Set up Python 3.14
        uses: actions/setup-python@v5
        with:
          python-version: 3.14
      - name: Install dependencies
        working-directory: durabletask-azurefunctions
        run: |
          python -m pip install --upgrade pip
          pip install setuptools wheel tox
          pip install flake8
      - name: Run flake8 Linter
        working-directory: durabletask-azurefunctions
        run: flake8 .
      - name: Run flake8 Linter
        working-directory: tests/durabletask-azurefunctions
        run: flake8 .

  run-docker-tests:
```
Check warning (Code scanning / CodeQL): Workflow does not contain permissions (Medium)
```yaml
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.10", "3.11", "3.12", "3.13", "3.14"]
    env:
      EMULATOR_VERSION: "latest"
    needs: lint
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Pull Docker image
        run: docker pull mcr.microsoft.com/dts/dts-emulator:$EMULATOR_VERSION

      - name: Run Docker container
        run: |
          docker run --name dtsemulator -d -p 8080:8080 mcr.microsoft.com/dts/dts-emulator:$EMULATOR_VERSION

      - name: Wait for container to be ready
        run: sleep 10 # Adjust if your service needs more time to start

      - name: Set environment variables
        run: |
          echo "TASKHUB=default" >> $GITHUB_ENV
          echo "ENDPOINT=http://localhost:8080" >> $GITHUB_ENV

      - name: Install durabletask dependencies
        run: |
          python -m pip install --upgrade pip
          pip install flake8 pytest
          pip install -r requirements.txt

      - name: Install durabletask-azurefunctions dependencies
        working-directory: examples
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt

      - name: Install durabletask-azurefunctions locally
        working-directory: durabletask-azurefunctions
        run: |
          pip install . --no-deps --force-reinstall

      - name: Install durabletask locally
        run: |
          pip install . --no-deps --force-reinstall

      - name: Run the tests
        working-directory: tests/durabletask-azurefunctions
        run: |
          pytest -m "dts" --verbose

  publish:
```
Check warning (Code scanning / CodeQL): Workflow does not contain permissions (Medium)
Copilot Autofix (AI), about 4 hours ago:
To resolve the detected issue, you should add a permissions block to the workflow at the root level (before the jobs: key). This approach ensures that all jobs in the workflow inherit restrictive permissions unless overridden. The minimal starting point is contents: read, which is sufficient for most workflows that need to access the repository contents but do not need to write. If some jobs (such as a release/publish job) require more permissive access, you can further add job-level permissions blocks with additional scopes.
Steps to fix:
- Edit .github/workflows/durabletask-azurefunctions.yml.
- Insert a permissions: block directly after the workflow's name: key and before the on: key.
- The block should specify contents: read.
- This enforces least privilege, and all jobs (unless overridden) now run with only read access to the repository contents.
No imports or code changes are required, as this is a configuration file. No other regions of code need to change.
```diff
@@ -1,4 +1,6 @@
 name: Durable Task Scheduler SDK (durabletask-azurefunctions)
+permissions:
+  contents: read
 
 on:
   push:
```
```yaml
    if: startsWith(github.ref, 'refs/tags/azurefunctions-v') # Only run if a matching tag is pushed
    needs: run-docker-tests
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Extract version from tag
        run: echo "VERSION=${GITHUB_REF#refs/tags/azurefunctions-v}" >> $GITHUB_ENV # Extract version from the tag

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.14" # Adjust Python version as needed

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install build twine

      - name: Build package from directory durabletask-azurefunctions
        working-directory: durabletask-azurefunctions
        run: |
          python -m build

      - name: Check package
        working-directory: durabletask-azurefunctions
        run: |
          twine check dist/*

      - name: Publish package to PyPI
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN_AZUREFUNCTIONS }} # Store your PyPI API token in GitHub Secrets
        working-directory: durabletask-azurefunctions
        run: |
          twine upload dist/*
```
Check warning (Code scanning / CodeQL): Workflow does not contain permissions (Medium)
Copilot Autofix (AI), about 4 hours ago:
To fix this issue, explicitly set a permissions block for the whole workflow or for each job as needed. The ideal minimum is contents: read, unless the job requires other types of access. For the publish-release job, which releases to PyPI and does not create tags or releases or manipulate pull requests, contents: read is sufficient (it only reads package files and never writes to the repo).
We should therefore add the following block near the top of the workflow (to affect all jobs, unless overridden):
```yaml
permissions:
  contents: read
```

Alternatively, it could be added inside the publish-release job if only that job requires explicit restriction, but in almost all cases, setting it globally is clearer and safer (unless other jobs in the workflow require additional write privileges).
This change is made at the root of the YAML file, directly after the name: block and before the on: block.
No other code changes or external dependencies are required.
```diff
@@ -1,4 +1,6 @@
 name: Durable Task Scheduler SDK (durabletask-azurefunctions)
+permissions:
+  contents: read
 
 on:
   push:
```
```yaml
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Extract version from tag
        run: echo "VERSION=${GITHUB_REF#refs/tags/azurefunctions-v}" >> $GITHUB_ENV # Extract version from the tag

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.14" # Adjust Python version as needed

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install build twine

      - name: Append dev to version in pyproject.toml
        working-directory: durabletask-azurefunctions
        run: |
          sed -i 's/^version = "\(.*\)"/version = "\1.dev${{ github.run_number }}"/' pyproject.toml

      - name: Build package from directory durabletask-azurefunctions
        working-directory: durabletask-azurefunctions
        run: |
          python -m build

      - name: Check package
        working-directory: durabletask-azurefunctions
        run: |
          twine check dist/*

      - name: Publish package to PyPI
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN_AZUREFUNCTIONS }} # Store your PyPI API token in GitHub Secrets
        working-directory: durabletask-azurefunctions
        run: |
          twine upload dist/*
```
Check warning (Code scanning / CodeQL): Workflow does not contain permissions (Medium)
Copilot Autofix (AI), about 4 hours ago:
The recommended fix is to explicitly add a permissions: block to the workflow. This block should be placed at the top level of the workflow (directly after the name: field), thereby applying to all jobs unless jobs override it. The safest minimum is permissions: contents: read, which will suffice unless more elevated privileges are required (not evidenced in any of the given steps). No additional imports, methods, or new dependencies are needed; this is a YAML configuration change. Edit .github/workflows/durabletask-azurefunctions-dev.yml to add:
```yaml
permissions:
  contents: read
```

immediately after the name: field at line 1, before the on: trigger.
```diff
@@ -1,4 +1,6 @@
 name: Durable Task Scheduler SDK (durabletask-azurefunctions) Dev Release
+permissions:
+  contents: read
 
 on:
   workflow_run:
```
```yaml
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Extract version from tag
        run: echo "VERSION=${GITHUB_REF#refs/tags/azurefunctions-v}" >> $GITHUB_ENV # Extract version from the tag

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.14" # Adjust Python version as needed

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install build twine

      - name: Change the version in pyproject.toml to 0.0.0dev{github.run_number}
        working-directory: durabletask-azurefunctions
        run: |
          sed -i 's/^version = ".*"/version = "0.0.0.dev${{ github.run_number }}"/' pyproject.toml

      - name: Build package from directory durabletask-azurefunctions
        working-directory: durabletask-azurefunctions
        run: |
          python -m build

      - name: Check package
        working-directory: durabletask-azurefunctions
        run: |
          twine check dist/*

      - name: Publish package to PyPI
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN_AZUREFUNCTIONS }} # Store your PyPI API token in GitHub Secrets
        working-directory: durabletask-azurefunctions
        run: |
          twine upload dist/*
```
Check warning (Code scanning / CodeQL): Workflow does not contain permissions (Medium)
Copilot Autofix (AI), about 3 hours ago:
To fix the problem, we should add a permissions block to the workflow YAML file. The recommended approach is to set it at the root of the workflow, so all jobs in the workflow will inherit the minimal permissions unless overridden. Unless the workflow requires write access to repository contents, the safest default is contents: read, as recommended by CodeQL. Review the workflow steps: Code checkout, extracting tag versions, Python actions, and publishing to PyPI via a secret token, none of which require GITHUB_TOKEN write access. Therefore, adding the following block to the top level of the YAML file (right after the name: block) will implement the principle of least privilege:
```yaml
permissions:
  contents: read
```

This is to be inserted after the workflow name: and before the on: block for clarity and convention.
```diff
@@ -1,4 +1,6 @@
 name: Durable Task Scheduler SDK (durabletask-azurefunctions) Experimental Release
+permissions:
+  contents: read
 
 on:
   push:
```
DRAFT PR - for collaboration
Requires, at minimum, the changes in Azure/azure-functions-durable-extension#3260 to the durable WebJobs extension, allowing the gRPC protocol to be used for Python.