Add Azure AI Foundry support #9974
Conversation
Bug in the FoundryLocalManager where the model is always downloaded (not doing case-insensitive checks for local models). Bug that the WaitFor isn't propagating across, so the web app in the playground won't start.
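The case-insensitivity bug described above can be illustrated with a small sketch. This is Python for illustration only (the real fix is in the C# FoundryLocalManager); `needs_download`, `local_models`, and `requested_alias` are hypothetical stand-ins for the manager's cached-model check:

```python
def needs_download(local_models, requested_alias):
    """Return True only if no cached model matches, comparing case-insensitively.

    local_models and requested_alias are hypothetical stand-ins for the
    FoundryLocalManager's cached model list and the alias the user asked for.
    """
    wanted = requested_alias.casefold()
    return all(m.casefold() != wanted for m in local_models)

# A naive case-sensitive comparison would re-download "phi-3.5-mini" when the
# cache stores "Phi-3.5-mini"; the case-insensitive check does not.
print(needs_download(["Phi-3.5-mini"], "phi-3.5-mini"))  # → False
print(needs_download(["Phi-3.5-mini"], "qwen2.5-0.5b"))  # → True
```

With a plain `==` comparison the first call would return True and trigger a redundant download, which matches the symptom reported in the commit message.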
… requires. Not sure this is 100% right, as the ModelInfo value isn't available until the model is downloaded, so if you don't use WaitFor, Aspire will build the connection string with some null values. Also, while WaitFor pauses the web app from starting, it doesn't then release it to start for some reason.
This should be delegated to the hosting resource, which can provide a valid endpoint either generated by something like Foundry Local or supplied as expected via an endpoint from the Azure resource.
…nto sebros/aifoundry
{
    await rns.PublishUpdateAsync(resource, state => state with
    {
        State = KnownResourceStates.FailedToStart,
If this fails, does it log the reason why?
The case when the service is not set up is handled before that, logging a message.
/// Gets the connection string template for the manifest for the resource.
/// </summary>
public ReferenceExpression ConnectionStringExpression =>
    ReferenceExpression.Create($"Endpoint={AIFoundryApiEndpoint};EndpointAIInference={AIFoundryApiEndpoint}models");
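The expression above produces a semicolon-delimited key/value connection string in which `EndpointAIInference` is the base endpoint with `models` appended. As a minimal sketch of how a consumer could split it apart (Python for illustration; the endpoint URL and `parse_connection_string` helper are assumptions, not part of the PR):

```python
def parse_connection_string(cs):
    """Split 'Key=Value;Key=Value' segments into a dict.

    partition("=") splits on the first '=' only, so values that happen to
    contain '=' are preserved intact.
    """
    parts = {}
    for segment in cs.split(";"):
        if not segment:
            continue
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts

# Hypothetical endpoint, shaped like the template above.
cs = ("Endpoint=https://example.services.ai.azure.com/;"
      "EndpointAIInference=https://example.services.ai.azure.com/models")
parsed = parse_connection_string(cs)
print(parsed["Endpoint"])             # → https://example.services.ai.azure.com/
print(parsed["EndpointAIInference"])  # → https://example.services.ai.azure.com/models
```

Note the template appends `models` directly to `{AIFoundryApiEndpoint}`, which only yields a valid URL when the endpoint value ends with a trailing slash, as in the example here.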
What do you think about the Endpoint being the actual endpoint bicep output from the resource?
@sebastienros - let's follow up on this in a follow up PR.
This looks good. I just had a few comments.
{
    ArgumentNullException.ThrowIfNull(resource);

    return resource.Annotations.OfType<EmulatorResourceAnnotation>().Any();
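The emulator check above is just "does any annotation on the resource have the marker type", mirroring the C# `OfType<EmulatorResourceAnnotation>().Any()`. A hedged Python analog, with `Resource` and `EmulatorResourceAnnotation` as stand-in classes for illustration:

```python
class EmulatorResourceAnnotation:
    """Marker type, analogous to the C# annotation of the same name."""

class Resource:
    """Minimal stand-in for an Aspire resource carrying a list of annotations."""
    def __init__(self, annotations=None):
        self.annotations = annotations or []

def is_emulator(resource):
    # Mirrors resource.Annotations.OfType<EmulatorResourceAnnotation>().Any():
    # true as soon as one annotation of the marker type is present.
    return any(isinstance(a, EmulatorResourceAnnotation) for a in resource.annotations)

print(is_emulator(Resource([EmulatorResourceAnnotation()])))  # → True
print(is_emulator(Resource()))                                # → False
```

Because it is a type check over the annotation list, extending it later (e.g. to also accept a container annotation, as the reviewer suggests below) is a matter of widening the predicate rather than changing the call sites.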
We can extend this in the future to check for the container annotation as well.
* Create Aspire.Hosting.Azure.AIFoundry integration
* Remove comments
* Fix managed identity authentication
* Define system identity on the Foundry resource
* Shared Foundry SDK
* Initial implementation of Foundry Local from poc codebase
* First cut at model download support. Bug in the FoundryLocalManager where the model is always downloaded (not doing case-insensitive checks for local models). Bug that the WaitFor isn't propagating across and the web app in playground won't start
* Rename method to proper American English
* Fix connection string resolution for local deployments
* Set api key for foundry local
* Simplify deployment and pass Model for OpenAI
* Ensuring a user message is always present
* Fixing connection string to use ModelId not name, which Foundry Local requires. Not sure this is 100% right as the ModelInfo value isn't available until the model is downloaded, so if you don't do WaitFor Aspire will build the connection string with some null values. Also, the WaitFor, while it pauses the web app from starting, doesn't then release it to start for some reason
* Removing override on the endpoint for AI Inference SDK. This should be delegated to the hosting resource to provide a valid endpoint, either generated by something like Foundry Local, or provided as expected using an endpoint from the Azure resource
* Cleanup properties
* wip
* Refactor resources
* Refactor resources
* Fix connection string for OpenAI client
* Update Inference clients to add /models as necessary
* Add tests
* Add more tests
* Update readme
* Update sample
* Update src/Aspire.Hosting.Azure.AIFoundry/README.md (Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>)
* Clean up Aspire.Hosting.Azure.AIFoundry.csproj
* Use Foundry SDK package
* Disable strong name generation
* Update src/Aspire.Hosting.Azure.AIFoundry/Aspire.Hosting.Azure.AIFoundry.csproj (Co-authored-by: Eric Erhardt <eric.erhardt@microsoft.com>)
* Cleanup Extensions
* Merge extensions classes
* Replace Azure AI Service usage
* Improve custom credential scopes
* Remove unnecessary reference
* Define EndpointAIInference
* Fix build
* More feedback
* More feedback
* Rename IsLocal to IsEmulator
* Reuse resource instance for emulator
* Fix emulator detection
* Remove IResourceWithEndpoint
* Update logo
* Update manifests
* Fix test when foundry is set up on the machine
* Remove custom builder
* Remove WithManifestPublishingCallback
* Log when model download fails

---------
Co-authored-by: Aaron Powell <me@aaron-powell.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Eric Erhardt <eric.erhardt@microsoft.com>
Description
Add support for Azure AI Foundry models and Foundry Local.
Fixes #9012 #9568
Checklist
<remarks /> and <code /> elements on your triple slash comments?