feat: add Azure AI Foundry provider support #6969
alexpwrd wants to merge 2 commits into openclaw:main
Conversation
Adds native support for Azure AI Foundry (now Microsoft Foundry), allowing users to connect OpenAI and Anthropic models hosted on Azure under a single API key.

- Two provider IDs: `azure-ai-foundry` (OpenAI) and `azure-ai-foundry-anthropic` (Anthropic)
- Static model catalog with common Azure model IDs (GPT-5/4.1/4o series, Claude 4.5/4/3.5 series, o3/o4 reasoning)
- Auth fallback chain: `AZURE_AI_FOUNDRY_API_KEY` → `AZURE_FOUNDRY_API_KEY` → `AZURE_OPENAI_API_KEY`
- Endpoint discovery via `AZURE_FOUNDRY_OPENAI_ENDPOINT` and `AZURE_FOUNDRY_ANTHROPIC_ENDPOINT`
- Custom deployment names supported via explicit `models.providers` config

Closes openclaw#6056
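The auth fallback chain described above can be sketched as a simple ordered lookup. This is an illustrative sketch, not the PR's actual code; the function and constant names here are invented for the example (the real logic lives in `src/agents/models-config.providers.ts`).

```typescript
// Hypothetical sketch of the env-var fallback chain described in the PR.
const AZURE_FOUNDRY_KEY_VARS = [
  "AZURE_AI_FOUNDRY_API_KEY",
  "AZURE_FOUNDRY_API_KEY",
  "AZURE_OPENAI_API_KEY",
] as const;

// Returns the first env var in the chain that is set, or undefined if none are.
function resolveAzureFoundryKeyVar(
  env: Record<string, string | undefined> = process.env,
): string | undefined {
  return AZURE_FOUNDRY_KEY_VARS.find((name) => env[name] !== undefined);
}
```

Because `find` walks the array in order, `AZURE_AI_FOUNDRY_API_KEY` always wins when more than one variable is set.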
```ts
const azureFoundryKey =
  resolveEnvApiKeyVarName("azure-ai-foundry") ??
  resolveApiKeyFromProfiles({ provider: "azure-ai-foundry", store: authStore });
if (azureFoundryKey) {
```
azureFoundryKey is being set to the env var name (via resolveEnvApiKeyVarName) rather than a resolved credential value. That means the provider config may contain the string name of an env var instead of an actual token, leading to authentication failures when the implicit provider is used.
This should likely mirror the other implicit providers here by resolving a concrete key value for apiKey, and only using resolveEnvApiKeyVarName when you intentionally want an env-var placeholder.
Also appears at src/agents/models-config.providers.ts:632-634, 647-649, 654-656, 661-663, 676-678, 684-686.
This follows the same pattern as all other implicit providers in this file (Minimax, Moonshot, Ollama, Venice, Xiaomi, Synthetic) — see lines 633, 648, 661, 676, 684. resolveEnvApiKeyVarName returns the env var name intentionally; the runtime resolves it downstream. Not a regression.
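The distinction the thread is debating — storing the env var *name* and letting the runtime dereference it later, versus storing a concrete key value up front — can be sketched like this. Both function bodies here are illustrative stand-ins, not the project's actual implementations:

```typescript
// Style discussed above: resolve and store the env var NAME.
// (Stand-in for the project's resolveEnvApiKeyVarName; signature is assumed.)
function resolveEnvApiKeyVarName(
  candidates: string[],
  env: Record<string, string | undefined> = process.env,
): string | undefined {
  return candidates.find((name) => env[name] !== undefined);
}

// Downstream, the runtime turns the stored name into the concrete token.
function resolveKeyValue(
  varName: string,
  env: Record<string, string | undefined> = process.env,
): string | undefined {
  return env[varName];
}
```

Under the maintainer's reading, the config intentionally holds the output of the first function, and authentication works because the second step runs before any request is made.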
| "azure-ai-foundry-anthropic": { | ||
| baseUrl: "https://your-resource.openai.azure.com/anthropic", | ||
| apiKey: "${AZURE_AI_FOUNDRY_API_KEY}", | ||
| api: "anthropic-messages", |
The example uses an apiKey value that looks like an env-var interpolation. The runtime normalizer strips that wrapper and stores just the env var name, which is fine if users go through the normal models.json generation path; but if someone copies this into a context expecting a literal key value, it may be confusing. Consider adding a short note that this apiKey form is treated as an env-var reference and normalized accordingly.
Note: If this suggestion doesn't match your team's coding style, reply to this and let me know. I'll remember it for next time!
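The normalization step the reviewer describes — stripping the `${...}` wrapper and keeping just the variable name — could look roughly like this. A minimal sketch only; the project's actual normalizer and its exact accepted syntax are not shown in this thread:

```typescript
// Sketch: normalize an apiKey field that may be an env-var interpolation
// like "${AZURE_AI_FOUNDRY_API_KEY}" down to the bare variable name.
// Anything that doesn't match the wrapper pattern is treated as a literal key.
function normalizeApiKeyRef(apiKey: string): string {
  const match = /^\$\{([A-Z0-9_]+)\}$/.exec(apiKey);
  return match ? match[1] : apiKey; // literal values pass through unchanged
}
```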
Same pattern as the Venice provider docs (line 243). This is the standard convention for env-var references in explicit config examples throughout the project.
Hello, have you tested this with some of the other models that Azure provides too, such as DeepSeek?

Great work. @alexpwrd @Takhoffman is this coming to a release soon?

When will this feature release? I am waiting for the Azure Foundry support as well.

Thank you @alexpwrd, great work!

Really looking for this to get in.

Really looking for this to get in as well.

Closing as duplicate of #12059. If this is incorrect, comment and we can reopen.

@sebslight I don't get it, why close this one and not #12059?

Can this be reopened and merged, please?
Summary
- Two provider IDs: `azure-ai-foundry` (OpenAI-compatible) and `azure-ai-foundry-anthropic` (Anthropic-compatible)
- Auth fallback chain: `AZURE_AI_FOUNDRY_API_KEY` → `AZURE_FOUNDRY_API_KEY` → `AZURE_OPENAI_API_KEY`
- Endpoint discovery: `AZURE_FOUNDRY_OPENAI_ENDPOINT` / `AZURE_FOUNDRY_ANTHROPIC_ENDPOINT`
- Custom deployment names via explicit `models.providers` config

Notes
- Kept the provider ID `azure-ai-foundry` since the API endpoints still use Azure domains. Can be aliased/renamed later.
- Costs default to `0` (typical for Azure credit/sponsorship users). Users on pay-as-you-go can override via explicit config.

Test plan
- Typecheck passes (`tsc --noEmit`)
- Unit tests pass (`vitest run`)
- `openclaw models list` shows all catalog models with auth

Closes #6056
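For the custom-deployment path mentioned in the Summary, an explicit `models.providers` entry would follow the shape of the docs example quoted earlier in this thread. The resource URL is a placeholder, and the `${...}` form of `apiKey` is an env-var reference, not a literal key:

```json
{
  "azure-ai-foundry-anthropic": {
    "baseUrl": "https://your-resource.openai.azure.com/anthropic",
    "apiKey": "${AZURE_AI_FOUNDRY_API_KEY}",
    "api": "anthropic-messages"
  }
}
```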
Greptile Overview
Greptile Summary
This PR adds a new Azure AI Foundry provider integration by:
- Registering implicit providers `azure-ai-foundry` / `azure-ai-foundry-anthropic` with an env-var fallback chain.
- Adding endpoint discovery via `AZURE_FOUNDRY_*_ENDPOINT` env vars and a static model catalog.

The new providers plug into the existing `resolveImplicitProviders` flow (used by `ensureOpenClawModelsJson` to write/merge `models.json`) and into the shared env auth resolution in `src/agents/model-auth.ts`.

Confidence Score: 2/5
`resolveImplicitProviders` currently assigns the env var name (e.g. `AZURE_AI_FOUNDRY_API_KEY`) as the apiKey value, which will break authentication when the implicit providers are used. Aside from that, the changes are localized and consistent with existing provider patterns. Pay close attention to the apiKey handling around `resolveEnvApiKeyVarName`.