
feat: add Azure AI Foundry provider support #6969

Closed

alexpwrd wants to merge 2 commits into openclaw:main from alexpwrd:feat/azure-ai-foundry-provider

Conversation


@alexpwrd alexpwrd commented Feb 2, 2026

Summary

  • Adds native Azure AI Foundry (now Microsoft Foundry) provider support, addressing [Feature Request] Add native Azure OpenAI / Azure AI Foundry as model provider #6056
  • Two provider IDs sharing one API key: azure-ai-foundry (OpenAI-compatible) and azure-ai-foundry-anthropic (Anthropic-compatible)
  • Static model catalog with standard Azure model IDs (GPT-5/4.1/4o series, Claude 4.5/4/3.5 series, o3/o4 reasoning models)
  • Auth fallback chain: AZURE_AI_FOUNDRY_API_KEY → AZURE_FOUNDRY_API_KEY → AZURE_OPENAI_API_KEY
  • Endpoint auto-discovery via AZURE_FOUNDRY_OPENAI_ENDPOINT / AZURE_FOUNDRY_ANTHROPIC_ENDPOINT
  • Custom deployment names supported via explicit models.providers config
  • Documentation and test included
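The auth fallback chain above can be sketched as a small resolver. This is a hypothetical illustration of the documented behavior, not the PR's actual implementation (the real logic lives in src/agents/model-auth.ts, and its helper names may differ):

```typescript
// Env vars checked in priority order, per the fallback chain in the summary.
const AZURE_FOUNDRY_KEY_VARS = [
  "AZURE_AI_FOUNDRY_API_KEY",
  "AZURE_FOUNDRY_API_KEY",
  "AZURE_OPENAI_API_KEY",
] as const;

// Hypothetical resolver: returns the first non-empty value in the chain.
function resolveAzureFoundryApiKey(
  env: Record<string, string | undefined>,
): string | undefined {
  for (const name of AZURE_FOUNDRY_KEY_VARS) {
    const value = env[name]?.trim();
    if (value) return value;
  }
  return undefined;
}
```

Both provider IDs (azure-ai-foundry and azure-ai-foundry-anthropic) would go through the same chain, which is what lets them share one API key.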

Notes

  • Microsoft renamed Azure AI Foundry to "Microsoft Foundry" in Jan 2026. Provider ID uses azure-ai-foundry since the API endpoints still use Azure domains. Can be aliased/renamed later.
  • Model costs default to 0 (typical for Azure credit/sponsorship users). Users on pay-as-you-go can override via explicit config.
  • Follows the static catalog pattern used by Minimax/Moonshot providers.

Test plan

  • Type-check passes (tsc --noEmit)
  • Unit test passes (vitest run)
  • Tested locally with Azure AI Foundry endpoint (Claude Opus 4.5 via Anthropic-compatible API)
  • Verified openclaw models list shows all catalog models with auth
  • Verified gateway WebChat works end-to-end with Azure-hosted model

Closes #6056

Greptile Overview

Greptile Summary

This PR adds a new Azure AI Foundry provider integration by:

  • Extending auth resolution to support azure-ai-foundry / azure-ai-foundry-anthropic with an env-var fallback chain.
  • Auto-registering two implicit providers (OpenAI-compatible and Anthropic-compatible) based on AZURE_FOUNDRY_*_ENDPOINT env vars and a static model catalog.
  • Adding provider documentation and a small unit test.

The new providers plug into the existing resolveImplicitProviders flow (used by ensureOpenClawModelsJson to write/merge models.json) and into the shared env auth resolution in src/agents/model-auth.ts.

Confidence Score: 2/5

  • Not safe to merge as-is due to incorrect API key resolution for implicit providers.
  • The Azure AI Foundry provider addition is straightforward, but resolveImplicitProviders currently assigns the env var name (e.g. AZURE_AI_FOUNDRY_API_KEY) as the apiKey value, which will break authentication when the implicit providers are used. Aside from that, the changes are localized and consistent with existing provider patterns.
  • src/agents/models-config.providers.ts (implicit provider apiKey resolution), plus a quick scan of other implicit providers using resolveEnvApiKeyVarName.


Adds native support for Azure AI Foundry (now Microsoft Foundry),
allowing users to connect OpenAI and Anthropic models hosted on Azure
under a single API key.

- Two provider IDs: azure-ai-foundry (OpenAI) and azure-ai-foundry-anthropic (Anthropic)
- Static model catalog with common Azure model IDs (GPT-5/4.1/4o series, Claude 4.5/4/3.5 series, o3/o4 reasoning)
- Auth fallback chain: AZURE_AI_FOUNDRY_API_KEY → AZURE_FOUNDRY_API_KEY → AZURE_OPENAI_API_KEY
- Endpoint discovery via AZURE_FOUNDRY_OPENAI_ENDPOINT and AZURE_FOUNDRY_ANTHROPIC_ENDPOINT
- Custom deployment names supported via explicit models.providers config

Closes openclaw#6056
@openclaw-barnacle openclaw-barnacle bot added the docs (Improvements or additions to documentation) and agents (Agent runtime and tooling) labels Feb 2, 2026
Contributor

@greptile-apps greptile-apps bot left a comment


2 files reviewed, 2 comments


Comment on lines +692 to +695

```typescript
const azureFoundryKey =
  resolveEnvApiKeyVarName("azure-ai-foundry") ??
  resolveApiKeyFromProfiles({ provider: "azure-ai-foundry", store: authStore });
if (azureFoundryKey) {
```
Contributor


azureFoundryKey is being set to the env var name (via resolveEnvApiKeyVarName) rather than a resolved credential value. That means the provider config may contain the string name of an env var instead of an actual token, leading to authentication failures when the implicit provider is used.

This should likely mirror the other implicit providers here by resolving a concrete key value for apiKey, and only using resolveEnvApiKeyVarName when you intentionally want an env-var placeholder.

Also appears at src/agents/models-config.providers.ts:632-634, 647-649, 654-656, 661-663, 676-678, 684-686.


Author


This follows the same pattern as all other implicit providers in this file (Minimax, Moonshot, Ollama, Venice, Xiaomi, Synthetic) — see lines 633, 648, 661, 676, 684. resolveEnvApiKeyVarName returns the env var name intentionally; the runtime resolves it downstream. Not a regression.
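A hedged sketch of the convention this reply describes: the implicit-provider config stores the env var name as a placeholder, and a downstream step dereferences it at runtime. The names below are illustrative, not the project's actual API:

```typescript
type ImplicitProvider = { apiKey?: string };

// Hypothetical downstream resolution step. If apiKey names a set env var,
// substitute its value; otherwise treat the stored string as a literal key.
function dereferenceApiKey(
  provider: ImplicitProvider,
  env: Record<string, string | undefined>,
): string | undefined {
  if (!provider.apiKey) return undefined;
  return env[provider.apiKey] ?? provider.apiKey;
}
```

Under this reading, writing the env var name into models.json is intentional: the secret is never persisted to disk, only its reference.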

Comment on lines +98 to +101

```json5
"azure-ai-foundry-anthropic": {
  baseUrl: "https://your-resource.openai.azure.com/anthropic",
  apiKey: "${AZURE_AI_FOUNDRY_API_KEY}",
  api: "anthropic-messages",
```
Contributor


The example uses an apiKey value that looks like an env-var interpolation. The runtime normalizer strips that wrapper and stores just the env var name, which is fine if users go through the normal models.json generation path; but if someone copies this into a context expecting a literal key value, it may be confusing. Consider adding a short note that this apiKey form is treated as an env-var reference and normalized accordingly.



Author


Same pattern as the Venice provider docs (line 243). This is the standard convention for env-var references in explicit config examples throughout the project.
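To make the convention concrete, here is a hypothetical complete models.providers fragment ("your-resource" is a placeholder for the user's Azure resource name):

```json5
// Hypothetical explicit-config sketch. The "${...}" form is an env-var
// reference: the normalizer stores the variable name, and the runtime
// resolves the actual key at request time. It is not a literal key value.
{
  models: {
    providers: {
      "azure-ai-foundry-anthropic": {
        baseUrl: "https://your-resource.openai.azure.com/anthropic",
        apiKey: "${AZURE_AI_FOUNDRY_API_KEY}",
        api: "anthropic-messages",
      },
    },
  },
}
```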

@Takhoffman
Contributor

Hello, have you tested this with some of the other models Azure provides, such as DeepSeek?

@Takhoffman Takhoffman self-assigned this Feb 2, 2026
@timmoh

timmoh commented Feb 3, 2026

Great work. @alexpwrd @Takhoffman, is this coming in a release soon?

@shudonglin

When will this feature be released? I am waiting for Azure Foundry support as well.

@davidhgd

davidhgd commented Feb 5, 2026

Thank you @alexpwrd great work!

@rafalzawadzki

Really looking forward to this getting in.
Previous contributions addressing this were automatically closed despite the need.

@nightfullstar

Really looking forward to this getting in as well.

@sebslight
Member

Closing as duplicate of #12059. If this is incorrect, comment and we can reopen.

@sebslight sebslight closed this Feb 13, 2026
@schupat

schupat commented Feb 13, 2026

@sebslight I don't get it. Why close this one and not #12059?
This one has been open for almost two weeks; the other one for just a couple of days.

@Sockolet

Can this be reopened and merged, please?


Labels

agents (Agent runtime and tooling), docs (Improvements or additions to documentation)


Development

Successfully merging this pull request may close these issues.

[Feature Request] Add native Azure OpenAI / Azure AI Foundry as model provider

10 participants