feat(memory-lancedb): native Azure OpenAI support#47285

Open
Restry wants to merge 1 commit into openclaw:main from Restry:feature/azure-openai-support

Conversation


@Restry Restry commented Mar 15, 2026

This PR adds native Azure OpenAI support to the memory-lancedb plugin.

It automatically detects Azure endpoints (via .openai.azure.com in baseUrl) and injects the required api-key header and api-version query parameter (defaulting to 2024-02-01).

This allows users to use Azure OpenAI for embeddings without needing an intermediate proxy like LiteLLM or OneAPI, significantly reducing friction for enterprise users.

@Restry
Author

Restry commented Mar 15, 2026

This PR (re)implements native Azure OpenAI support for the memory-lancedb plugin, addressing the feedback from closed PR #25.

Key Features:

  1. Auto-Detection: Automatically detects Azure endpoints by checking for .openai.azure.com in baseUrl.
  2. Seamless Auth: Injects the required api-key header instead of Authorization: Bearer.
  3. API Versioning: Injects the required api-version query parameter (defaults to 2024-02-01 stable).

Why Native?
While proxy solutions (LiteLLM/OneAPI) exist, they introduce significant friction for enterprise users who just want to point baseUrl to their Azure resource. This change is minimal, safe, and dramatically improves the out-of-the-box experience for Azure users.

I am committed to adding tests and making api-version configurable if this direction is accepted.
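The detection-plus-injection behavior described above can be sketched roughly as follows. This is a minimal standalone sketch, not the plugin's actual code; the ClientOptions interface here is a simplified stand-in for the OpenAI SDK's constructor options:

```typescript
// Simplified stand-in for the OpenAI SDK's constructor options.
interface ClientOptions {
  apiKey: string;
  baseURL?: string;
  defaultHeaders?: Record<string, string>;
  defaultQuery?: Record<string, string>;
}

// Build client options, injecting Azure auth and api-version only when the
// endpoint looks like an Azure OpenAI resource.
function azureClientOptions(baseUrl: string | undefined, apiKey: string): ClientOptions {
  const isAzure = baseUrl?.includes(".openai.azure.com") ?? false;
  return {
    apiKey,
    baseURL: baseUrl,
    ...(isAzure
      ? {
          defaultHeaders: { "api-key": apiKey },
          defaultQuery: { "api-version": "2024-02-01" },
        }
      : {}),
  };
}
```

For a plain OpenAI baseUrl the conditional spread contributes nothing, so the request keeps the SDK's default Authorization: Bearer auth.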

@greptile-apps
Contributor

greptile-apps bot commented Mar 15, 2026

Greptile Summary

This PR adds native Azure OpenAI support to the memory-lancedb plugin's Embeddings class by detecting Azure endpoints (via .openai.azure.com in baseUrl) and automatically injecting the api-key header and api-version query parameter required by Azure OpenAI's REST API. This is a clean, minimal change that avoids requiring a proxy such as LiteLLM.

  • The detection heuristic (baseUrl?.includes(".openai.azure.com")) is reliable for all standard Azure OpenAI resource URLs.
  • The api-key header injection is the correct mechanism for Azure authentication through the OpenAI SDK.
  • The api-version is hardcoded to "2024-02-01" — one of the earlier GA releases — with no way for users to override it via config. config.ts's assertAllowedKeys does not accept an apiVersion field, so users on tenants that deprecate 2024-02-01, or those needing features from a newer API version, currently have no path to configure this.

Confidence Score: 4/5

  • PR is safe to merge; it adds correct Azure auth headers but pins a specific API version that cannot be overridden by users.
  • The Azure detection and header injection logic is functionally correct and will work for current Azure OpenAI embedding models. The single concern is the hardcoded api-version: "2024-02-01" with no config-level escape hatch, which is a maintainability issue rather than a correctness bug today.
  • Affected code: extensions/memory-lancedb/index.ts (line 179) and the companion extensions/memory-lancedb/config.ts (which would need an apiVersion field added to support user-configurable versions).
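The allowlist change Greptile alludes to could look roughly like this. A hypothetical sketch only: the real assertAllowedKeys in config.ts may have a different signature and error shape.

```typescript
// Hypothetical widened allowlist for the embedding config section;
// "apiVersion" is the proposed addition.
const ALLOWED_EMBEDDING_KEYS: string[] = [
  "apiKey",
  "model",
  "baseUrl",
  "dimensions",
  "apiVersion", // newly allowed
];

// Reject any config object containing a key outside the allowlist.
function assertAllowedKeys(obj: Record<string, unknown>, allowed: string[]): void {
  for (const key of Object.keys(obj)) {
    if (!allowed.includes(key)) {
      throw new Error(`Unexpected config key: ${key}`);
    }
  }
}
```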
Last reviewed commit: 805ad58

```ts
...(isAzure
  ? {
      defaultHeaders: { "api-key": apiKey },
      defaultQuery: { "api-version": "2024-02-01" },
    }
  : {}),
```
Hardcoded api-version with no override path

The Azure API version is pinned to "2024-02-01", which is one of the earlier GA releases. Azure OpenAI regularly releases new API versions and eventually retires old ones. Users who need features from newer versions (or whose tenant deprecates 2024-02-01) will have no recourse short of forking the plugin.

Looking at config.ts, the assertAllowedKeys guard at line 108 only allows ["apiKey", "model", "baseUrl", "dimensions"], so there is currently no way to pass an apiVersion setting through the plugin config.

Consider:

  1. Adding an optional apiVersion field to the embedding section of config.ts (and updating assertAllowedKeys to accept it), and
  2. Using that value here with a sensible default.
Suggested change:

```diff
-            defaultQuery: { "api-version": "2024-02-01" },
+            defaultQuery: { "api-version": "2024-08-01-preview" },
```

At minimum, bumping the default to a more recent stable version reduces the risk of hitting a sudden deprecation.
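Step 2 of the suggestion could be sketched as follows, where apiVersion is the proposed (not-yet-existing) config field:

```typescript
// Pinned fallback; used only when the user does not supply apiVersion.
const DEFAULT_API_VERSION = "2024-02-01";

// Prefer a user-configured value, otherwise fall back to the default.
function resolveApiVersion(configured?: string): string {
  return configured ?? DEFAULT_API_VERSION;
}
```

The resolved value would then replace the hardcoded string in the defaultQuery injection.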



@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 805ad58f05


```ts
  private dimensions?: number,
) {
  this.client = new OpenAI({ apiKey, baseURL: baseUrl });
  const isAzure = baseUrl?.includes(".openai.azure.com");
```

P1: Handle Azure AI Foundry hosts in endpoint detection

The new Azure detection only checks for .openai.azure.com, so valid Azure AI Foundry endpoints such as https://<resource>.services.ai.azure.com/... are not treated as Azure and therefore do not get the required api-key header / api-version query injection. OpenClaw already classifies both host families as Azure in src/commands/onboard-custom.ts, so this mismatch will cause memory embedding calls against *.services.ai.azure.com to be sent with OpenAI-style auth and fail at runtime.
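A broadened check along the lines Codex describes might look like this (a sketch, not the repo's actual code):

```typescript
// Treat both Azure OpenAI resource hosts and Azure AI Foundry hosts as Azure,
// matching the two host families named in the review.
function isAzureEndpoint(baseUrl?: string): boolean {
  if (!baseUrl) return false;
  return (
    baseUrl.includes(".openai.azure.com") ||
    baseUrl.includes(".services.ai.azure.com")
  );
}
```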

