fix: add pnpm patch for pi-ai to support LiteLLM providerType #23
Closed
tiagoefreitas wants to merge 1 commit into openclaw:main from
Conversation
When using Claude models via a LiteLLM proxy, the bundled pi-ai sends OpenAI-specific parameters (`store`, `max_completion_tokens`) that cause errors with non-OpenAI backends like AWS Bedrock.

This patch adds `providerType` support to `openai-completions.js`:

- Check `model.providerType` to determine whether the model is OpenAI-native
- Only send OpenAI-specific params to actual OpenAI models
- Use `max_tokens` instead of `max_completion_tokens` for proxy models

Users can now configure custom models in `~/.pi/agent/models.json` with `providerType: "anthropic"` (or `"google"`, etc.) to properly route requests through LiteLLM to Claude and other non-OpenAI models.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <[email protected]>
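The gating the patch describes can be sketched roughly as follows. This is a hypothetical illustration of the approach, not the actual patched `openai-completions.js`; the function name `buildCompletionParams` and the request/model field names are assumptions.

```javascript
// Hypothetical sketch of the providerType gating described above;
// the real patched openai-completions.js may differ in shape and naming.
function buildCompletionParams(model, request) {
  // Treat a model as OpenAI-native only when providerType says so
  // explicitly, or when no providerType is configured at all.
  const isOpenAI = !model.providerType || model.providerType === "openai";

  const params = {
    model: model.id,
    messages: request.messages,
  };

  if (isOpenAI) {
    // OpenAI-specific parameters that proxies like LiteLLM may reject
    // when routing to Bedrock/Anthropic backends.
    params.store = false;
    params.max_completion_tokens = request.maxTokens;
  } else {
    // Non-OpenAI backends expect the classic field name.
    params.max_tokens = request.maxTokens;
  }

  return params;
}
```

The key design point is that the default (no `providerType`) preserves the old OpenAI behavior, so existing configs keep working and only explicitly tagged proxy models get the alternate parameter set.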
Contributor
Interesting addition! You can maintain your own fork of pi-ai like I do in the config: Also try upstreaming your changes to pi-ai! That's the best long-term solution.
Pastorsimon1798 pushed a commit to Pastorsimon1798/openclaw that referenced this pull request on Feb 1, 2026
- Create analyze.py (profile, describe, groupby, pivot, filter)
- Create excel.py (read, write, list-sheets, merge/VLOOKUP)
- Create visualize.py (bar, line, scatter, heatmap, histogram, pie, box)
- Update SKILL.md with venv instructions
- Mark openclaw#14 Blogwatcher RESOLVED (installed)
- Mark openclaw#15 Data Analytics RESOLVED
- Mark openclaw#17 Calendar CANCELLED (using GOG)
- Mark openclaw#18 Edison PAUSED
- Mark openclaw#23 Network Fallback PAUSED
dgarson referenced this pull request in dgarson/clawdbot on Feb 2, 2026
feat(ui): add skeleton loading screens for overseer and cron views
alexprime1889-prog pushed a commit to alexprime1889-prog/moltbot that referenced this pull request on Feb 8, 2026
…t entrypoint script
slathrop referenced this pull request in slathrop/openclaw-js on Feb 11, 2026
Tasks completed: 2/2

- Port PR/issue guides + Install sharpening (commits #15, #17, #23)
- Port PR sign-off + guide revisions + markdownlint fixes (commits #24, #26, #35)

SUMMARY: .planning/phases/17-docs/17-03-SUMMARY.md
frodo-harborbot added a commit to harborworks/openclaw that referenced this pull request on Feb 16, 2026
…openclaw#23)

The database service's environment block used `${DATABASE_PASSWORD:-postgres}`, which is interpolated by docker compose from the host shell, where the var is never set, so it always defaults to 'postgres'. The backend reads .env.prod and gets the real password, causing a mismatch.

Fix: remove the environment block from the database service and pass POSTGRES_DB, POSTGRES_USER, POSTGRES_PASSWORD directly via .env.prod. The deploy workflow now writes these vars into .env.prod alongside the existing DATABASE_* vars.
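The interpolation pitfall that commit describes can be shown with a minimal compose fragment. This is an illustrative sketch, not the repository's actual compose file; the service name, image, and `env_file` approach are assumptions about how the fix might look.

```yaml
# Broken: ${DATABASE_PASSWORD:-postgres} is expanded by docker compose
# from the HOST shell environment, not from .env.prod, so it silently
# falls back to 'postgres' whenever the host var is unset.
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: ${DATABASE_PASSWORD:-postgres}

# One possible fix, as described in the commit: drop the environment
# block and feed POSTGRES_* straight from .env.prod instead.
#
#   services:
#     db:
#       image: postgres:16
#       env_file:
#         - .env.prod
```

The underlying rule: compose performs `${VAR}` interpolation before containers start, using the host shell (or a `--env-file` passed to compose itself), while `env_file:` entries are delivered directly into the container's environment.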
@claude please review this PR
Summary
- Patches `@mariozechner/pi-ai` to support a `providerType` field for custom models
- The bundled pi-ai sends OpenAI-specific parameters (`store`, `max_completion_tokens`) that cause errors with non-OpenAI backends like AWS Bedrock
- Checks `model.providerType` to determine if the model is OpenAI-native before sending OpenAI-specific params

Problem
When configuring custom models in `~/.pi/agent/models.json` that route through LiteLLM to Claude (e.g. AWS Bedrock), the bundled pi-ai would send:

- `store: false` (OpenAI-specific)
- `max_completion_tokens` instead of `max_tokens`

This caused errors like:
Solution
The patch adds `providerType` support to `openai-completions.js`:

- If `model.providerType === "openai"` or the baseUrl indicates OpenAI, send OpenAI-specific params
- Otherwise, use standard params (`max_tokens`)

Users can now configure custom models with `providerType: "anthropic"` (or `"google"`, etc.) to properly route requests.

Test plan
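For reference, an entry of the kind described might look like the fragment below. This is a hypothetical sketch: the PR only confirms the file path and the `providerType` field, so the other keys (`id`, `name`, `baseUrl`) and the overall `models.json` schema are assumptions.

```json
{
  "models": [
    {
      "id": "claude-opus-4-5",
      "name": "Claude Opus 4.5 (via LiteLLM)",
      "baseUrl": "http://localhost:4000/v1",
      "providerType": "anthropic"
    }
  ]
}
```

With such an entry, requests go to the LiteLLM proxy's OpenAI-compatible endpoint, but the `providerType` tag tells the patched pi-ai to withhold `store` and use `max_tokens` instead of `max_completion_tokens`.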
🤖 Generated with Claude Code