fix(#1105): allow custom_providers hostnames through SSRF check #1113
Closed
bergeouss wants to merge 1 commit into nesquena:master from
Conversation
- Build trusted hostname set from `custom_providers[].base_url` in `config.yaml`
- These are user-explicitly configured endpoints, not SSRF risks
- Hardcoded allowlist (ollama, localhost, 127.0.0.1, lmstudio) still active
- Unknown private IPs still blocked
- Add 7 tests (5 source analysis + 2 functional with mocked socket)
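A minimal sketch of the first bullet, assuming hypothetical helper and config names (the actual change lives in `api/config.py`):

```python
from urllib.parse import urlparse

# Hardcoded allowlist that was already present, per the PR description.
KNOWN_LOCAL_HOSTS = {"ollama", "localhost", "127.0.0.1", "lmstudio", "lm-studio"}


def build_trusted_hosts(config: dict) -> set[str]:
    """Collect hostnames from custom_providers[].base_url in config.yaml.

    These endpoints were explicitly configured by the server admin,
    so they are treated as trusted for the SSRF check.
    """
    trusted = set(KNOWN_LOCAL_HOSTS)
    for provider in config.get("custom_providers", []):
        base_url = provider.get("base_url")
        if base_url:
            host = urlparse(base_url).hostname
            if host:
                trusted.add(host.lower())
    return trusted
```

With a config such as `{"custom_providers": [{"base_url": "http://llamaswap.lan:8080/v1"}]}`, the returned set contains `llamaswap.lan` alongside the hardcoded entries.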
Collaborator
Merged in v0.50.221 via PR #1117. Thank you @bergeouss — great contribution (SSRF custom providers fix)! 🎉
Thinking Path

Only `ollama`, `localhost`, `127.0.0.1`, `lmstudio`, and `lm-studio` are allowed by the hardcoded list. Endpoints in `custom_providers` are not SSRF risks, so collect every `custom_providers[].base_url` and add it to the trusted set before the SSRF check runs.

What Changed
`api/config.py`: before the SSRF check in `_build_available_models_uncached()`, build a `_ssrf_trusted_hosts` set from all `custom_providers[].base_url` entries. The `is_known_local` check now also accepts any hostname in this set. The original hardcoded allowlist is unchanged.

Why It Matters
Users running local inference servers (llama.cpp, vLLM, TabbyAPI, llama-swap) with custom hostnames get `SSRF: resolved hostname to private IP` errors, and their models never appear in the dropdown. After this fix, any endpoint explicitly configured in `custom_providers` is trusted.

Verification
- `pytest tests/test_issue1105_ssrf_custom_providers.py -v`: 7/7 pass
- `pytest tests/test_model_resolver.py tests/test_custom_provider_display_name.py -v`: 26/27 pass (1 pre-existing failure)
- `py_compile api/config.py`: OK

Security Note
This does NOT weaken SSRF protection. The trusted hostnames come exclusively from `custom_providers` in `config.yaml`, a file the server admin controls. Unknown private IPs are still blocked.

Risks / Follow-ups
- `model` field populates dropdown #1106 (custom_providers models dict): both fixes together make local model servers fully work

Model Used
Closes #1105