fix(init): wizard probes openai Python client (closes #407) #408
Merged
Conversation
The wizard's OpenAI branch validated the API key over the network but
skipped the ``_have_module("openai")`` probe that the other three
provider/tokenizer axes use. Meanwhile ``mm init -y --provider openai``
refuses upfront when the ``openai`` Python client is missing (#402). A
user who ran the wizard, watched a successful API-key test, and then
copied the same choice into a scripted ``-y`` install hit the same
discoverability gap #405 closed for Ollama/ONNX/kiwipiepy.
Mirror the Ollama pattern: before prompting for the API key, warn if
the client is missing, surface the install hint + ``-y`` refuse note,
and mark ``openai`` inline-warned so the post-wizard summary doesn't
duplicate. The key-validation + save flow runs regardless — the user
still gets a clean check that their key works.
Pin test extended with a fourth axis (count >= 1 for openai).
Co-Authored-By: Claude <[email protected]>
(#408) All three preceding branches print "Install with" → "Saving config now" → `-y` refuse note, in that order. OpenAI had the save message after the refuse hint, which is cosmetic but breaks reader expectation.
Co-Authored-By: Claude <[email protected]>
Closes #407 (follow-up to #405 / #403 / #402).
Problem
Three of the four provider/tokenizer axes in the wizard already probe the Python client before prompting:

- `import fastembed` → warn + save + `_y_refuse_hint`
- `_ollama_available()` (binary) + `_have_module("ollama")` (Python) → warn + save + `_y_refuse_hint` (#405)
- `import kiwipiepy` → warn + `_y_refuse_hint`

The OpenAI path didn't: it only validated the API key (`_test_openai_key`, network). Meanwhile `mm init -y --provider openai` refuses upfront if the `openai` Python client is missing (`_collect_missing_extras`). So:

- wizard + no `openai` package + valid key → silent pass, config saved, runtime falls back or errors later;
- `-y --provider openai` + no `openai` package → refuses upfront.

A user who copied a successful wizard choice into a scripted `-y` install hit the same discoverability gap #405 closed for the other three axes.

Change
Mirror the Ollama pattern in the OpenAI branch, before the API-key prompt:

- warn if the `openai` client is missing, surfacing the install hint and the `-y` refuse note;
- key validation + save still run after the warning, so the user still gets a clean key-check and the warn-and-save semantic is preserved;
- the `_extras_warned_inline` marker prevents the post-wizard summary from duplicating the warning.

Pin test extended with a fourth axis (`_count_calls(embedding_src, "--provider openai", "openai") >= 1`).

Test plan
- `uv run pytest packages/memtomem/tests/test_init_cmd.py -k "YRefuseHint"`: 2 passed.
- `test_init_cmd.py`: 202 passed.
- `ruff check` + `ruff format --check`: clean.

Scope

Four-axis coverage now complete. No behavior change for users who have the `openai` package installed. No change to `-y` semantics.
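Put together, the mirrored branch shape can be sketched as below. All names here (`provider_branch`, `validate_key`, the `extras_warned_inline` key) are illustrative stand-ins, not the repo's actual code; the point is the ordering — probe, warn inline, then unconditionally validate and save:

```python
import importlib.util


def _have_module(name: str) -> bool:
    # Probe importability without importing (sketch of the repo helper).
    return importlib.util.find_spec(name) is not None


def provider_branch(state: dict, module: str, prompt, warn, validate_key) -> None:
    """Hypothetical shape of a wizard provider branch after this change."""
    if not _have_module(module):
        warn(f"{module} Python client not found.")
        warn(f"Install with: pip install {module}")
        warn(f"Note: `mm init -y --provider {module}` refuses while it is missing.")
        # Inline-warned marker so the post-wizard summary doesn't duplicate it.
        state.setdefault("extras_warned_inline", set()).add(module)

    # Key validation + save run regardless of the probe outcome.
    key = prompt("API key: ")
    state["key_ok"] = validate_key(key)
    state["api_key"] = key


# Exercising the missing-client path with stubs:
state: dict = {}
messages: list[str] = []
provider_branch(state, "definitely_missing_client_xyz",
                prompt=lambda m: "sk-test", warn=messages.append,
                validate_key=lambda k: True)
```

Because the warning never short-circuits the branch, a user with a missing client still walks away with a saved config and a verified key, plus an explicit pointer at why a later `-y` run would refuse.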