Part of #642. (Linked as a sub-issue of the parent epic.)
Goal
Integrate Cursor CLI (or other officially documented Cursor headless/agent entrypoint) as an LLM_PROVIDER via the shared non-interactive CLI path (LLMCLIAdapter + CLIBackedLLMClient), analogous to OpenAI Codex (codex exec-style: one-shot invocation, no REPL, no approval prompts, suitable for subprocess.run without a TTY).
Non-interactive requirement (must-have)
Same contract as Codex CLI: the integration must use a single, scriptable CLI invocation that accepts the flattened prompt (stdin and/or argv from build()), runs to completion, and exits with a clear status code—not an interactive session, wizard, or tool that blocks on user input.
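The contract above can be sketched as a minimal one-shot runner. The Cursor binary name and flags are unknown pending the spike, so the argv is passed in by the caller rather than assumed:

```python
import subprocess

def run_one_shot(argv: list[str], prompt: str, timeout: float = 300.0) -> str:
    """One scriptable invocation: flattened prompt on stdin, answer on stdout."""
    result = subprocess.run(
        argv,
        input=prompt,      # prompt via stdin; no TTY, no REPL, no approval prompts
        capture_output=True,
        text=True,
        timeout=timeout,   # bounded: a hung agent must not block the investigation
    )
    if result.returncode != 0:  # clear status code, surfaced as a failure
        raise RuntimeError(f"CLI exited {result.returncode}: {result.stderr[:200]}")
    return result.stdout
```

With a confirmed Cursor entrypoint this would be called as, e.g., run_one_shot(["cursor-agent", "--print"], prompt), where both the binary name and the flag are assumptions to be verified against the vendor docs.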
Spike first: Cursor is IDE-first; confirm a supported non-interactive command before building a full adapter. If no headless contract exists, keep this issue as spike-only and stop.
Spike (if needed): confirm the exact subcommand(s) and flags for the non-interactive path; document binary name(s), auth mechanism, and how to probe “logged in vs not vs unknown”.
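A sketch of the probe shape the spike should produce. The status argv and the output strings matched below are assumptions; only the three-state mapping (auth confirmed / not authenticated / unclear) follows the documented pattern:

```python
import shutil
import subprocess
from typing import Optional

def probe_logged_in(status_argv: list[str]) -> Optional[bool]:
    """Three-state auth probe: True / False / None (unknown). Never raises."""
    if shutil.which(status_argv[0]) is None:
        return None                      # binary absent: auth state is unknowable
    try:
        result = subprocess.run(
            status_argv, capture_output=True, text=True, timeout=10
        )
    except (OSError, subprocess.SubprocessError):
        return None                      # probe itself failed: report "unclear"
    if result.returncode == 0:
        return True                      # auth confirmed
    combined = (result.stdout + result.stderr).lower()
    if "not logged in" in combined or "unauthorized" in combined:
        return False                     # not authenticated
    return None                          # nonzero exit for some other reason
```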
Scope
Adapter in app/integrations/llm_cli/ implementing LLMCLIAdapter (detect, build, parse, explain_failure).
Registry entry in registry.py (adapter_factory, model_env_key) + LLMProvider / validators in app/config.py (same string as registry key).
Subprocess env: extend _SAFE_SUBPROCESS_ENV_PREFIXES in runner.py when the CLI relies on vendor env vars; add tests that required keys are forwarded.
Wizard (optional): ProviderOption with credential_kind="cli" and adapter_factory in app/cli/wizard/config.py.
Tests under tests/integrations/llm_cli/ (detect/build/failure; mock subprocess / shutil.which as appropriate).
User path (Quick Start): opensre onboard then opensre investigate works with this LLM_PROVIDER, matching Tracer-Cloud/opensre#quick-start (README sample alert/fixture or a documented equivalent).
Cross-OS (Windows, Linux, macOS): binary resolution, env forwarding, and non-interactive invocation behave correctly on each OS the vendor CLI supports; extend tests for platform-specific branches (e.g. *.exe, path rules) like other CLI adapters. If the vendor ships only some OSes, state that explicitly in the PR.
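Put together, the scope items suggest an adapter of roughly this shape. CLIProbe, CLIInvocation, and LLMCLIAdapter are the repo's real types, but the field names below are assumed from the issue text, and the cursor-agent binary plus the --print/--model flags are guesses pending the spike:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CLIProbe:                       # field names assumed from the issue text
    binary: Optional[str]
    logged_in: Optional[bool]         # True / False / None, per AGENTS.md

@dataclass
class CLIInvocation:                  # field names assumed from the issue text
    argv: list[str]
    stdin: str
    timeout: float
    cwd: Optional[str] = None

class CursorAdapter:
    """Shape-only sketch; the real adapter implements the LLMCLIAdapter protocol."""

    def detect(self) -> CLIProbe:
        ...  # resolve binary, probe auth; must never raise

    def build(self, prompt: str, model: Optional[str] = None) -> CLIInvocation:
        argv = ["cursor-agent", "--print"]       # assumed binary and flag
        if model:                                # empty/unset -> omit override
            argv += ["--model", model]
        return CLIInvocation(argv=argv, stdin=prompt, timeout=600.0)

    def parse(self, stdout: str, stderr: str, returncode: int) -> str:
        ...  # extract the answer; no secrets in errors

    def explain_failure(self, stderr: str, returncode: int) -> str:
        ...  # human-readable reason; redact tokens/keys
```

The model handling in build() mirrors the CODEX_MODEL semantics called out below: an empty or unset model means no override flag is emitted and the CLI default applies.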
Acceptance criteria
detect() never raises; returns CLIProbe with logged_in in {True, False, None} per the three-state pattern in app/integrations/llm_cli/AGENTS.md (auth confirmed / not authenticated / unclear).
build() produces CLIInvocation that the runner can execute with subprocess.run (no interactive/TUI assumptions); timeout and cwd are sensible and documented.
parse / explain_failure handle success and failure without leaking secrets.
Binary resolution uses resolve_cli_binary with an explicit *_BIN-style override env + PATH + fallbacks, consistent with other CLI adapters.
LLM_PROVIDER=... selects the new client via CLI_PROVIDER_REGISTRY; optional model env mirrors CODEX_MODEL semantics (empty/unset → omit override; CLI default applies).
CI: repository quality gates pass (make lint, make typecheck, make test-cov or the usual PR checks).
End-to-end demo: Share a recording, GIF, or numbered steps that follow Quick Start: opensre onboard (this provider selected/configured), then opensre investigate with the README sample or an equivalent fixture, through a completed investigation.
Cross-OS: Verify Windows, Linux, and macOS for the onboarding + investigate path (or document vendor/OS limits and how we degrade). PR should mention what was exercised on each OS or point to CI matrix coverage.
.env.example: Add commented entries for <PROVIDER>_BIN, <PROVIDER>_MODEL, and any other user-facing env vars the integration introduces.
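The detect tests called for above can mock shutil.which so CI needs no Cursor install. A minimal pytest-style sketch, with a toy stand-in for the adapter (the binary name and detect logic are hypothetical; real tests would import the actual adapter):

```python
import shutil
from unittest import mock

def detect_logged_in():
    """Toy stand-in for adapter.detect(); real tests import the real adapter."""
    if shutil.which("cursor-agent") is None:   # assumed binary name
        return None                            # missing binary -> unknown, not an error
    return True                                # real code would run the auth probe

def test_detect_is_none_when_binary_missing():
    with mock.patch("shutil.which", return_value=None):
        assert detect_logged_in() is None

def test_detect_when_binary_on_path():
    with mock.patch("shutil.which", return_value="/usr/bin/cursor-agent"):
        assert detect_logged_in() is True
```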
Implementation reference
This repo: app/integrations/llm_cli/AGENTS.md and app/integrations/llm_cli/codex.py (reference implementation).
Non-interactive exec mental model (one link): OpenAI Codex CLI — one-shot / non-TTY usage comparable to our codex exec adapter; vendor flags will differ.
Out of scope
Interactive-only CLIs with no supported non-interactive mode.
Long-running agent UIs, approval loops, or integrations that do not map to a single invoke call.
Vendor documentation (required)
Cursor CLI overview (official): https://cursor.com/docs/cli/overview