Add MCP message inspector to dev apps UI #3570
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 3c6996b73e
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
```python
if request.method == "POST" and chunks:
    _log_response_bytes(
        message_log,
        b"".join(chunks),
        resp.headers.get("content-type", ""),
```
Capture GET SSE traffic in the inspector
FastMCP's streamable-HTTP app keeps GET enabled for the standalone SSE notification stream (src/fastmcp/server/http.py:329-353), but the proxy only feeds _log_response_bytes(...) when request.method == "POST". In any stateful session that uses server-initiated notifications—such as task status or list-change events—the new inspector will show none of that traffic, so it does not actually display all MCP messages flowing through /mcp.
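One way to broaden the gate is to decide loggability from method and content type together, a minimal sketch assuming a hypothetical `should_log_stream` helper placed in front of the existing `_log_response_bytes(...)` call:

```python
def should_log_stream(method: str, content_type: str) -> bool:
    """Decide whether the proxy should mirror a response into the inspector.

    Hypothetical helper: mirrors POST responses (JSON or SSE) and also the
    standalone GET /mcp SSE stream that FastMCP keeps open for
    server-initiated notifications, instead of checking method == "POST".
    """
    ct = content_type.lower()
    if method == "POST":
        return "application/json" in ct or "text/event-stream" in ct
    if method == "GET":
        # the standalone SSE notification stream
        return "text/event-stream" in ct
    return False
```

The real proxy hook may gate on other conditions as well; the point is only that GET SSE traffic must pass the check.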
```python
chunks: list[bytes] = []
try:
    async for chunk in resp.aiter_bytes():
        chunks.append(chunk)
        yield chunk
```
Log SSE POST events before the stream closes
The Streamable HTTP transport allows a POST to return text/event-stream and send JSON-RPC notifications/requests before the final response. Here each chunk is buffered into chunks and parsed only from the finally block, so long-running or streaming tool calls will not appear in the panel until the server closes the stream; if the stream is kept open for polling/resumption, they never appear in real time at all. That breaks the new inspector's main debugging path for streaming responses.
💡 Codex Review
Reviewed commit: 4a83c1dc08
```javascript
window.addEventListener("message", function(event) {
  var data = event.data;
  if (typeof data === "string") {
    try { data = JSON.parse(data); } catch(e) { return; }
  }
  if (!data || typeof data !== "object") return;
  if (!data.jsonrpc && !data.method) return;
  fetch("/api/logs/bridge", {
    method: "POST",
    headers: {"Content-Type": "application/json"},
    body: JSON.stringify({body: data})
  });
```
Log host-to-iframe AppBridge traffic
On /launch, the messages developers most need to inspect are the host-side bridge.sendToolInput(...) / bridge.sendToolResult(...) calls at src/fastmcp/cli/apps_dev.py:369-372. This hook only listens for window.message, which fires for messages received by the host, not for postMessage calls sent from the host to the iframe, so those outgoing payloads never reach /api/logs/bridge. In any app where the renderer gets the wrong args or result, the inspector will show only half of the AppBridge exchange and miss the critical payload entirely.
💡 Codex Review
Reviewed commit: dfff699e74
```python
# Server-initiated notifications have "method" but no "id"
if "method" in body and "id" not in body:
```
Handle server-initiated MCP requests separately
FastMCP tools can legitimately send server-initiated JSON-RPC requests to the client, e.g. sampling via context.session.create_message() (src/fastmcp/server/sampling/run.py:520-528) and elicitation via self.session.elicit() (src/fastmcp/server/context.py:1135-1140). Those frames have both method and id, but this branch only treats method-without-id as non-responses, so the inspector records them as ordinary responses and drops the method name. In any app that uses sampling or elicitation, the new panel will hide the actual server request developers are trying to debug.
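The three JSON-RPC frame shapes can be told apart by their fields rather than by method-without-id alone; a sketch of the classification the review asks for (helper name hypothetical):

```python
def classify_frame(body: dict) -> str:
    """Classify a JSON-RPC 2.0 frame by shape.

    Server-initiated requests (sampling, elicitation) carry BOTH
    "method" and "id", so method-without-id is not the only
    non-response case the inspector needs to handle.
    """
    has_method = "method" in body
    has_id = "id" in body
    if has_method and has_id:
        return "request"        # client- or server-initiated request
    if has_method:
        return "notification"   # e.g. notifications/message from ctx.log()
    if has_id:
        return "response"       # result or error for an earlier request
    return "invalid"
```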
```javascript
var levelOrder = ["debug", "info", "warning", "error"];
var minLevel = 0;
```
Include every supported MCP log level in the filter order
FastMCP accepts MCP log levels beyond debug/info/warning/error (src/fastmcp/server/context.py:112-120, 548-555), including notice, critical, alert, and emergency. shouldShow() compares each notification's level with levelOrder.indexOf(lv), so any of those valid levels gets -1 and is filtered out even on the default Debug+ setting. That makes the inspector silently hide legitimate ctx.log(...) output for some of the most important severities.
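The panel itself is JavaScript, but the fix is really the ordering data. A sketch of the logic in Python with the full MCP level set (the RFC 5424 severities the spec uses), where unknown levels fail open instead of disappearing:

```python
# All log levels defined by the MCP spec, in ascending severity.
MCP_LEVELS = ["debug", "info", "notice", "warning",
              "error", "critical", "alert", "emergency"]

def should_show(level: str, min_level: str) -> bool:
    """Return True if a notification at `level` passes the `min_level` filter.

    Levels not in the list are shown rather than silently filtered out,
    unlike an indexOf() that returns -1.
    """
    try:
        return MCP_LEVELS.index(level) >= MCP_LEVELS.index(min_level)
    except ValueError:
        return True
```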
Adds a right-side inspector panel to `fastmcp dev apps` that captures all MCP communication for debugging. Shows JSON-RPC requests/responses flowing through the proxy (blue/green arrows) and AppBridge postMessage traffic between the Prefab app and host (purple arrows). Entries show method, timing, smart summaries, and expand to full JSON on click. Panel is resizable by dragging and dismissible.
Log messages from ctx.log() and other server-initiated notifications now appear in the inspector panel with an orange ↓ arrow, distinct from request/response traffic.
…o app
Move panel to left side, add category filters (segmented toggle colored by message type) and a log level dropdown, show level badges on notifications, highlight errors with a red tint, flash new entries, add a per-entry copy button, fix notification summaries, and add an inspector demo app.
Force-pushed from cb96aab to 218690d.
💡 Codex Review
Reviewed commit: 218690dbf0
```python
chunks: list[bytes] = []
try:
    async for chunk in resp.aiter_bytes():
        chunks.append(chunk)
        yield chunk
```
Stop buffering entire streamed MCP responses
_stream_and_cleanup() now appends every response chunk to chunks and only inspects them from finally. That means any response which stays open for a while—most notably the stateful standalone GET /mcp SSE stream, but also long-running POST SSE responses—keeps growing an in-memory copy inside the proxy even though the bytes have already been forwarded to the browser. In a notification-heavy session this can make fastmcp dev apps consume unbounded memory and eventually stall or OOM; the inspector needs incremental parsing instead of whole-stream buffering.
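One way to parse incrementally while still forwarding bytes untouched is to keep only the unterminated tail of the stream in memory; a sketch with hypothetical names (the real `_stream_and_cleanup()` also has cleanup duties this omits):

```python
from typing import AsyncIterator, Callable

async def stream_and_log(
    body: AsyncIterator[bytes],
    on_event: Callable[[str], None],
    separator: str = "\n\n",
) -> AsyncIterator[bytes]:
    """Forward chunks immediately while parsing complete SSE frames.

    Only the unterminated tail stays buffered, so a long-lived stream
    no longer grows an unbounded copy inside the proxy, and events are
    logged as they arrive instead of from a finally block.
    """
    buf = ""
    async for chunk in body:
        yield chunk  # forward to the browser first
        buf += chunk.decode("utf-8", errors="replace")
        while separator in buf:
            frame, buf = buf.split(separator, 1)
            for line in frame.splitlines():
                if line.startswith("data: "):
                    on_event(line[len("data: "):])
```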
```python
if isinstance(req_json, list):
    for item in req_json:
        message_log.log_request(item)
else:
    message_log.log_request(req_json)
```
Guard request logging against non-object JSON bodies
This logging path assumes every parsed POST payload or batch element is a dict, but malformed JSON-RPC such as 1, null, or [1] is still valid JSON and should be forwarded so the MCP server can return a protocol error. Here message_log.log_request(...) will call .get() on the scalar and raise AttributeError, so the dev proxy returns a 500 before the request ever reaches the server.
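A defensive wrapper along these lines (name hypothetical) keeps the inspector from crashing the proxy while still letting malformed-but-valid JSON reach the server:

```python
def log_request_payload(message_log, payload) -> None:
    """Log a parsed POST body without assuming dicts all the way down.

    Scalars like 1, null, or [1] are valid JSON and must still be
    forwarded so the server can answer with a JSON-RPC protocol error;
    the inspector simply skips them instead of raising AttributeError.
    """
    items = payload if isinstance(payload, list) else [payload]
    for item in items:
        if isinstance(item, dict):
            message_log.log_request(item)
```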
Guard request logging against non-dict JSON bodies that would crash with AttributeError. Parse SSE events as they arrive instead of buffering the entire stream, avoiding unbounded memory growth on long-lived connections.
💡 Codex Review
Reviewed commit: 2a2f10fe41
```python
sse_buf += chunk.decode("utf-8", errors="replace")
while "\n\n" in sse_buf:
    event, sse_buf = sse_buf.split("\n\n", 1)
    for line in event.splitlines():
        if line.startswith("data: "):
```
Parse SSE frames with CRLF separators
FastMCP's streamable-HTTP server emits SSE via mcp.server.streamable_http.StreamableHTTPServerTransport, which uses sse_starlette.EventSourceResponse; sse_starlette.event.ServerSentEvent.DEFAULT_SEPARATOR is "\r\n". Because this parser only looks for "\n\n", sse_buf never reaches a frame boundary for real FastMCP SSE traffic, so message_log.log_response(...) is never called. That makes streaming POST responses and server notifications/progress sent over SSE disappear from the inspector even though the proxy forwards them successfully.
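A separator-tolerant splitter is small; a sketch (function name hypothetical) that accepts CRLF, LF, and bare-CR blank lines as frame boundaries:

```python
import re

# An SSE frame ends at a blank line; the line terminator may be
# \r\n (sse_starlette's default), \n, or \r.
_FRAME_BOUNDARY = re.compile(r"\r\n\r\n|\n\n|\r\r")

def split_sse_frames(buf: str) -> tuple[list[str], str]:
    """Split complete SSE frames off the front of `buf`.

    Returns (frames, remainder). The remainder is the unterminated
    tail to keep for the next chunk.
    """
    frames = []
    while True:
        m = _FRAME_BOUNDARY.search(buf)
        if m is None:
            return frames, buf
        frames.append(buf[: m.start()])
        buf = buf[m.end():]
```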
```javascript
function poll() {
  fetch("/api/logs?since=" + lastId)
    .then(function(r) { return r.ok ? r.json() : []; })
```
Serialize log polling to avoid duplicate inspector rows
In any slow tab or busy dev server, setInterval(poll, 500) can start a second /api/logs request before the previous one resolves. Since each request captures the same lastId and there is no in-flight guard or dedupe, whichever response arrives second will append entries that were already rendered by the first response, inflating both the counter and the visible log list.
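An in-flight guard is one fix; another is to dedupe by entry id on append, so overlapping polls are harmless. The panel code is JavaScript, but the logic is language-agnostic; a Python sketch (field names hypothetical):

```python
def append_new_entries(rendered: list, incoming: list) -> int:
    """Append only entries with an id above the last rendered one.

    Even if two overlapping polls return the same rows, each entry id
    is appended at most once. Returns the new high-water mark.
    """
    last_id = rendered[-1]["id"] if rendered else -1
    for entry in incoming:
        if entry["id"] > last_id:
            rendered.append(entry)
            last_id = entry["id"]
    return last_id
```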
💡 Codex Review
Reviewed commit: d7933c58e5
```python
self._request_methods[jsonrpc_id] = method
self._request_times[jsonrpc_id] = timestamp
```
Correlate requests by session as well as JSON-RPC id
_MessageLog is shared across the whole dev server, while both the picker host and each /launch page create independent MCP Clients. Here the outstanding-request maps are keyed only by body["id"], so two concurrent sessions that both emit id=0/id=1 will overwrite each other and the next response will pick up the wrong method/timing. In practice, opening two dev tabs or keeping the picker open while launching an app makes the inspector mislabel responses, which defeats the point of showing request/response pairs.
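A sketch of session-scoped correlation, assuming the proxy can extract some session identifier (e.g. the `Mcp-Session-Id` header) per request; class and method names are hypothetical:

```python
import time

class PendingRequests:
    """Track outstanding JSON-RPC requests keyed by (session, id).

    Keying on a session identifier as well as the JSON-RPC id keeps
    concurrent clients (the picker plus each /launch tab) from
    overwriting each other's method/timing entries.
    """

    def __init__(self) -> None:
        self._pending: dict = {}

    def record(self, session_id: str, body: dict) -> None:
        if "method" in body and "id" in body:
            self._pending[(session_id, body["id"])] = (
                body["method"], time.monotonic())

    def resolve(self, session_id: str, body: dict):
        """Pop (method, start_time) for a response, or None if unknown."""
        return self._pending.pop((session_id, body.get("id")), None)
```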
When developing FastMCP apps with `fastmcp dev apps`, the main debugging question is: "did my Prefab send the right thing, and did the server respond correctly?" Until now, that meant sprinkling print statements or guessing.

This adds a branded inspector panel on the right side of the dev UI that captures all MCP traffic in real time. It intercepts JSON-RPC messages flowing through the `/mcp` proxy (requests and responses) and AppBridge `postMessage` traffic between the Prefab iframe and host page (like `mcp/addContext` or `mcp/createMessage`).

Each entry shows direction, method, timing, and a smart summary. Click to expand the full JSON-RPC body. The panel is resizable by dragging its left edge, dismissible with a close button, and auto-scrolls to new messages unless you've scrolled up to inspect older ones.