NotebookLM-quality document Q&A — but through MCP, powered by the official Gemini File Search API.
You upload your documents. You ask a question. Every sentence in the answer cites the exact passage from your files. No hallucination, no guessing, no "I think the document says...". Just grounded answers with receipts.
Every existing NotebookLM MCP server reverse-engineers Google's internal web UI. They scrape the page with Playwright, steal cookies, and pray Google doesn't change a CSS selector. Spoiler: Google changes things. Your MCP breaks. Your Google account gets flagged.
This one uses the official Gemini File Search API. It doesn't scrape anything. It doesn't need your Google login. Just an API key from aistudio.google.com.
Upload PDFs, DOCX, TXT, HTML, or any of 80+ supported formats. Ask questions across one or multiple document stores. Get answers like this:
```
The maximum liability is limited to the total fees paid
in the preceding twelve months [1]. Clients must pay all
invoices within forty-five days of receipt [2].

---
Citations:
[1] Contract_A.pdf: "The maximum liability under this agreement
    shall not exceed the total fees paid in the preceding
    twelve (12) months." (confidence: 92%)
[2] Contract_A.pdf: "The Client shall pay all invoices within
    forty-five (45) days of receipt." (confidence: 88%)
```
Every [1], [2] maps to the exact passage from your document. Not a summary. The actual text.
```bash
# Set your Gemini API key
export GEMINI_API_KEY=your-key-from-aistudio.google.com

# Add to Claude Code
claude mcp add notebooklm-api npx @mcpware/notebooklm-api
```

Or add to Claude Desktop (`~/.claude/mcp.json`):

```json
{
  "mcpServers": {
    "notebooklm-api": {
      "command": "npx",
      "args": ["@mcpware/notebooklm-api"],
      "env": { "GEMINI_API_KEY": "your-key" }
    }
  }
}
```

Then just talk to Claude naturally. The three-step workflow:
Step 1: Create a store (one time per topic)
"Create a store called 'Legal Docs'"
Step 2: Upload your files (one time per topic)
"Upload all PDFs from ~/contracts/ to the Legal Docs store"
Step 3: Ask questions (as many as you want)
"What are the liability clauses across all contracts?"
"Which contract has the shortest termination notice period?"
"Does any contract mention non-compete?"
Claude calls create_store, upload_document, and ask automatically. You don't need to know store names or API details. Each store is independent, so you can have separate stores for different topics (legal docs, tax records, research papers, etc.) and query them individually or together.
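Under the hood, those tool calls map onto the official File Search REST API. Here is a minimal sketch of the requests the create and ask steps translate to; the `v1beta` endpoint paths and field names (`fileSearchStores`, the `fileSearch` tool inside `generateContent`) are our reading of the public docs, so verify them against the current Gemini API reference before relying on them:

```typescript
// Sketch: the HTTP requests the workflow steps roughly translate to.
// Endpoint paths and field names are assumptions based on the public
// v1beta Gemini File Search API; check the official docs.
const BASE = "https://generativelanguage.googleapis.com/v1beta";

// Step 1: create a document store.
function createStoreRequest(displayName: string) {
  return {
    method: "POST",
    url: `${BASE}/fileSearchStores`,
    body: { displayName },
  };
}

// Step 3: ask a question grounded in one or more stores.
function askRequest(model: string, question: string, storeNames: string[]) {
  return {
    method: "POST",
    url: `${BASE}/models/${model}:generateContent`,
    body: {
      contents: [{ role: "user", parts: [{ text: question }] }],
      tools: [{ fileSearch: { fileSearchStoreNames: storeNames } }],
    },
  };
}

// Example: query a single (hypothetical) store.
const req = askRequest(
  "gemini-2.5-flash",
  "What are the liability clauses?",
  ["fileSearchStores/legal-docs"],
);
```

The MCP server hides all of this; the sketch only shows why no browser automation is involved.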
| Tool | What it does |
|---|---|
| `create_store` | Create a document store |
| `list_stores` | List all stores |
| `upload_document` | Upload a file (PDF, DOCX, TXT, etc.) to a store. Waits for indexing to finish. |
| `list_documents` | List documents in a store |
| `ask` | Ask a question. Returns the answer with per-passage citations. |
| `delete_document` | Remove a document |
| `delete_store` | Remove a store |
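Concretely, when Claude invokes `ask` on your behalf, the tool call might carry arguments like these (the argument names here are illustrative, not necessarily the server's exact schema):

```json
{
  "name": "ask",
  "arguments": {
    "store": "Legal Docs",
    "question": "What are the liability clauses across all contracts?"
  }
}
```

The result comes back as a formatted answer with inline [1], [2] markers and a citation list, as shown in the example earlier.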
When you call `ask`, the Gemini File Search API does three things:
- Searches your uploaded documents for relevant passages (vector similarity, not keyword match)
- Generates an answer grounded in those passages
- Returns `groundingSupports`, a map from each sentence in the answer to the exact source passage
We parse that map into inline [1], [2] markers and a citation list. This is the part every competitor gets wrong — they dump the raw API response without parsing it. We give you something you can actually read and verify.
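As an illustration of that parsing step, here is a self-contained sketch (simplified, not the server's actual code) that turns a `groundingSupports`-style list into inline [1], [2] markers plus a citation list. The real response nests this data under `candidates[0].groundingMetadata`; the shapes below are trimmed to the essentials:

```typescript
// Simplified shapes; the real API nests these under
// candidates[0].groundingMetadata. Not the server's actual code.
interface Chunk { file: string; text: string }
interface Support {
  segment: { endIndex: number };  // where the supported span of the answer ends
  chunkIndices: number[];         // which retrieved chunks back this span
  confidence: number;             // 0..1
}

function renderCitations(answer: string, chunks: Chunk[], supports: Support[]): string {
  const order = new Map<number, number>();  // chunk index -> citation number
  const conf = new Map<number, number>();   // chunk index -> confidence
  let out = "";
  let cursor = 0;
  // Walk supports left to right, appending "[n]" after each supported span.
  for (const s of [...supports].sort((a, b) => a.segment.endIndex - b.segment.endIndex)) {
    out += answer.slice(cursor, s.segment.endIndex);
    cursor = s.segment.endIndex;
    for (const ci of s.chunkIndices) {
      if (!order.has(ci)) {
        order.set(ci, order.size + 1);
        conf.set(ci, s.confidence);
      }
      out += ` [${order.get(ci)}]`;
    }
  }
  out += answer.slice(cursor);
  // Append the citation list with the quoted passages.
  out += "\n---\nCitations:\n";
  for (const [ci, n] of order) {
    out += `[${n}] ${chunks[ci].file}: "${chunks[ci].text}" ` +
      `(confidence: ${Math.round((conf.get(ci) ?? 0) * 100)}%)\n`;
  }
  return out;
}
```

Each support names the span of the answer it backs (`segment.endIndex`) and the chunks behind it; the sketch appends a numbered marker after each span and lists every cited chunk once, with its quoted text.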
Should you just use NotebookLM instead? If the web UI works for you, yes. This tool is for when you want:
- Document Q&A inside your coding agent (Claude Code, Cursor, etc.)
- Programmatic access — upload 100 documents, ask 50 questions, extract structured data
- Stability — no browser automation, no cookies, no scraping
- Your own API key — you control the cost and the data
On the free tier, Google may use your data to improve its products. On the paid tier (link a billing account), it doesn't. If your documents are sensitive, use the paid tier; see the Gemini API terms of service for details.
- Go to aistudio.google.com
- Click "Get API key"
- Set it as `GEMINI_API_KEY`
The free tier gives you enough to try everything. Paid tier for production use.
MIT