Extract notation tables from arXiv papers using LLMs.
```bash
pip install notations-cli
```

Or with uv:

```bash
uv add notations-cli
```

```bash
# From arXiv ID
notations 2006.11239

# From arXiv URL
notations https://arxiv.org/abs/2006.11239

# From local .tex files
notations /path/to/tex/folder

# With options
notations 2006.11239 --model gpt-5.2-2025-12-11 --provider openai
notations 2006.11239 --model anthropic/claude-sonnet-4.5 --provider openrouter
notations 2006.11239 --output my_paper    # custom base name → .json/.html/.md
notations 2006.11239 --terminal           # also print table to terminal
notations 2006.11239 --no-comments        # strip LaTeX comments first
notations 2006.11239 --no-expand-macros   # disable macro expansion
notations 2006.11239 --no-filter-body     # keep all extracted notations

# Re-render from existing JSON (no LLM call)
notations 2006_11239_notations.json
notations 2006_11239_notations.json -t    # also print to terminal
```

By default, LaTeX macro definitions (\newcommand, etc.) are expanded inline before processing. Use --no-expand-macros to disable this.
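To illustrate what inline macro expansion means, here is a minimal Python sketch. It is a simplified stand-in, not the CLI's actual implementation: it only handles zero-argument `\newcommand` definitions with brace-free bodies, while the real tool covers far more of LaTeX.

```python
import re

def expand_simple_macros(tex: str) -> str:
    """Illustrative only: expand zero-argument \\newcommand definitions inline.
    Assumes brace-free macro bodies; the real CLI handles arguments and nesting."""
    # Collect \newcommand{\name}{body} pairs.
    defs = dict(re.findall(r"\\newcommand\{\\(\w+)\}\{([^{}]*)\}", tex))
    # Drop the definitions themselves so only uses remain.
    tex = re.sub(r"\\newcommand\{\\\w+\}\{[^{}]*\}", "", tex)
    # Replace each macro use with its body (lambda avoids backslash-escape issues).
    for name, body in defs.items():
        tex = re.sub(r"\\" + re.escape(name) + r"\b", lambda m, b=body: b, tex)
    return tex
```

With expansion applied, a notation table sees the underlying symbols (e.g. `\mathbf x`) rather than project-specific macro names.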
Notations are also filtered by default to only include symbols that appear in the document body (\begin{document}...\end{document}), removing artifacts from preamble-only macro definitions. Use --no-filter-body to disable this.
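The body-filtering step can likewise be sketched in Python. This is an assumption-laden simplification of the idea, not the CLI's code: it just isolates the `\begin{document}...\end{document}` region that the filter checks symbols against.

```python
import re

def document_body(tex: str) -> str:
    """Illustrative sketch of what --no-filter-body disables: return only the
    region between \\begin{document} and \\end{document}, so preamble-only
    macro definitions can be ignored when filtering notations."""
    m = re.search(r"\\begin\{document\}(.*)\\end\{document\}", tex, re.DOTALL)
    # Fall back to the full source if no document environment is found.
    return m.group(1) if m else tex
```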
Set your API key as an environment variable:

```bash
# For OpenAI
export OPENAI_API_KEY=...

# For OpenRouter
export OPENROUTER_API_KEY=...
```

Generates a self-contained HTML file with:
- Paper metadata (title, authors, arXiv link)
- Searchable notation table
- LaTeX rendering via KaTeX
```bash
git clone https://github.com/takashiishida/notations-cli.git
cd notations-cli
uv sync
uv run pytest
```