An AI-powered book discovery platform built with Next.js 15, LangGraph, and your choice of LLM provider (Ollama by default; OpenAI, Anthropic, and others supported).
- Intelligent Search: Search millions of books across Google Books and Open Library APIs
- Multi-language Support: Search in any language (Chinese, English, etc.)
- AI-Powered Analysis: Get AI-generated book summaries, themes, and recommendations
- Interest Matching: Discover if a book matches your interests with a personalized score
- Modern UI: Built with shadcn/ui and Tailwind CSS for a beautiful experience
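To make the interest-matching feature concrete, here is a hypothetical sketch of a 0–100 match score as a Jaccard overlap between a user's interest keywords and a book's subject tags. The real app delegates scoring to the LLM, so `interestScore` and its keyword-overlap approach are illustrative assumptions, not the actual algorithm.

```typescript
// Hypothetical interest-match score: Jaccard overlap between the user's
// interest keywords and a book's subject tags, scaled to 0-100.
// The app's real scoring is LLM-based; this only illustrates the idea.
function interestScore(interests: string[], subjects: string[]): number {
  const a = new Set(interests.map((s) => s.toLowerCase()));
  const b = new Set(subjects.map((s) => s.toLowerCase()));
  const shared = [...a].filter((s) => b.has(s)).length;
  const union = new Set([...a, ...b]).size;
  return union === 0 ? 0 : Math.round((shared / union) * 100);
}

// Two of four distinct tags overlap -> 50.
console.log(interestScore(["sci-fi", "ai", "history"], ["ai", "sci-fi", "robotics"]));
```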
- Framework: Next.js 15 (App Router)
- UI: shadcn/ui + Tailwind CSS
- AI Orchestration: LangGraph.js
- AI Models: Multi-provider support (OpenAI, Anthropic, Ollama, DeepSeek, etc.)
- Language: TypeScript
- Book APIs: Google Books API, Open Library API, Internet Archive
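Both book sources are queried over their public HTTP search endpoints. As a rough sketch (the endpoint URLs below are the publicly documented ones, but the app's own client code may differ), the search URLs can be built like this:

```typescript
// Sketch of query-URL construction for the two public book search APIs.
// Endpoints are the documented public ones; parameter choices here are minimal.
const GOOGLE_BOOKS = "https://www.googleapis.com/books/v1/volumes";
const OPEN_LIBRARY = "https://openlibrary.org/search.json";

function googleBooksUrl(query: string, maxResults = 10): string {
  const params = new URLSearchParams({ q: query, maxResults: String(maxResults) });
  return `${GOOGLE_BOOKS}?${params}`;
}

function openLibraryUrl(query: string, limit = 10): string {
  const params = new URLSearchParams({ q: query, limit: String(limit) });
  return `${OPEN_LIBRARY}?${params}`;
}

// URLSearchParams percent-encodes the query, so non-English searches work too.
console.log(googleBooksUrl("三体"));
```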
- Node.js 18+
- pnpm
```bash
# Install dependencies
pnpm install

# (Optional) Copy the environment variables template
cp env.example .env.local

# (Optional) Install Ollama for a local LLM
# Download from: https://ollama.ai
# Then run: ollama pull llama3.2

# No API keys required! The app uses Ollama by default.
# You can configure other providers in the Settings page.

pnpm dev
```

Open http://localhost:3000 to see the app.
```bash
pnpm build
pnpm start
```

The app uses Ollama by default, which runs locally on your machine. No API keys or internet connection required!
- Install Ollama: https://ollama.ai
- Pull a model: `ollama pull llama3.2`
- Start the app: `pnpm dev`
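Under the hood, a local Ollama server listens on port 11434 and accepts generation requests at its `/api/generate` endpoint. The sketch below builds such a request as plain data (the app itself talks to Ollama through its LangGraph layer, so treat this as an illustration, not the actual client code):

```typescript
// Sketch of a request to a local Ollama server's /api/generate endpoint.
// The app routes model calls through LangGraph; this only shows the shape
// of the underlying HTTP request.
interface OllamaRequest {
  url: string;
  body: { model: string; prompt: string; stream: boolean };
}

function buildOllamaRequest(prompt: string, model = "llama3.2"): OllamaRequest {
  return {
    url: "http://localhost:11434/api/generate",
    body: { model, prompt, stream: false },
  };
}

// With Ollama running, this would be sent as:
//   await fetch(req.url, { method: "POST", body: JSON.stringify(req.body) });
const req = buildOllamaRequest("Summarize the themes of Dune in two sentences.");
console.log(req.body.model);
```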
You can switch to any supported provider through the Settings page in the app:
- OpenAI (GPT-4o, GPT-4, GPT-3.5)
- Anthropic (Claude 3.5, Claude 3)
- DeepSeek (deepseek-chat, deepseek-coder)
- OpenRouter (Access 200+ models)
- Google Gemini
- Groq (Ultra-fast inference)
- Together AI, Mistral, Cohere
- Chinese Providers: Moonshot (Kimi), Zhipu AI (智谱 AI), Baichuan (百川), 01.AI (零一万物), MiniMax, SiliconFlow (硅基流动)
- Custom OpenAI-compatible endpoints
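Many of these providers expose OpenAI-compatible endpoints, so a single client can serve them all by swapping the base URL. The sketch below shows a hypothetical resolver for that pattern; the base URLs are the providers' public API bases as I understand them (verify against each provider's docs), and `resolveBaseUrl` is illustrative, not the app's actual code:

```typescript
// Base URLs of some OpenAI-compatible provider APIs. These are the providers'
// public API bases; confirm against each provider's documentation.
const OPENAI_COMPATIBLE_BASES: Record<string, string> = {
  openai: "https://api.openai.com/v1",
  deepseek: "https://api.deepseek.com",
  openrouter: "https://openrouter.ai/api/v1",
  groq: "https://api.groq.com/openai/v1",
};

// Hypothetical resolver for the "custom endpoint" option: known providers use
// their table entry, unknown ones fall back to a user-supplied base URL.
function resolveBaseUrl(provider: string, customBase?: string): string {
  return OPENAI_COMPATIBLE_BASES[provider] ?? customBase ?? OPENAI_COMPATIBLE_BASES.openai;
}
```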
If you prefer to configure via environment variables (useful for deployment):
```bash
# Example: Use OpenAI
OPENAI_API_KEY=sk-your-api-key
OPENAI_MODEL=gpt-4o-mini

# Example: Use DeepSeek
DEEPSEEK_API_KEY=your-api-key
DEEPSEEK_MODEL=deepseek-chat

# Example: Use OpenRouter
OPENROUTER_API_KEY=sk-or-your-api-key
OPENROUTER_MODEL=meta-llama/llama-3.3-70b-instruct:free
```

Note: Environment variables are optional. The app works out of the box with Ollama.
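As a rough sketch of how such env-based configuration could be resolved, the function below picks the first provider whose key is set and falls back to local Ollama when none is. The variable names match the examples above, but the precedence order and default models here are assumptions, not the app's documented behavior:

```typescript
// Hypothetical env-based provider resolution: first configured key wins,
// no keys means local Ollama. Precedence and defaults here are assumptions.
type Env = Record<string, string | undefined>;

interface ProviderConfig {
  provider: string;
  apiKey?: string;
  model: string;
}

function resolveProvider(env: Env): ProviderConfig {
  if (env.OPENAI_API_KEY) {
    return { provider: "openai", apiKey: env.OPENAI_API_KEY, model: env.OPENAI_MODEL ?? "gpt-4o-mini" };
  }
  if (env.DEEPSEEK_API_KEY) {
    return { provider: "deepseek", apiKey: env.DEEPSEEK_API_KEY, model: env.DEEPSEEK_MODEL ?? "deepseek-chat" };
  }
  if (env.OPENROUTER_API_KEY) {
    return {
      provider: "openrouter",
      apiKey: env.OPENROUTER_API_KEY,
      model: env.OPENROUTER_MODEL ?? "meta-llama/llama-3.3-70b-instruct:free",
    };
  }
  // No keys configured: default to local Ollama.
  return { provider: "ollama", model: "llama3.2" };
}

console.log(resolveProvider({}).provider);
```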
```
src/
├── app/              # Next.js App Router
│   ├── api/          # API routes
│   ├── book/[id]/    # Book detail page
│   └── page.tsx      # Home page
├── components/
│   ├── ui/           # shadcn/ui components
│   ├── blocks/       # Page sections
│   └── book/         # Book-related components
├── lib/
│   ├── agents/       # LangGraph agent
│   ├── api/          # API clients
│   └── utils.ts      # Utilities
├── hooks/            # React hooks
├── types/            # TypeScript types
└── providers/        # React providers
```
MIT