Open-source visual platform for building, testing, and deploying LangChain agents and LangGraph workflows.
LangConfig makes agentic AI accessible. Build LangChain Agents and Deep Agents with full control over their toolsets, prompts, and memory configurations—no coding required.
Drop agents onto a visual canvas and connect them into multi-agent LangGraph workflows. Run workflows and watch agent thinking, tool selection, outputs, and errors in real time. Test a workflow, review the output, tweak tools or RAG settings, run it again, and compare results—all in one place.
Create custom tools using LangChain's middleware system, or use prebuilt templates for Discord, Slack, and other integrations. Open a chat interface with any agent to collaboratively improve its system prompt, test behavior, and get feedback. Conversation context flows seamlessly into workflow execution with simple on/off controls.
When you're ready to share or deploy, export your workflow as a JSON config that anyone with LangConfig can import instantly. Or download a complete Python package—LangChain/LangGraph code, execution scripts, and a Streamlit web UI—ready to run anywhere.
LangConfig includes workflow templates for research and content creation. We're actively building new features and templates to make it easy to pick up and start experimenting with agentic AI.
- Visual Workflow Builder - Drag-and-drop LangGraph state graphs on an interactive canvas
- Custom Agent Builder - Create specialized agents with AI-generated configurations
- Interactive Chat Testing - Test agents with live streaming, tool execution visibility, and document upload
- RAG Knowledge Base - Upload documents (PDF, DOCX, code) for semantic search with pgvector
- Multi-Model Support - OpenAI (GPT-4o, GPT-5), Anthropic (Claude 4.5 Sonnet/Opus/Haiku), Google (Gemini 3 Pro, Gemini 2.5), DeepSeek, local models (Ollama, LM Studio)
- Custom Tool Builder - Create specialized tools beyond built-in MCP servers
- Real-Time Monitoring - Watch agent execution, tool calls, token usage, and costs live
- Artifact Gallery - View and bulk download generated images and files from workflow executions
- Workflow Scheduling - Automate workflows with cron expressions, timezone support, and concurrency controls
- Event-Driven Triggers - Fire workflows from webhooks (HMAC-SHA256 verified) or file system changes (glob patterns, debounce)
- File Versioning & Diff Viewer - Track file version history with unified and side-by-side diff views
- Presentation Generation - Export workflow artifacts to Google Slides, PDF, or Reveal.js presentations
- Export to Code - Generate standalone Python packages with Streamlit UI, FastAPI server, or raw LangGraph code
- LangGraph Subgraph Streaming - Nested subgraph execution with real-time SSE streaming
- Human-in-the-Loop - Add approval checkpoints for critical decisions (still experimental)
- Advanced Memory - Short-term (LangGraph checkpoints) and long-term (pgvector + LangGraph Store) persistence
- Local-First - All data stays on your machine
1. Clone Repository

```
git clone https://github.com/langconfig/langconfig.git
cd langconfig
```

2. Install Frontend Dependencies

```
npm install
```

3. Run Backend Setup Script

```
python backend/scripts/setup.py
```

This automated script will:
- Check Python 3.11+ and Docker prerequisites
- Create `.env` from `.env.example`
- Install backend Python dependencies
- Start PostgreSQL via Docker
- Initialize the database and seed agent templates
4. Add Your API Keys
Edit `.env` and add your API keys:

```
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=AIza...
```

Terminal 1 - Start Backend:

```
cd backend
python main.py
```

Backend runs at: http://127.0.0.1:8765

Terminal 2 - Start Frontend:

```
npm run dev
```

Frontend runs at: http://localhost:1420

Open your browser to http://localhost:1420
Requires Rust (Install)
Windows users: Install Visual Studio Build Tools
```
# Start backend in Terminal 1
cd backend
python main.py

# Start desktop app in Terminal 2
npm run tauri dev
```

This opens a native desktop window instead of a browser.
```
langconfig/
├── src/                     # React 19 frontend (TypeScript + Tailwind)
│   ├── features/
│   │   ├── workflows/       # Visual canvas & workflow management
│   │   ├── agents/          # Agent builder & library
│   │   ├── chat/            # Interactive chat testing
│   │   ├── knowledge/       # RAG document upload
│   │   ├── memory/          # Memory visualization
│   │   ├── tools/           # Custom tool builder
│   │   └── settings/        # App settings & API keys
│   ├── components/          # Shared UI components
│   ├── contexts/            # React context providers
│   ├── hooks/               # Custom React hooks
│   └── lib/                 # API client & utilities
├── backend/                 # Python FastAPI backend
│   ├── api/                 # REST API routes
│   │   ├── workflows/       # Workflow execution & management
│   │   ├── agents/          # Agent CRUD & templates
│   │   ├── chat/            # Chat sessions & streaming
│   │   ├── knowledge/       # Document upload & RAG
│   │   ├── tools/           # Custom tool management
│   │   ├── schedules/       # Cron-based workflow scheduling
│   │   ├── triggers/        # Event-driven workflow triggers
│   │   ├── webhooks/        # Incoming webhook endpoints
│   │   ├── presentations/   # Presentation generation (Slides, PDF, Reveal.js)
│   │   └── settings/        # API keys & configuration
│   ├── core/
│   │   ├── workflows/       # LangGraph orchestration engine
│   │   ├── agents/          # Agent factory & base classes
│   │   ├── templates/       # Pre-built agent & workflow templates
│   │   ├── tools/           # Native and custom tool integrations
│   │   ├── codegen/         # Python code export generation
│   │   └── middleware/      # LangGraph middleware (RAG, validation)
│   ├── services/
│   │   ├── context_retrieval.py   # RAG retrieval with HyDE
│   │   ├── llama_config.py        # Vector store (pgvector)
│   │   ├── token_counter.py       # Token tracking & cost calculation
│   │   ├── scheduler_service.py   # APScheduler cron service
│   │   └── triggers/              # File watcher & trigger services
│   ├── models/              # SQLAlchemy ORM models
│   ├── middleware/          # FastAPI middleware (performance, CORS)
│   ├── db/                  # Database initialization
│   │   ├── init_postgres.sql      # pgvector setup (auto-run on Docker start)
│   │   └── init_deepagents.py     # Seed agent templates
│   └── alembic/             # Database migrations
├── docs/                    # Documentation
├── scripts/                 # Utility scripts
├── src-tauri/               # Tauri desktop app (optional)
├── docker-compose.yml       # PostgreSQL + pgvector setup
└── .env                     # API keys (create from .env.example)
```
LangConfig uses a single PostgreSQL database with pgvector for:
- Workflows & Projects - Visual workflow definitions and project organization
- Agents & Templates - Custom agents and pre-built templates
- Chat Sessions - Conversation history and session state
- Vector Storage - Document embeddings for RAG retrieval
- LangGraph Checkpoints - Workflow state persistence (via `langgraph-checkpoint-postgres`)
- Schedules & Triggers - Cron schedules, webhook triggers, file watchers, and execution logs
- File Versions - Workspace file version history with diffs
Setup Steps:

1. Docker starts PostgreSQL - `docker-compose up -d postgres`
   - Automatically runs `backend/db/init_postgres.sql`
   - Creates the `vector` extension (pgvector)
   - Creates the initial `vector_documents` table
2. Alembic creates all tables - `alembic upgrade head`
   - Runs migrations in `backend/alembic/versions/`
   - Creates: workflows, projects, agents, chat_sessions, session_documents, checkpoints, etc.
3. Seed agent templates (optional, experimental) - `python db/init_deepagents.py`
   - Populates the `deep_agent_templates` table with pre-built agents
   - Adds templates like Research Agent, Code Reviewer, etc.
- Click an agent from the library (e.g., "Research Agent")
- Click the Chat icon
- Upload documents for RAG context (optional)
- Send a message: "Summarize the key findings in these papers"
- Watch the agent use tools in real time
- View token costs and metrics in the sidebar
- Go to Studio → New Workflow
- Drag "Research Agent" to canvas
- Drag "Code Implementer" to canvas
- Connect them: Research → Implementer
- Click Run
- Enter task: "Research best practices for authentication and implement it"
- Research Agent finds information → passes to Implementer → code is generated
- Click Agent Builder from toolbar
- Enter name: "Security Auditor"
- Enter description: "Reviews code for security vulnerabilities and suggests fixes"
- Click AI Generate → GPT-4o suggests:
  - Model: `gpt-4o` (reasoning capability)
  - Temperature: `0.2` (focused, deterministic)
  - Tools: `filesystem`, `grep`, `web_search`
  - System prompt: Specialized security analysis prompt
- Review and customize (add more tools, adjust prompt)
- Click Save → use in workflows or chat testing
- Build workflow visually (e.g., Research → Plan → Implement → Test)
- Click Export → Download Python Package
- Extract the ZIP file to any folder
- Run `pip install -r requirements.txt`
- Add API keys to `.env`
- Run `streamlit run streamlit_app.py`
- Use your workflow as a standalone web app with live streaming output
Copy `.env.example` to `.env` and configure:

```
cp .env.example .env
```

Required:
| Variable | Description |
|---|---|
| `DATABASE_URL` | PostgreSQL connection string (default: `postgresql://langconfig:langconfig_dev@localhost:5433/langconfig`) |
LLM API Keys (at least one required):

| Variable | Description |
|---|---|
| `OPENAI_API_KEY` | OpenAI API key for GPT models |
| `ANTHROPIC_API_KEY` | Anthropic API key for Claude models |
| `GOOGLE_API_KEY` | Google API key for Gemini models |
Optional:

| Variable | Description | Default |
|---|---|---|
| `DEEPSEEK_API_KEY` | DeepSeek API key | - |
| `GITHUB_PAT` | GitHub Personal Access Token | - |
| `GITLAB_PAT` | GitLab Personal Access Token | - |
| `LOCAL_LLM_HOST` | Local model server URL | `http://localhost:11434` |
| `SECRET_KEY` | App secret key | Auto-generated |
| `ENVIRONMENT` | `development` or `production` | `development` |
| `LOG_LEVEL` | Logging level | `INFO` |
Workflow Execution:

| Variable | Description | Default |
|---|---|---|
| `MAX_WORKFLOW_TIMEOUT` | Max workflow runtime (seconds) | 300 |
| `MAX_CONCURRENT_WORKFLOWS` | Parallel workflow limit | 5 |
| `MAX_EXECUTION_HISTORY_PER_WORKFLOW` | History entries to keep | 100 |
| `EXECUTION_HISTORY_RETENTION_DAYS` | Days to retain history | 90 |
API keys can also be configured via the Settings UI in the app (stored encrypted in the database; keys set there take priority over `.env`).
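To illustrate that precedence, here is a minimal lookup sketch. The function name and the `db_keys` argument are hypothetical stand-ins, not LangConfig's actual settings API; the real resolution lives in the backend's settings service.

```python
import os


def resolve_api_key(provider: str, db_keys: dict):
    """Illustrative lookup order: a key saved through the Settings UI
    (modeled here as the db_keys dict) wins over one set in .env."""
    # Database-stored key takes priority; fall back to the environment.
    return db_keys.get(provider) or os.getenv(f"{provider.upper()}_API_KEY")
```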
Run models locally with zero API costs:
- Install Ollama or LM Studio
- Start the local model server (default: `http://localhost:11434`)
- Go to Settings → API Keys
- Add Local Provider:
  - Base URL: `http://localhost:11434/v1`
  - Model: `llama3.1` (or your model name)
- Use in any agent configuration
Native Python Tools (no external dependencies):
- `web_search` - Web search via DuckDuckGo (free, no API key)
- `web_fetch` - Fetch webpage content
- `file_read` / `file_write` / `file_list` - File system operations
- `memory_store` / `memory_recall` - Long-term memory (PostgreSQL-backed)
- `reasoning_chain` - Break down complex tasks into logical steps
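To make the memory tool pair concrete, here is a toy in-process stand-in. The real tools persist to PostgreSQL and recall by vector similarity; this sketch substitutes a plain list and substring matching, so treat it as an illustration of the interface only.

```python
class MemoryStore:
    """Toy stand-in for the memory_store / memory_recall tool pair."""

    def __init__(self):
        self._entries = []

    def memory_store(self, text: str) -> None:
        # Persist a memory entry (real implementation: a PostgreSQL row).
        self._entries.append(text)

    def memory_recall(self, query: str) -> list:
        # Naive substring match instead of embedding similarity search.
        return [e for e in self._entries if query.lower() in e.lower()]
```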
Browser Automation (Playwright; requires `playwright install chromium`):

- `browser_navigate` - Navigate URLs with JavaScript rendering
- `browser_click` - Click elements on page
- `browser_extract` - Extract text/links from pages
- `browser_screenshot` - Capture page screenshots
Custom Tool Templates (create via UI):
- Notifications: Slack, Discord (multi-channel webhooks)
- CMS/Publishing: WordPress REST API, Twitter/X API
- Image/Video: DALL-E 3, ChatGPT Image Gen 1.5, Sora, Imagen 3, Nano Banana (Gemini 2.5 Flash Image), Veo 3.1 Fast
- Database: PostgreSQL, MySQL, MongoDB queries
- API/Webhook: Custom REST API calls with auth
- Data Transform: JSON ↔ CSV ↔ XML ↔ YAML conversion
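As a sketch of what a Data Transform tool does between agents, the following converts a JSON array of flat objects to CSV using only the standard library. This is illustrative; the built-in template's exact behavior may differ.

```python
import csv
import io
import json


def json_to_csv(json_text: str) -> str:
    """Convert a JSON array of flat objects into CSV text."""
    rows = json.loads(json_text)
    buf = io.StringIO()
    # Use the first object's keys as the header row.
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```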
Frontend:
- React 19.2 + TypeScript 5.8
- Tailwind CSS 4.1
- ReactFlow 11.11 (visual canvas)
- TanStack Query 5.90
- Tauri 2.0 (optional desktop app)
Backend:
- Python 3.11+
- FastAPI 0.115
- LangChain v1.0 (full ecosystem)
- LangGraph 0.4+ (with checkpoint-postgres)
- LlamaIndex (document indexing & RAG)
Database:
- PostgreSQL 16 with pgvector
- SQLAlchemy 2.0 + Alembic (migrations)
- langgraph-checkpoint-postgres (state persistence)
AI/ML:
- OpenAI (GPT-4o, GPT-4o-mini, GPT-5, o3, o3-mini, o4-mini)
- Anthropic (Claude 4.5 Sonnet, Claude 4.5 Opus, Claude 4.5 Haiku)
- Google (Gemini 3 Pro Preview, Gemini 2.5 Flash, Gemini 2.0 Flash)
- DeepSeek (DeepSeek Chat, DeepSeek Reasoner)
- Local models via Ollama/LM Studio
- Sentence Transformers (embeddings)
- Unstructured (document processing)
LangConfig supports automated workflow execution through cron schedules, webhooks, and file system triggers.
Schedule workflows to run automatically on a recurring basis:
- Open a workflow → Settings → Schedule
- Enter a cron expression (e.g., `0 9 * * 1-5` for weekdays at 9 AM)
- Select a timezone and configure optional input data
- Enable the schedule
Schedules support concurrency limits, idempotency keys for deduplication, and a full execution history log.
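For intuition about how a cron expression gates runs, here is a simplified matcher. LangConfig's scheduler is APScheduler (see `scheduler_service.py`); this sketch only handles `*`, single values, ranges, and comma lists, and ignores timezones.

```python
from datetime import datetime


def _match(field: str, value: int) -> bool:
    """Check one cron field ('*', '5', '1-5', '1,3,5') against a value."""
    if field == "*":
        return True
    for part in field.split(","):
        if "-" in part:
            lo, hi = map(int, part.split("-"))
            if lo <= value <= hi:
                return True
        elif int(part) == value:
            return True
    return False


def cron_matches(expr: str, when: datetime) -> bool:
    """True if `when` satisfies a 5-field cron expression
    (minute hour day-of-month month day-of-week)."""
    minute, hour, dom, month, dow = expr.split()
    return (_match(minute, when.minute)
            and _match(hour, when.hour)
            and _match(dom, when.day)
            and _match(month, when.month)
            and _match(dow, when.isoweekday() % 7))  # cron: 0 = Sunday
```

So `0 9 * * 1-5` fires at minute 0, hour 9, any day of month, any month, Monday through Friday.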
Trigger workflows from external services via HTTP:
- Open a workflow → Settings → Triggers → Add Webhook
- Copy the generated webhook URL and secret
- Configure your external service to POST to the URL
- Payloads are verified with HMAC-SHA256 signatures and optional IP whitelisting
Use input mapping to transform incoming payloads into workflow input.
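The HMAC-SHA256 check mentioned above follows a standard pattern: recompute the MAC over the raw body and compare in constant time. This is a generic sketch; the header name and hex encoding LangConfig actually uses are assumptions here, so consult the generated webhook settings for specifics.

```python
import hashlib
import hmac


def verify_webhook(secret: str, body: bytes, signature: str) -> bool:
    """Recompute the HMAC-SHA256 of the raw request body and compare it,
    in constant time, with the signature the sender supplied (hex-encoded)."""
    expected = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing differences.
    return hmac.compare_digest(expected, signature)
```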
Trigger workflows when files change on disk:
- Open a workflow → Settings → Triggers → Add File Watch
- Set a directory path and glob pattern (e.g., `*.csv`)
- Choose events to watch: created, modified, deleted, or moved
- Configure debounce interval to prevent rapid re-triggers
File watchers support recursive directory monitoring.
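The debounce interval can be pictured with a small helper: an event only fires the workflow if enough quiet time has passed since the previous event. This leading-edge sketch is an assumption about the mechanism, not LangConfig's actual trigger code; timestamps are passed in explicitly to keep it deterministic.

```python
class Debouncer:
    """Coalesce bursts of file-system events: fire on the first event,
    then suppress (and extend the window on) events inside `interval`."""

    def __init__(self, interval: float):
        self.interval = interval
        self._last = None  # timestamp of the most recent event seen

    def should_fire(self, now: float) -> bool:
        if self._last is not None and now - self._last < self.interval:
            self._last = now  # still inside the burst; extend the window
            return False
        self._last = now
        return True
```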
```
# Windows
taskkill /F /IM node.exe

# macOS/Linux
lsof -ti:1420 | xargs kill -9
```

```
# Check Docker is running
docker-compose ps

# Restart PostgreSQL
docker-compose restart postgres

# Check logs
docker-compose logs postgres
```

```
# Reset migrations (WARNING: deletes all data)
cd backend
alembic downgrade base
alembic upgrade head
```

```
# Reinstall all dependencies
cd backend
pip install --upgrade pip
pip install -r requirements.txt
```

Prerequisites:
- Rust installed (Install)
- Visual Studio Build Tools (Windows only)
```
npm run tauri build
```

Generates platform-specific installers:

- Windows: `.exe`, `.msi`
- macOS: `.app`, `.dmg`
- Linux: `.AppImage`, `.deb`
Total size: ~250MB (includes Python runtime and dependencies)
```
# Backend tests
cd backend
pytest

# Frontend tests
npm test
```

```
cd backend

# Create new migration
alembic revision --autogenerate -m "Description of changes"

# Apply migration
alembic upgrade head

# Rollback migration
alembic downgrade -1
```

Agent templates are defined in `backend/core/agents/templates.py`. Workflow recipes (multi-node templates) are in `backend/core/templates/workflow_recipes.py`.
To add new templates:
- Add your template definition to the appropriate file
- Templates are auto-registered on backend startup
- For database-stored agents, use the Agent Builder UI or run:

```
cd backend
python db/init_deepagents.py
```

- Chat API Documentation - Interactive chat testing API
- GitHub Issues - Report bugs and request features
We welcome contributions! Whether you're:
- Adding agent templates
- Improving UI/UX
- Writing documentation
- Reporting bugs
- Suggesting features
How to Contribute:
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Make your changes and add tests
- Commit: `git commit -m 'Add amazing feature'`
- Push: `git push origin feature/amazing-feature`
- Open a Pull Request
See CONTRIBUTING.md for detailed guidelines.
Copyright 2025 LangConfig Contributors
Licensed under the MIT License. See LICENSE file for details.
- LangChain & LangGraph - MIT License
- FastAPI - MIT License
- React - MIT License
- Tauri - Apache 2.0 / MIT License
- PostgreSQL - PostgreSQL License
- GitHub Issues: Report bugs and request features
- Discussions: Ask questions and share ideas
LangConfig - Visual AI Agent Workflows Powered by LangChain & LangGraph