A reusable Python framework for building Telegram chat-bots powered by large language models (LLMs) via the OpenRouter API.
Features:
- Text & image handling – images are processed with vision models.
- Voice transcription (⚠️ WIP – currently disabled due to API compatibility issues).
- Conversation persistence in Redis (keyed by bot name / user id / conversation id).
- User whitelist stored in Redis.
- Fully environment-driven configuration via `.env`.
- Simple async architecture using `python-telegram-bot` v21.
- Clone this repo and create a Python 3.11 virtualenv:

  ```bash
  python3 -m venv .venv
  source .venv/bin/activate
  pip install -r requirements.txt
  ```

- Create `.env` by copying `env_template` and filling in the values:

  ```bash
  cp env_template .env
  # edit .env with your favourite editor
  ```

  Minimal required variables:

  ```
  BOT_NAME=monty
  TELEGRAM_BOT_TOKEN=<your telegram token>
  OPENROUTER_API_KEY=<your openrouter key>
  OPENROUTER_LLM=openai/gpt-4o
  ```

  If the chosen model supports temperature, leave `OPENROUTER_LLM_TEMPERATURE_SUPPORTED=true`; otherwise set it to `false`.
- Set up Redis – the bot requires a Redis instance for conversation storage and user whitelisting.

  Option A: Local Redis

  ```bash
  # Install and run Redis locally
  # macOS: brew install redis && brew services start redis
  # Linux: sudo apt-get install redis-server && sudo systemctl start redis
  ```

  Option B: Redis Cloud (Free Tier)

  - Sign up at Redis Cloud
  - Create a free database
  - Get your connection details (host, port, password)

  Update `REDIS_HOST`, `REDIS_PORT`, and `REDIS_PASSWORD` in `.env`.

- Whitelist yourself:

  First, message your bot on Telegram. The bot will respond with:

  ```
  Sorry, you are not authorised to use this bot. (user_id=123456789)
  ```

  Then add your user ID to Redis using `redis-cli`:

  ```
  SET monty.123456789 true EX 31536000
  ```

  Replace `monty` with your `BOT_NAME` and use your actual user id. Or use a Redis GUI client to create the key manually.

- Run the bot:

  ```bash
  python -m bot.main
  ```

  The bot should start and greet you when you `/start` it in Telegram.
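As an alternative to `redis-cli`, the whitelist entry can be written from Python. This is a minimal sketch, assuming the `redis` package; the `whitelist_key` and `whitelist_user` helpers are illustrative, not part of the framework:

```python
import os

ONE_YEAR_SECONDS = 365 * 24 * 60 * 60  # = 31536000, matching the EX value above

def whitelist_key(bot_name: str, user_id: int) -> str:
    """Build the whitelist key in the '<BOT_NAME>.<user_id>' format."""
    return f"{bot_name}.{user_id}"

def whitelist_user(bot_name: str, user_id: int, ttl: int = ONE_YEAR_SECONDS) -> None:
    """Set the whitelist flag in Redis (requires a reachable instance)."""
    import redis  # pip install redis

    r = redis.Redis(
        host=os.environ.get("REDIS_HOST", "localhost"),
        port=int(os.environ.get("REDIS_PORT", "6379")),
        password=os.environ.get("REDIS_PASSWORD") or None,
    )
    r.set(whitelist_key(bot_name, user_id), "true", ex=ttl)
```

Calling `whitelist_user("monty", 123456789)` mirrors the `redis-cli` command above.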
- Handlers in `bot/handlers.py` route `/start`, `/help`, text, and photo messages. (The voice handler is currently disabled – WIP.)
- Conversation objects in `bot/session.py` persist message history (`system`, `user`, `assistant`) in JSON arrays with a TTL (`HISTORY_TTL_SECONDS`). A fresh conversation id is generated on every `/start` (timestamp-based).
- LLM calls happen in `bot/llm.py` via the OpenAI SDK, pointed at the OpenRouter endpoint. Temperature is only sent when the model supports it.
- Settings are loaded once at startup from `.env` via `bot/config.py`.
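The temperature rule can be sketched as follows. The exact code in `bot/llm.py` may differ; this is an assumed shape showing how the OpenAI SDK reaches OpenRouter by overriding `base_url`, and how `temperature` is omitted for models that reject it:

```python
import os

def build_completion_kwargs(model: str, messages: list[dict],
                            temperature_supported: bool,
                            temperature: float = 0.7) -> dict:
    """Build kwargs for chat.completions.create, omitting temperature
    when the model does not support it."""
    kwargs: dict = {"model": model, "messages": messages}
    if temperature_supported:
        kwargs["temperature"] = temperature
    return kwargs

def ask(messages: list[dict]) -> str:
    """One chat completion through OpenRouter (needs OPENROUTER_API_KEY set)."""
    from openai import OpenAI  # pip install openai

    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key=os.environ["OPENROUTER_API_KEY"],
    )
    kwargs = build_completion_kwargs(
        model=os.environ.get("OPENROUTER_LLM", "openai/gpt-4o"),
        messages=messages,
        temperature_supported=os.environ.get(
            "OPENROUTER_LLM_TEMPERATURE_SUPPORTED", "true").lower() == "true",
    )
    response = client.chat.completions.create(**kwargs)
    return response.choices[0].message.content
```

Keeping the kwargs construction in a small pure function makes the temperature logic easy to unit-test without touching the network.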
The framework is designed so students can fork and redeploy easily:
- Create a new Telegram bot with @BotFather and grab the token.
- Obtain an OpenRouter key (or switch `OPENROUTER_LLM` to an OpenAI model & key).
- Provision a Redis instance (e.g. the Redis Cloud free tier).
- Set the required env vars, push to your own repo, and deploy on a server / fly.io / Render / etc.
- Add new commands by creating functions in `bot/handlers.py` and registering them in `bot/main.py`.
- Swap out the LLM by changing `OPENROUTER_LLM` (and turning off temperature if unsupported).
- Replace Redis with another datastore by implementing the small API in `bot/redis_store.py`.
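The datastore seam mentioned above can be pictured as a small interface. The method names here are assumptions about `bot/redis_store.py`, not its actual API; any backend with get/set/delete and TTL semantics can stand in for Redis:

```python
import time
from typing import Optional, Protocol

class Store(Protocol):
    """Assumed shape of the small API in bot/redis_store.py."""
    def get(self, key: str) -> Optional[str]: ...
    def set(self, key: str, value: str, ttl_seconds: Optional[int] = None) -> None: ...
    def delete(self, key: str) -> None: ...

class InMemoryStore:
    """Dict-backed drop-in with per-key expiry, for tests and prototyping."""

    def __init__(self) -> None:
        self._data: dict[str, tuple[str, Optional[float]]] = {}

    def get(self, key: str) -> Optional[str]:
        entry = self._data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if expires_at is not None and time.monotonic() > expires_at:
            del self._data[key]  # lazy expiry, similar to a Redis TTL
            return None
        return value

    def set(self, key: str, value: str, ttl_seconds: Optional[int] = None) -> None:
        expires_at = time.monotonic() + ttl_seconds if ttl_seconds is not None else None
        self._data[key] = (value, expires_at)

    def delete(self, key: str) -> None:
        self._data.pop(key, None)
```

An `InMemoryStore` like this is handy in unit tests, where spinning up a real Redis instance is overkill.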
PRs welcome!