# KGEN Support Assistant

A conversational AI assistant for gaming support, capable of answering questions about game mechanics, player stats, clan information, and more.
## Features

- Multi-agent architecture for handling different types of questions
- Combined data & knowledge retrieval for comprehensive answers
- Follow-up capabilities for ambiguous queries
- Feedback system to collect user input on answer quality
- Sample questions for quick testing
- LangSmith integration for observability and debugging
## Architecture

- Backend: FastAPI server with AI agents
- Frontend: Streamlit chat interface
- Database: SQLite with gaming support data
- Observability: LangSmith tracing and evaluation
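The multi-agent routing mentioned above can be sketched roughly as follows. The agent names and keyword rules here are purely illustrative; the actual app likely uses an LLM-based classifier rather than keyword matching.

```python
# Illustrative sketch of routing a user question to a specialist agent.
# Agent names and keyword rules are hypothetical, not the repository's
# actual implementation.
def route_query(question: str) -> str:
    """Return the name of the agent best suited to answer `question`."""
    q = question.lower()
    if any(word in q for word in ("clan", "guild")):
        return "clan_agent"
    if any(word in q for word in ("stat", "score", "rank")):
        return "stats_agent"
    if any(word in q for word in ("mechanic", "rule", "quest")):
        return "mechanics_agent"
    return "general_agent"  # fallback agent for everything else

print(route_query("How do I join a clan?"))  # clan_agent
print(route_query("Show my weekly stats"))   # stats_agent
```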
## Setup

1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/kgen-support-assistant.git
   cd kgen-support-assistant
   ```

2. Set up a virtual environment:

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```

3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

4. Set up environment variables. Create a `.env` file in the root directory and add the following variables:

   ```
   OPENAI_API_KEY=your_openai_api_key
   DATABASE_URL=sqlite:///kgen_gaming_support_advanced.db

   # Optional LangSmith configuration for observability
   LANGCHAIN_API_KEY=your_langsmith_api_key
   LANGCHAIN_PROJECT=gaming-support-assistant
   LANGCHAIN_TRACING_V2=true
   ```
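The app presumably loads these variables with a library such as python-dotenv; as a minimal stdlib sketch of what that loading does (the parser and file name below are illustrative, not the project's actual loader):

```python
import os

def load_env_file(path: str = ".env") -> None:
    """Minimal .env loader: reads KEY=VALUE lines, skipping blanks
    and '#' comments. Existing environment variables are not overridden."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

# Demo with a throwaway file (so we don't touch a real .env):
with open(".env.demo", "w") as fh:
    fh.write("# demo config\nLANGCHAIN_TRACING_V2=true\n")
load_env_file(".env.demo")
print(os.environ["LANGCHAIN_TRACING_V2"])  # true
```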
## Running the Application

1. Run the backend server:

   ```bash
   uvicorn app.main:app --reload --port 8000
   ```

2. Run the frontend:

   ```bash
   cd frontend
   streamlit run streamlit_app.py
   ```

3. Access the application:

   - Backend API: http://localhost:8000
   - Frontend: http://localhost:8501
   - LangSmith Dashboard: https://smith.langchain.com/
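A quick way to confirm both local services are up is a small smoke check; this helper is not part of the repository, just a convenience sketch using the standard library:

```python
import urllib.error
import urllib.request

def check(url: str, timeout: float = 2.0) -> bool:
    """Return True if `url` answers with a 2xx/3xx status, False on
    connection errors, timeouts, or HTTP error responses."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 400
    except (urllib.error.URLError, OSError):
        return False

for name, url in [
    ("backend", "http://localhost:8000/docs"),
    ("frontend", "http://localhost:8501"),
]:
    print(f"{name}: {'up' if check(url) else 'down'}")
```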
## Deployment

### Backend (Render)

1. Sign up for Render (https://render.com)
2. Create a new Web Service
3. Connect your GitHub repository
4. Set the build command:

   ```bash
   pip install -r requirements.txt
   ```

5. Set the start command:

   ```bash
   uvicorn app.main:app --host 0.0.0.0 --port $PORT
   ```

6. Add environment variables (OPENAI_API_KEY, etc.)

### Frontend (Streamlit Cloud)

1. Sign up for Streamlit Cloud (https://streamlit.io/cloud)
2. Connect your GitHub repository
3. Set the main file path to `frontend/streamlit_app.py`
4. Set the required secrets (API_URL, DEMO_TOKEN)
5. Deploy your app
## Database

- Development: SQLite (included)
- Production: consider migrating to PostgreSQL for better performance and scalability
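Using a `DATABASE_URL` (as in the `.env` example above) keeps that migration path open: the scheme tells the app which driver to use. A rough sketch of the SQLite side, where the helper name is hypothetical and a production app would dispatch `postgresql://` URLs to a different driver:

```python
import sqlite3
from urllib.parse import urlparse

def connect_from_url(database_url: str) -> sqlite3.Connection:
    """Open the SQLite database named by a sqlite:/// URL.

    Hypothetical helper: for a postgresql:// URL the app would hand
    off to a PostgreSQL driver instead of raising."""
    parsed = urlparse(database_url)
    if parsed.scheme != "sqlite":
        raise ValueError(f"unsupported scheme: {parsed.scheme}")
    # sqlite:///file.db parses with path "/file.db"; strip the slash
    return sqlite3.connect(parsed.path.lstrip("/"))

conn = connect_from_url("sqlite:///kgen_gaming_support_advanced.db")
print(conn.execute("SELECT 1").fetchone())  # (1,)
```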
## API Documentation

Once the backend is running, visit:

- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
## Feedback System

The application includes a feedback system that allows users to:

- Rate responses with thumbs up/down
- Provide detailed feedback on negative ratings
- Track feedback in JSON files (stored in the `feedback` directory)

This feedback can be used to improve the AI responses over time.
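A feedback record written to that directory might look like the sketch below. The field names and file naming are assumptions for illustration, not the app's actual schema:

```python
import json
import time
from pathlib import Path

FEEDBACK_DIR = Path("feedback")  # directory name from this README

def record_feedback(session_id: str, rating: str, comment: str = "") -> Path:
    """Write one feedback record as a JSON file and return its path.

    Illustrative sketch; the real app's record schema may differ."""
    FEEDBACK_DIR.mkdir(exist_ok=True)
    record = {
        "session_id": session_id,
        "rating": rating,    # "up" or "down"
        "comment": comment,  # detailed feedback for "down" ratings
        "timestamp": time.time(),
    }
    path = FEEDBACK_DIR / f"{session_id}-{int(record['timestamp'])}.json"
    path.write_text(json.dumps(record, indent=2))
    return path

p = record_feedback("demo-session", "down", "Answer ignored my clan name")
print(json.loads(p.read_text())["rating"])  # down
```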
## Environments

The application uses different settings for development and production:

- Development: local server with debug mode
- Production: remote servers with optimized settings

To switch between environments, set the `ENVIRONMENT` variable to either `development` or `production` in your `.env` file.
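The switch typically amounts to branching on that variable when building settings. The specific values below are illustrative, not the app's real configuration:

```python
import os

def get_settings() -> dict:
    """Pick settings by ENVIRONMENT (illustrative values only)."""
    env = os.environ.get("ENVIRONMENT", "development")
    if env == "production":
        # hypothetical production URL; yours comes from your deployment
        return {"debug": False, "api_url": "https://your-app.onrender.com"}
    return {"debug": True, "api_url": "http://localhost:8000"}

os.environ["ENVIRONMENT"] = "development"
print(get_settings()["debug"])  # True
```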
## LangSmith Integration

The application integrates with LangSmith for enhanced observability, debugging, and evaluation of AI conversations. See `docs/langsmith.md` for detailed instructions on:

- Setting up LangSmith for your development environment
- Using traces to debug conversation flows
- Monitoring LLM performance metrics
- Evaluating and improving agent responses

LangSmith helps identify issues with:

- Query classification
- Conversation context handling
- Support ticket creation processes
- End-to-end conversation flows
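LangSmith's Python SDK provides a `traceable` decorator that records function runs to the LangSmith server. As a rough stdlib-only stand-in for the idea (local logging only, not the real SDK):

```python
import functools
import time

def trace(fn):
    """Toy stand-in for LangSmith's @traceable: records call metadata
    locally instead of sending runs to the LangSmith server."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        wrapper.runs.append({
            "name": fn.__name__,
            "latency_ms": (time.perf_counter() - start) * 1000,
        })
        return result
    wrapper.runs = []
    return wrapper

@trace
def classify_query(q: str) -> str:
    # hypothetical classifier standing in for the app's real one
    return "clan" if "clan" in q.lower() else "general"

classify_query("Tell me about my clan")
print(classify_query.runs[0]["name"])  # classify_query
```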
## Built With

- OpenAI's GPT models
- FastAPI
- Streamlit
- LangChain
- LangSmith