Ollama Flutter GUI is a modern, responsive web application that leverages the power of Ollama's offline language models to provide an interactive chat experience. Built with Flutter, this application offers a sleek, material design-inspired interface for interacting with various AI models locally.
- Responsive Web Design: Optimized for various screen sizes while maintaining a maximum width for better readability.
- Multiple Model Support: Easily switch between different Ollama models by modifying a single line of code.
- File Upload: Ability to upload and process multiple files within the chat interface.
- Modern UI: Utilizes Material Design 3 for a fresh, contemporary look.
- Offline Functionality: Leverages Ollama's offline models for privacy and speed.
- Real-time Responses: Instant AI-generated responses to user inputs.
- Frontend: Flutter Web
- State Management: Riverpod
- Backend Integration: HTTP requests to local Ollama API
- File Handling: file_picker package for multi-file selection
- Flutter SDK (latest stable version)
- Dart SDK (latest stable version)
- Ollama installed and running locally
- A modern web browser (Chrome, Firefox, Safari, or Edge)
1. Clone the Repository: `git clone https://github.com/gabrimatic/ollama_flutter_gui.git`, then `cd ollama_flutter_gui`
2. Install Dependencies: `flutter pub get`
3. Configure Ollama: Ensure Ollama is installed and running on your local machine. The default API endpoint is set to `http://localhost:11434/api/generate`. Modify this in `chat_state.dart` if your setup differs.
4. Run the Application: `flutter run -d chrome`
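The endpoint above is Ollama's `/api/generate` REST API. As a rough illustration of the request body sent to it (a minimal Python sketch; `model`, `prompt`, and `stream` are standard Ollama generate fields, but exactly which extra fields the app includes is not shown here):

```python
import json

# Builds the JSON body for a POST to http://localhost:11434/api/generate.
# stream=False asks Ollama for a single JSON reply instead of a token stream.
def build_generate_payload(model: str, prompt: str, stream: bool = False) -> str:
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

payload = build_generate_payload("llama3.1", "Why is the sky blue?")
print(payload)
```

Any HTTP client (the app uses Dart's `http` package) can POST this body to the endpoint.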
- Starting a Chat: Type your message in the input field at the bottom of the screen and press the send button or hit Enter.
- Uploading Files: Click the attachment icon to select one or more files. The file names will be sent to the AI model for processing.
- Changing AI Models: To use a different Ollama model, modify the `model` field in the POST request body in `chat_state.dart`. For example, change `'model': 'llama3.1'` to `'model': 'mistral'` or any other model you have installed in Ollama.
- Viewing Responses: AI-generated responses will appear in the chat interface, distinguished from user messages by color and alignment.
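With streaming disabled, Ollama's `/api/generate` returns a single JSON object whose `response` field holds the generated text. A minimal sketch of extracting it (the sample JSON below is illustrative, not real model output):

```python
import json

# Illustrative non-streaming /api/generate reply; "response" and "done"
# are standard fields of Ollama's generate API.
sample_reply = '{"model": "llama3.1", "response": "Hello! How can I help?", "done": true}'

data = json.loads(sample_reply)
answer = data["response"]
print(answer)  # -> Hello! How can I help?
```

This mirrors what the app does in `chat_state.dart` before rendering the reply as a chat bubble.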
- `lib/main.dart`: Entry point of the application
- `chat_screen.dart`: Main chat interface
- `chat_state.dart`: State management and API interactions
- `chat_message.dart`: Chat message model and widget
- Colors: Modify the color scheme in `main.dart` by changing the `seedColor` in `ColorScheme.fromSeed()`.
- API Endpoint: Update the API URL in `chat_state.dart` if you're using a different address for Ollama.
- Max Width: Adjust the `maxWidth` constraint in `chat_screen.dart` to change the application's maximum width.
- AI Model: Change the model in `chat_state.dart` by modifying the `model` field in the POST request body.
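Before editing the `model` field, it helps to confirm the name is actually installed, via `ollama list` on the command line or a GET to Ollama's `/api/tags` endpoint. A hypothetical Python helper that checks a candidate name against such a list, accounting for Ollama's implicit `:latest` tag:

```python
# Sample installed-model names as Ollama reports them; replace with the
# real output of `ollama list` or GET /api/tags on your machine.
installed = ["llama3.1:latest", "mistral:latest"]

def is_installed(model: str, names: list[str]) -> bool:
    # Ollama treats a bare name like "llama3.1" as "llama3.1:latest".
    return model in names or f"{model}:latest" in names

print(is_installed("llama3.1", installed))  # -> True
print(is_installed("gpt4", installed))      # -> False
```

Pointing the app at a model name Ollama does not have will simply produce an API error rather than a chat response.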
Contributions to Ollama Flutter GUI are welcome! Please feel free to submit pull requests, create issues, or spread the word.
MIT License
- Ollama team for providing powerful offline language models
- Flutter and Dart teams for the excellent framework and language
- Contributors and users of this project
For more information on Flutter development, please refer to the Flutter documentation.
For details on Ollama and its models, visit the Ollama official website.
© All rights reserved.

