AiraHUB: An Intelligent Orchestration Engine for AI Agents


AiraHUB is a sophisticated, decentralized registry and orchestration engine designed to manage and coordinate a network of independent AI agents. It goes beyond a simple service directory by incorporating intelligent planning, a persistent learning mechanism (ReasoningBank), and standardized communication protocols (MCP and A2A) to enable complex, multi-agent task execution.

Python 3.9+ License: MIT

Core Features

  • Agent Registry & Discovery: Agents can register their capabilities (tools and skills), send heartbeats to indicate they are online, and be discovered by other services based on their metadata, tags, or supported skills.
  • Intelligent Orchestration: Takes a high-level goal in natural language (e.g., "Analyze last quarter's sales data and generate a summary presentation") and automatically creates and executes a multi-step plan using the available tools from the agent network.
  • ReasoningBank - A Learning System: Agents can contribute their task experiences (both successes and failures). AiraHUB distills these experiences into generalized, reusable reasoning patterns, allowing the orchestrator and other agents to learn from the collective's past activities.
  • Standardized Communication:
    • MCP (Model Context Protocol): Implements a JSON-RPC 2.0 based streamable HTTP endpoint for robust, bi-directional tool calls between the hub and agents.
    • A2A (Agent-to-Agent): Facilitates direct, task-based communication between registered agents, brokered by the hub.
  • Pluggable Architecture:
    • Flexible Storage: Built on an abstract storage layer, defaulting to a simple, file-based SQLite database (no external DB required!), but can be extended to support other databases.
    • Flexible AI Backends: Uses LiteLLM to support a wide range of LLMs (OpenAI, Anthropic, Google, OpenRouter, etc.) for planning and reasoning. Supports both API-based and local (sentence-transformers) models for generating vector embeddings.
  • Universal Adapter: Provides a simplified MCP facade that exposes the full orchestration engine as a single tool for external clients (such as LLM front-ends).
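
As a concrete illustration of the registry flow described above, the sketch below assembles a registration payload and POSTs it to the hub's /register endpoint. The field names (`name`, `url`, `tools`, `tags`) are illustrative assumptions drawn from the feature list, not the hub's confirmed schema.

```python
# Sketch of agent registration (field names are illustrative assumptions,
# not the hub's confirmed schema).
import json
from urllib import request

def build_registration_payload(name: str, url: str,
                               tools: list, tags: list) -> dict:
    """Assemble the metadata an agent advertises to the hub."""
    return {
        "name": name,
        "url": url,      # where the hub can reach this agent
        "tools": tools,  # capabilities discoverable by other services
        "tags": tags,    # free-form labels used for discovery
    }

def register(hub_base_url: str, payload: dict):
    """POST the payload to the hub's /register endpoint."""
    req = request.Request(
        f"{hub_base_url}/register",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return request.urlopen(req, timeout=10)

payload = build_registration_payload(
    name="Agent-Search",
    url="http://localhost:9001",
    tools=[{"name": "find_product", "description": "Look up a product"}],
    tags=["search", "catalog"],
)
# register("http://localhost:8017", payload)  # requires a running hub
```

A real agent would repeat a similar call periodically as its heartbeat so the hub keeps marking it as online.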

Architectural Overview

AiraHUB is built with a modular, service-oriented architecture using FastAPI.

```
+------------------+      +-------------------------------------------------------+      +------------------+
|                  |      |                        AiraHUB                        |      |                  |
|  External Client |----->|  Adapter (Simplified MCP Facade)                      |      |   Agent Alpha    |
| (e.g., LLM UI)   |      |                                                       |      | (Weather Tool)   |
+------------------+      |-------------------------------------------------------|      +--------^---------+
                          |                 Orchestration Service                 |               |
                          | (Receives Goal -> Creates Plan -> Executes Plan)      |               | MCP
                          |             |                           ^             |               |
                          |             v                           |             |               v
                          |  +-------------------+   +----------------------+     |      +------------------+
                          |  | Reasoning Service |   |     Agent Service    |     |      |                  |
                          |  | (ReasoningBank)   |   | (Tool & Agent Cache) |     |----->|   Agent Beta     |
                          |  +-------------------+   +----------------------+     |      | (Database Tool)  |
                          |             ^                           ^             |      +------------------+
                          |             |                           |             |
                          |  +-------------------------------------------------+  |
                          |  |     Abstract Storage Layer (Default: SQLite)    |  |
                          |  +-------------------------------------------------+  |
                          +-------------------------------------------------------+
```
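
The abstract storage layer at the bottom of the diagram can be pictured as a small interface that the default SQLite backend implements and that other databases could implement too. The method names below are illustrative assumptions, not the project's actual API.

```python
# Illustrative sketch of the abstract storage layer; method names are
# assumptions, not the project's actual interface.
from abc import ABC, abstractmethod
from typing import Optional

class AbstractStorage(ABC):
    """Interface the hub's services program against; SQLite is one backend."""

    @abstractmethod
    def save_agent(self, agent_id: str, metadata: dict) -> None: ...

    @abstractmethod
    def get_agent(self, agent_id: str) -> Optional[dict]: ...

class InMemoryStorage(AbstractStorage):
    """Trivial backend, handy for tests; the shipped default is SQLite."""

    def __init__(self) -> None:
        self._agents: dict = {}

    def save_agent(self, agent_id: str, metadata: dict) -> None:
        self._agents[agent_id] = metadata

    def get_agent(self, agent_id: str) -> Optional[dict]:
        return self._agents.get(agent_id)

store = InMemoryStorage()
store.save_agent("agent-alpha", {"tools": ["get_weather"]})
```

Keeping the services coded against the interface rather than SQLite directly is what makes the storage layer pluggable.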

Getting Started

Prerequisites

  • Python 3.9 or higher
  • An API key for an LLM provider supported by LiteLLM (e.g., OpenAI, Anthropic via OpenRouter).

1. Installation

  1. Clone the repository:

    git clone https://github.com/IhateCreatingUserNames2/AiraHUB3.git
    cd AiraHUB3
  2. Create and activate a virtual environment:

    python -m venv .venv
    # On Windows
    # .\.venv\Scripts\activate
    # On macOS/Linux
    source .venv/bin/activate
  3. Install the dependencies: The project's requirements.txt should contain the packages below. Create the file, then run the install command.

    requirements.txt:

    fastapi
    uvicorn[standard]
    pydantic
    python-dotenv
    httpx
    aiosqlite
    litellm
    sentence-transformers
    numpy
    

    Install command:

    pip install -r requirements.txt

2. Configuration

AiraHUB is configured using environment variables. Create a .env file in the root directory by copying the example below.

.env file:

# --- LLM Configuration (uses LiteLLM) ---
# Set the model for planning and reasoning distillation. OpenRouter is a good choice.
LITELLM_MODEL_INFERENCE="openrouter/anthropic/claude-3-haiku"

# Provide your API key for the service you're using.
# For OpenRouter, get it from https://openrouter.ai/keys
OPENROUTER_API_KEY="sk-or-v1-..."

# --- Embedding Model Configuration ---
# Provider can be "sentence_transformers" (local, free) or "litellm" (API-based).
# If you have a GPU or a decent CPU, sentence_transformers is recommended.
EMBEDDING_PROVIDER="sentence_transformers"

# If using sentence_transformers, specify the model name. all-MiniLM-L6-v2 is a good lightweight default.
# If using litellm, specify a model like "text-embedding-ada-002" and provide the corresponding API key (e.g., OPENAI_API_KEY).
LITELLM_MODEL_EMBEDDING="all-MiniLM-L6-v2"

# --- Server Configuration ---
HOST="0.0.0.0"
PORT="8017"
DEBUG="false" # Set to "true" for more verbose logging and auto-reload
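
A minimal sketch of how these variables might be read at startup, using plain `os.environ` lookups with the defaults from the example `.env` above. The real project may well use a different settings mechanism (e.g. pydantic settings); the variable names themselves come from the example file.

```python
# Sketch of settings loading; variable names match the example .env above,
# but the project's actual settings mechanism may differ.
import os

def load_settings(env=None) -> dict:
    """Read AiraHUB settings, falling back to the documented defaults."""
    if env is None:
        env = os.environ
    return {
        "inference_model": env.get("LITELLM_MODEL_INFERENCE",
                                   "openrouter/anthropic/claude-3-haiku"),
        "embedding_provider": env.get("EMBEDDING_PROVIDER",
                                      "sentence_transformers"),
        "embedding_model": env.get("LITELLM_MODEL_EMBEDDING",
                                   "all-MiniLM-L6-v2"),
        "host": env.get("HOST", "0.0.0.0"),
        "port": int(env.get("PORT", "8017")),
        "debug": env.get("DEBUG", "false").lower() == "true",
    }

settings = load_settings({})  # no overrides -> documented defaults
```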

3. Running the Application

Once your .env file is configured, start the server using Uvicorn:

uvicorn main:app --host 0.0.0.0 --port 8017 --reload
  • --reload enables auto-reloading for development, which is useful if you set DEBUG="true".

You should see log output indicating that the server has started and connected to the SQLite database.

INFO:     Started server process [12345]
INFO:     Waiting for application startup.
INFO:     AIRA Hub starting its lifecycle (startup)...
INFO:     SQLiteStorage initialized with database file: airahub.db
INFO:     SQLite tables ensured (agents, reasoning_items, a2a_tasks).
INFO:     Storage successfully initialized using SQLite at 'airahub.db'.
...
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8017 (Press CTRL+C to quit)

You can now access the interactive API documentation at http://localhost:8017/docs.

API Overview

AiraHUB exposes a set of modular endpoints, including /register for agent registration and heartbeats, /orchestrate for goal-driven planning and execution, the MCP streamable HTTP endpoint for tool calls, and A2A task brokering between agents.
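
Since the MCP endpoint speaks JSON-RPC 2.0, a tool call between the hub and an agent can be pictured as the envelope built below. The `tools/call` method name follows the MCP convention; treat the exact shape as illustrative rather than a transcript of AiraHUB's traffic.

```python
# Sketch of a JSON-RPC 2.0 envelope for an MCP-style tool call.
# The tools/call method follows the MCP convention; details are illustrative.
import itertools
import json

_ids = itertools.count(1)  # JSON-RPC requests need unique ids

def mcp_tool_call(tool_name: str, arguments: dict) -> str:
    """Serialize a tools/call request the hub could send to an agent."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

envelope = mcp_tool_call("find_product", {"query": "Quantum X1 laptop"})
```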

Example Workflow: Goal Orchestration

This example demonstrates the full power of AiraHUB.

  1. Agent Registration: Two agents start up and register themselves with the Hub by POSTing to /register.

    • Agent-Search registers a tool named find_product(query: str).
    • Agent-Analysis registers a tool named summarize_data(data: dict).
  2. User Request: A user sends a high-level goal to AiraHUB's orchestration endpoint.

    • POST /orchestrate
    • Body: { "goal": "Find the new 'Quantum X1' laptop and summarize its specifications." }
  3. Planning Phase:

    • The Orchestration Service receives the goal.
    • It queries the Reasoning Service for similar past tasks, retrieving strategies like "For product searches, always use the find_product tool first."
    • It queries the Agent Service for all available tools.
    • It constructs a detailed prompt containing the goal, retrieved reasoning, and the list of tools, and sends it to the configured LLM.
    • The LLM returns a structured, multi-step plan:
      [
        { "step": 1, "tool_name": "find_product", "arguments": { "query": "Quantum X1 laptop" }, "reasoning": "..." },
        { "step": 2, "tool_name": "summarize_data", "arguments": { "data": "{result_from_step_1}" }, "reasoning": "..." }
      ]
  4. Execution Phase:

    • The Orchestrator executes Step 1: It finds Agent-Search via the Agent Service and calls its find_product tool using the MCP protocol.
    • Agent-Search returns the product specifications as JSON.
    • The Orchestrator executes Step 2: It calls Agent-Analysis's summarize_data tool, passing the result from the first step.
    • Agent-Analysis returns a concise text summary.
  5. Response:

    • AiraHUB returns the final summary to the user, along with the full execution log and the plan that was created.
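
The execution phase above, including the `{result_from_step_1}` placeholder in the plan, can be sketched as a small loop that substitutes earlier results into later arguments. Tool dispatch is stubbed here with local functions standing in for Agent-Search and Agent-Analysis; the real hub would route each call to an agent over MCP.

```python
# Sketch of the execution phase: run plan steps in order, substituting
# "{result_from_step_N}" placeholders with earlier results. Tools are
# stubbed locally; the real hub dispatches them to agents over MCP.

def execute_plan(plan: list, tools: dict) -> dict:
    results: dict = {}
    for step in sorted(plan, key=lambda s: s["step"]):
        args = {}
        for key, value in step["arguments"].items():
            if isinstance(value, str) and value.startswith("{result_from_step_"):
                ref = int(value.strip("{}").rsplit("_", 1)[1])
                value = results[ref]  # splice in the earlier step's output
            args[key] = value
        results[step["step"]] = tools[step["tool_name"]](**args)
    return results

# Stub tools standing in for Agent-Search and Agent-Analysis:
tools = {
    "find_product": lambda query: {"name": query, "ram": "32GB"},
    "summarize_data": lambda data: f"{data['name']} has {data['ram']} of RAM.",
}
plan = [
    {"step": 1, "tool_name": "find_product",
     "arguments": {"query": "Quantum X1 laptop"}},
    {"step": 2, "tool_name": "summarize_data",
     "arguments": {"data": "{result_from_step_1}"}},
]
results = execute_plan(plan, tools)
```

The final entry in `results` is what the hub would hand back to the user, alongside the plan and the execution log.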

Future Improvements

Module 5: The Agent Marketplace (The Economic Platform)

  • Vision: To transform AiraHub from technical infrastructure into a vibrant economic ecosystem. This module would let third-party developers not only register but also host, manage, and monetize their agents directly on the AiraHub platform, making AiraHub, in effect, an app store for AI agents.

Module 6: Collaboration & Social Features (The Community Platform)

  • Vision: To evolve AiraHub beyond a transactional request/response system into a platform for real-time, interactive collaboration between humans and agents, and among humans through agents. This module turns AiraHub into the "Twitch" or "Figma" for AI agents.

Contributing

Contributions are welcome! Please feel free to open an issue to report a bug or suggest a feature, or submit a pull request with your improvements.

License

This project is licensed under the MIT License. See the LICENSE file for details.
