Models & Providers

Configure LLM models from Gemini, OpenAI, Anthropic, OpenRouter, and other providers with ADK-TS agents

ADK-TS provides flexible model integration, allowing you to use various Large Language Models (LLMs) with your agents. The framework defaults to Google Gemini models but supports extensive customization through multiple approaches.

Model Integration Options

ADK-TS supports multiple ways to configure models:

🎯 Option 1: Direct Model Names

Pass model names directly to agents. Gemini is the default; other providers require environment configuration

🔌 Option 2: Vercel AI SDK

Use model instances from Vercel AI SDK for extensive provider support

🌐 Option 3: OpenRouter

Access 200+ models from multiple providers through a single unified API

Option 1: Direct Model Names

The simplest approach: pass model names as strings directly to your agents. ADK-TS defaults to Gemini models but supports other providers when properly configured.

Default: Google Gemini Models (Easiest Setup)

For Gemini models (default), you only need to set the API key:

# .env file
GOOGLE_API_KEY=your_google_api_key_here

That's it! You can now use Gemini models with agents. The framework defaults to gemini-2.0-flash:

import { LlmAgent } from "@iqai/adk";

// Uses default Gemini model (gemini-2.0-flash)
const agent = new LlmAgent({
  name: "my_agent",
  description: "An agent using the default Gemini model",
  instruction: "You are a helpful assistant",
});

// Use a different Gemini model
const advancedAgent = new LlmAgent({
  name: "advanced_agent",
  description: "Using a more powerful Gemini model",
  model: "gemini-2.5-pro", // Just pass the model name
  instruction: "You are an expert analyst",
});

export { agent, advancedAgent };

Using Other Providers or Different Gemini Models

To use non-Gemini models or change the default Gemini model, you must configure both the model name and API key:

1. Set both the model name and corresponding API key in your .env file:

# .env file
# For OpenAI:
LLM_MODEL=gpt-4o
OPENAI_API_KEY=your_openai_api_key_here

# Or for Claude:
LLM_MODEL=claude-sonnet-4-5-20250929
ANTHROPIC_API_KEY=your_anthropic_api_key_here

# Or for Groq:
LLM_MODEL=llama-3.3-70b-versatile
GROQ_API_KEY=your_groq_api_key_here

# Or for a different Gemini model:
LLM_MODEL=gemini-2.5-pro
GOOGLE_API_KEY=your_google_api_key_here

2. Use the model in your agents:

import { LlmAgent } from "@iqai/adk";

const { LLM_MODEL } = process.env;

// Using environment-configured model
const agent = new LlmAgent({
  name: "my_agent",
  model: LLM_MODEL, // Will use whatever is set in .env
  instruction: "You are a helpful assistant",
});

// Or directly specify a model name
const openAiAgent = new LlmAgent({
  name: "openai_agent",
  model: "gpt-4o", // Direct model name
  instruction: "You are an expert assistant",
});

export { agent, openAiAgent };

How It Works

The framework automatically detects which LLM provider to use from the model name you pass. Set the API key for that provider and pass the model name; the framework handles the rest.

  • Default Gemini: Only need GOOGLE_API_KEY (framework defaults to gemini-2.0-flash)
  • Different model: Set the corresponding API key and pass the model name (e.g., model: "gpt-4o", model: "claude-3-5-sonnet-20241022")
  • Provider detection: Framework automatically recognizes OpenAI, Claude, Groq, and other providers from the model name
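Conceptually, this detection can be thought of as prefix matching on the model name. The sketch below is purely illustrative — `detectProvider` is not an actual ADK-TS export, and the framework's real rules may differ:

```typescript
// Hypothetical sketch of model-name-based provider detection.
// Not an ADK-TS API; shown only to illustrate the idea.
type Provider = "google" | "openai" | "anthropic" | "groq" | "unknown";

function detectProvider(model: string): Provider {
  if (model.startsWith("gemini")) return "google";
  if (model.startsWith("gpt")) return "openai";
  if (model.startsWith("claude")) return "anthropic";
  if (model.startsWith("llama") || model.startsWith("mixtral")) return "groq";
  return "unknown";
}

console.log(detectProvider("gemini-2.0-flash")); // "google"
console.log(detectProvider("gpt-4o")); // "openai"
console.log(detectProvider("claude-sonnet-4-5-20250929")); // "anthropic"
```

Because detection keys off the model name alone, passing `model: "gpt-4o"` with no `OPENAI_API_KEY` set will fail at request time, so always pair the model name with its provider's key.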

Option 2: Vercel AI SDK Integration

For more control and advanced features, use model instances from the Vercel AI SDK. This approach provides access to multiple providers with consistent APIs and advanced capabilities.

Setup Requirements

1. Install Provider Packages:

# Install the providers you want to use
npm install @ai-sdk/openai      # For OpenAI models
npm install @ai-sdk/anthropic   # For Anthropic models
npm install @ai-sdk/mistral     # For Mistral models

2. Configure API Keys:

# .env file
OPENAI_API_KEY=your_openai_api_key_here
ANTHROPIC_API_KEY=your_anthropic_api_key_here
MISTRAL_API_KEY=your_mistral_api_key_here

3. Use Model Instances:

import { LlmAgent } from "@iqai/adk";
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";
import { mistral } from "@ai-sdk/mistral";

// OpenAI models
const gpt4Agent = new LlmAgent({
  name: "gpt4_agent",
  description: "GPT-4 powered assistant",
  model: openai("gpt-4o"),
  instruction: "You are a helpful assistant",
});

// Anthropic models
const claudeAgent = new LlmAgent({
  name: "claude_agent",
  description: "Claude powered assistant",
  model: anthropic("claude-3-5-sonnet-20241022"),
  instruction: "You are a helpful assistant",
});

// Mistral models
const mistralAgent = new LlmAgent({
  name: "mistral_agent",
  description: "Mistral powered assistant",
  model: mistral("mistral-large-latest"),
  instruction: "You are a helpful assistant",
});

Supported Providers

🤖 OpenAI

GPT-4o, GPT-4, GPT-3.5, and latest ChatGPT models

🧠 Anthropic

Claude 3.5 Sonnet, Claude 3 Opus, and Haiku models

🔥 Mistral

Mistral Large, Codestral, and specialized models

⚡ Groq

Ultra-fast inference for Llama, Mixtral, and Gemma models

🌐 Many Others

Google, Perplexity, Cohere, and other providers

The Vercel AI SDK supports many more providers beyond what's shown here. Check the official documentation for the complete list of supported providers and models.

Local & Open Source Models

Local and open source models (like Ollama or other self-hosted models) are also supported through the Vercel AI SDK approach. Install a suitable provider package (e.g., the community ollama-ai-provider package) and configure it as needed. Note that not all local models support function calling reliably.
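As a sketch, a locally hosted model could be wired in as follows. This assumes the community ollama-ai-provider package and an Ollama server running locally with the model already pulled; the package and model name are assumptions, so adjust them to your setup:

```typescript
import { LlmAgent } from "@iqai/adk";
// Community Ollama provider; install with: npm install ollama-ai-provider
import { ollama } from "ollama-ai-provider";

// Assumes an Ollama server on its default local port with "llama3.2" pulled
const localAgent = new LlmAgent({
  name: "local_agent",
  description: "Agent backed by a locally hosted model",
  model: ollama("llama3.2"),
  instruction: "You are a helpful assistant",
});

export { localAgent };
```

This keeps all inference on your own machine, which is useful for privacy-sensitive workloads, at the cost of the caveats around function calling noted above.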

Option 3: OpenRouter Integration

OpenRouter provides unified access to 200+ models from multiple providers through a single API. This is ideal when you want to switch between different models without managing multiple API keys or want access to models with special capabilities like web search.

Why Use OpenRouter?

  • Single API Key: Access models from OpenAI, Anthropic, Google, Meta, and more with one key
  • Cost Optimization: Compare pricing and switch between models easily
  • Specialized Models: Access models with unique capabilities (e.g., Perplexity's web-search enabled models)
  • Simplified Billing: One invoice for all your model usage across providers

Setup Requirements

1. Install OpenRouter Provider:

npm install @openrouter/ai-sdk-provider

2. Configure Environment Variables:

# .env file
OPENROUTER_API_KEY=your_openrouter_api_key_here
OPENROUTER_BASE_URL=https://openrouter.ai/api/v1  # Optional, this is the default
LLM_MODEL=google/gemini-2.5-flash                  # Model to use (with provider prefix)

Get Your OpenRouter API Key

Sign up at openrouter.ai to get your API key. You'll get free credits to start, and pay-as-you-go pricing afterwards.

3. Create OpenRouter Helper:

// lib/helpers/open-router.ts
import { createOpenRouter } from "@openrouter/ai-sdk-provider";

export const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
  baseURL: process.env.OPENROUTER_BASE_URL,
});

4. Use with Agents:

import { LlmAgent } from "@iqai/adk";
import { openrouter } from "./lib/helpers/open-router";

// Using environment-configured model
const agent = new LlmAgent({
  name: "my_agent",
  description: "An agent using OpenRouter",
  model: openrouter(process.env.LLM_MODEL), // e.g., "google/gemini-2.5-flash"
  instruction: "You are a helpful assistant",
});

// Using a specific model directly
const claudeAgent = new LlmAgent({
  name: "claude_agent",
  description: "Claude via OpenRouter",
  model: openrouter("anthropic/claude-sonnet-4-5-20250929"),
  instruction: "You are an expert analyst",
});

// Using Perplexity's online model with web search
const researchAgent = new LlmAgent({
  name: "research_agent",
  description: "Research agent with web search capabilities",
  model: openrouter("perplexity/sonar-pro"),
  instruction:
    "You are a research assistant with access to current web information",
});

Provider     Model Name                             Capabilities
Google       google/gemini-2.5-flash                Fast, versatile, cost-effective
Google       google/gemini-2.5-pro                  Advanced reasoning, multimodal
OpenAI       openai/gpt-4o                          Powerful general-purpose model
Anthropic    anthropic/claude-sonnet-4-5-20250929   Excellent for analysis and coding
Meta         meta-llama/llama-3.3-70b-instruct      Open source, strong performance
Perplexity   perplexity/sonar-pro                   Web search enabled, current info

Model Naming Convention

OpenRouter models use the format provider/model-name. Check the OpenRouter models page for the complete list and current pricing.

Which Option Should You Choose?

Use Case                   Recommended Option          Why
Getting Started            Option 1 (Gemini default)   Simple setup, just need GOOGLE_API_KEY
Production Apps            Option 1 with env config    Simple, reliable, fewer dependencies
Multi-Provider             Option 3 (OpenRouter)       Single API key, unified access to 200+ models
Cost Optimization          Option 3 (OpenRouter)       Easy model comparison and switching
Web Search / Online LLMs   Option 3 (OpenRouter)       Access to Perplexity and other online models
Advanced Features          Option 2 (Vercel AI SDK)    Streaming, advanced config, type safety
Local/Private Models       Option 2 (Vercel AI SDK)    Only option that supports local deployment

Combining Approaches

You can mix and match these options in the same application! Use OpenRouter for most agents, Vercel AI SDK for local models, and direct model names for simple Gemini usage.
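As a sketch of mixing approaches, the following wires one agent with a direct Gemini model name and another through OpenRouter in the same file (the model names are illustrative, and the OpenRouter helper matches the one from Option 3):

```typescript
import { LlmAgent } from "@iqai/adk";
import { createOpenRouter } from "@openrouter/ai-sdk-provider";

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

// Option 1: direct model name (needs only GOOGLE_API_KEY)
const chatAgent = new LlmAgent({
  name: "chat_agent",
  description: "Everyday conversational agent on the Gemini default path",
  model: "gemini-2.0-flash",
  instruction: "You are a friendly assistant",
});

// Option 3: OpenRouter for a web-search-capable model
const researchAgent = new LlmAgent({
  name: "research_agent",
  description: "Research agent routed through OpenRouter",
  model: openrouter("perplexity/sonar-pro"),
  instruction: "You are a research assistant with access to current web information",
});

export { chatAgent, researchAgent };
```

Each agent resolves its model independently, so the options coexist without extra configuration beyond the API keys each one needs.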

Next Steps