
feat(gemini): add Google Gemini provider #1592

@bug-ops


Summary

Add Google Gemini as a first-class LLM provider in zeph-llm, supporting chat, streaming, tool use, vision, and embeddings.

Motivation

Google Gemini offers competitive models (2.5-pro, 2.5-flash, 2.0-flash) with 1M token context windows, native function calling, vision, and thinking capabilities. Adding Gemini as a provider expands Zeph's multi-provider coverage and enables orchestrator/router configurations that include Google's model family.

API Reference

Key API Characteristics

  • Base URL: https://generativelanguage.googleapis.com/v1beta
  • Auth: API key via ?key= query parameter
  • Message format: contents[].parts[] (not OpenAI-compatible)
  • System prompt: separate systemInstruction field
  • Role names: user / model (not assistant)
  • Streaming: ?alt=sse query param (SSE)
  • Tool format: functionDeclarations / functionCall / functionResponse
  • Embeddings: embedContent endpoint with task type parameter
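The mapping above can be sketched as a small translation step. The sketch below is illustrative only, assuming a hypothetical `to_gemini_request` helper (not part of zeph-llm's actual API): it shows how an OpenAI-style message list would be reshaped into Gemini's `contents[].parts[]` format, with the system prompt lifted into `systemInstruction` and `assistant` renamed to `model`, plus the `?key=` URL construction.

```python
# Sketch of translating OpenAI-style chat messages into the Gemini
# request shape described above. Names are illustrative, not part of
# zeph-llm's real implementation.

BASE = "https://generativelanguage.googleapis.com/v1beta"


def to_gemini_request(messages: list[dict]) -> dict:
    """Map role/content messages to Gemini's contents[] + systemInstruction."""
    system_parts = []
    contents = []
    for msg in messages:
        if msg["role"] == "system":
            # Gemini takes the system prompt in a separate top-level field,
            # not as a message in the conversation.
            system_parts.append({"text": msg["content"]})
        else:
            # Gemini uses "model" where OpenAI-style APIs use "assistant".
            role = "model" if msg["role"] == "assistant" else "user"
            contents.append({"role": role, "parts": [{"text": msg["content"]}]})
    request = {"contents": contents}
    if system_parts:
        request["systemInstruction"] = {"parts": system_parts}
    return request


# Auth is a query parameter; streaming appends alt=sse as well.
api_key = "PLACEHOLDER"  # real key comes from the vault (ZEPH_GEMINI_API_KEY)
url = f"{BASE}/models/gemini-2.5-flash:generateContent?key={api_key}"
stream_url = f"{BASE}/models/gemini-2.5-flash:streamGenerateContent?alt=sse&key={api_key}"

req = to_gemini_request([
    {"role": "system", "content": "You are concise."},
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi!"},
])
```

Tool use follows the same pattern: `functionDeclarations` go in a top-level `tools` field, and tool results come back into `contents` as `functionResponse` parts rather than a dedicated `tool` role.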

Implementation Phases

Config Format

[llm]
provider = "gemini"

[llm.gemini]
model = "gemini-2.5-flash"
max_output_tokens = 8192
embedding_model = "text-embedding-004"

Vault key: ZEPH_GEMINI_API_KEY

Detailed Plan

See .local/plan/gemini-provider.md

Labels: epic (milestone-level tracking issue)