
Feature request: list structured output support on openrouter.ai/api/v1/models #20

Open
simonw opened this issue Mar 10, 2025 · 2 comments


simonw commented Mar 10, 2025

I just added schema support to my CLI tool for accessing models.

I don't have a good way to tell which models support schemas programmatically, so I had to warn people in the README that some models might not work: https://github.com/simonw/llm-openrouter/blob/0.4/README.md#schemas

If this API endpoint https://openrouter.ai/api/v1/models included an indication of whether each model supports structured output, I could use that information in my own tool.

It currently looks like this:

    {
      "id": "google/gemini-2.0-flash-lite-001",
      "name": "Google: Gemini 2.0 Flash Lite",
      "created": 1740506212,
      "description": "Gemini 2.0 Flash Lite offers a significantly faster time to first token (TTFT) compared to [Gemini Flash 1.5](/google/gemini-flash-1.5), while maintaining quality on par with larger models like [Gemini Pro 1.5](/google/gemini-pro-1.5), all at extremely economical token prices.",
      "context_length": 1048576,
      "architecture": {
        "modality": "text+image->text",
        "tokenizer": "Gemini",
        "instruct_type": null
      },
      "pricing": {
        "prompt": "0.000000075",
        "completion": "0.0000003",
        "image": "0",
        "request": "0",
        "input_cache_read": "0",
        "input_cache_write": "0",
        "web_search": "0",
        "internal_reasoning": "0"
      },
      "top_provider": {
        "context_length": 1048576,
        "max_completion_tokens": 8192,
        "is_moderated": false
      },
      "per_request_limits": null
    }
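For reference, pulling this list down and indexing it by model id can be sketched like this. It assumes the endpoint wraps the entries in a top-level `"data"` array, which is what the response shape above suggests:

```python
import json

def index_models(body: str) -> dict:
    """Parse a /api/v1/models response body and index entries by model id.

    Assumes the endpoint wraps the list in a top-level "data" array.
    """
    return {m["id"]: m for m in json.loads(body)["data"]}

# Example with the entry shown above, trimmed to a couple of fields:
body = '{"data": [{"id": "google/gemini-2.0-flash-lite-001", "context_length": 1048576}]}'
models = index_models(body)
```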

yogasanas commented Mar 10, 2025

This isn't well documented, but you can use the `supported_parameters` query parameter to filter for models that support `json_schema`. Example: https://openrouter.ai/api/v1/models?supported_parameters=structured_outputs

However, note that even for models that do support structured outputs, some providers don't, and we might fall back to `json_object` for them.

To force OpenRouter to route only to providers that support `json_schema`, you also have to set `require_parameters=true` in the `provider` object (see the documentation).
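A minimal sketch of the request body this implies, assuming the standard chat completions payload shape; the model name and schema are illustrative, and the key part is the `provider` object:

```python
# Sketch: a chat completions request that forces OpenRouter to route only
# to providers supporting json_schema. Model, messages, and schema below
# are illustrative placeholders.
payload = {
    "model": "google/gemini-2.0-flash-lite-001",
    "messages": [{"role": "user", "content": "Name one dog breed."}],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "dog",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": {"breed": {"type": "string"}},
                "required": ["breed"],
            },
        },
    },
    # Without this, OpenRouter may route to a provider that silently
    # falls back to json_object instead of enforcing the schema.
    "provider": {"require_parameters": True},
}
```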

Let me know if this solves your use case.


simonw commented Mar 10, 2025

Thank you! That's really helpful.

It would be great if that supported_parameters information was available in the JSON list of models too. As it stands I'm going to have to fetch and cache two JSON files - this one https://openrouter.ai/api/v1/models and also this one https://openrouter.ai/api/v1/models?supported_parameters=structured_outputs - then compare the two at runtime to decide which models support which features.
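The comparison step can be sketched like this, assuming both cached responses use the `{"data": [...]}` envelope shown earlier:

```python
def structured_output_ids(all_models: dict, structured_models: dict) -> set:
    """Compare the two cached /api/v1/models responses and return the ids
    of models that appear in the structured_outputs-filtered list.

    Assumes both responses use the {"data": [...]} envelope.
    """
    all_ids = {m["id"] for m in all_models["data"]}
    filtered_ids = {m["id"] for m in structured_models["data"]}
    # Intersect so a stale filtered cache can't report models that no
    # longer exist in the full list.
    return all_ids & filtered_ids

# Tiny illustration with made-up cache contents:
full = {"data": [{"id": "a/one"}, {"id": "b/two"}]}
filtered = {"data": [{"id": "a/one"}]}
supported = structured_output_ids(full, filtered)
```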
