Does Tavily support OpenAI function calling in Python?

Do you have to write a JSON schema yourself in tools and pass it to the LLM? The Tavily search function has so many parameters that it takes a lot of effort to write the schema.
As far as I know, if you import Tavily into LangChain, you don't need to do this.

Hi there,

Thanks for reaching out.

You don’t have to manually write the entire JSON schema for every Tavily function. We’ve provided ready-to-use schemas for search, extract, map, and crawl so you can just copy-paste them into your workflow. This saves time and lets you expose only the parameters you care about.
Two easy ways to do this:

  • Copy from our docs – just grab the ready-made schema from the Tavily docs and drop it into your tools definition. (Note: The schemas we provide are designed for the OpenAI Responses API, but we’ve also added guidance in the docs on how to adapt them for the Chat Completions API.)

  • Use the OpenAI Agents SDK – skip schema definitions entirely by wrapping tavily.search() with @function_tool, and the SDK handles the JSON schema automatically. (We’ve added an example for the Agents SDK in the docs as well.)
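For the copy-paste route, a tool definition for Tavily search might look like the following minimal sketch in the Chat Completions tools format. The parameter names here (query, search_depth, max_results) match tavily-python's search arguments, but this is a trimmed-down illustration, not the full schema from our docs:

```python
import json

# Minimal, hand-trimmed tool schema for Tavily search, exposing only a
# few parameters (Chat Completions "tools" format). A sketch only --
# copy the complete schema from the Tavily docs for production use.
tavily_search_tool = {
    "type": "function",
    "function": {
        "name": "tavily_search",
        "description": "Run a Tavily web search and return results.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {
                    "type": "string",
                    "description": "The search query.",
                },
                "search_depth": {
                    "type": "string",
                    "enum": ["basic", "advanced"],
                    "description": "How thorough the search should be.",
                },
                "max_results": {
                    "type": "integer",
                    "description": "Maximum number of results to return.",
                },
            },
            "required": ["query"],
        },
    },
}

# You would then pass it to the model, e.g.:
# response = client.chat.completions.create(
#     model="gpt-4o-mini",
#     messages=messages,
#     tools=[tavily_search_tool],
# )

# Sanity check that the definition serializes cleanly to JSON:
serialized = json.dumps(tavily_search_tool)
```

The point of the copy-paste approach is exactly this: you only list the parameters you want the model to control, and everything else keeps its default on the Tavily side.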

Please refer to our docs: https://docs.tavily.com/documentation/integrations/openai
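To see why the Agents SDK route saves you the schema work, here is a rough, standard-library-only illustration of what a decorator like @function_tool does under the hood: it reads the wrapped function's signature and type hints and builds the JSON schema for you. (The real SDK does much more, e.g. docstring parsing and nested types; the helper and wrapper names below are hypothetical.)

```python
import inspect
import typing

# Map a few Python annotations to JSON Schema types.
_PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}

def schema_from_signature(fn):
    """Derive a minimal JSON schema from a function's signature,
    roughly as an @function_tool-style decorator would."""
    hints = typing.get_type_hints(fn)
    sig = inspect.signature(fn)
    props, required = {}, []
    for name, param in sig.parameters.items():
        props[name] = {"type": _PY_TO_JSON.get(hints.get(name, str), "string")}
        if param.default is inspect.Parameter.empty:
            # No default value means the model must supply this argument.
            required.append(name)
    return {
        "name": fn.__name__,
        "parameters": {"type": "object", "properties": props, "required": required},
    }

# A thin wrapper like the one you would decorate with @function_tool:
def tavily_search(query: str, max_results: int = 5) -> dict:
    """Search the web with Tavily (the real wrapper would call tavily.search())."""
    raise NotImplementedError  # placeholder body for illustration

generated = schema_from_signature(tavily_search)
```

Because the schema is derived from the signature, changing the wrapper's parameters automatically changes what the model sees, with no schema to keep in sync by hand.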

Best,
May