cascadeflow integrates with LangChain through a callback handler that wraps any BaseChatModel. It keeps the product direction intact inside LangChain and LangGraph: decisions happen inside agent execution, with budgets, traces, and runtime policy visible where the workflow actually runs.
Install
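The install command below is a sketch; the exact package name and any LangChain extra are assumptions, so check the project's package index entry.

```shell
# Assumed package name; the LangChain integration may ship as an extra.
pip install cascadeflow
```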
Quick Start
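The actual quick-start snippet is missing from this page, so the following is a self-contained sketch of the cascade pattern the wrapper implements: try a cheap drafter model first, score the draft, and escalate to a stronger model only when the draft fails the quality check. Every name here (`CascadeSketch`, `quality`, the toy models) is illustrative, not the library's real API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class CascadeSketch:
    """Illustrative cascade: accept a cheap draft or escalate (not the real API)."""
    drafter: Callable[[str], str]    # cheap model
    verifier: Callable[[str], str]   # expensive fallback model
    quality: Callable[[str], float]  # scores a draft in [0, 1]
    threshold: float = 0.7

    def invoke(self, prompt: str) -> dict:
        draft = self.drafter(prompt)
        score = self.quality(draft)
        if score >= self.threshold:
            return {"output": draft, "modelUsed": "drafter",
                    "cascade_decision": "accepted", "drafterQuality": score}
        return {"output": self.verifier(prompt), "modelUsed": "verifier",
                "cascade_decision": "escalated", "drafterQuality": score}

# Toy stand-ins for chat models, wired into the cascade.
cascade = CascadeSketch(
    drafter=lambda p: p.upper(),
    verifier=lambda p: f"verified: {p}",
    quality=lambda draft: 0.9 if len(draft) < 20 else 0.2,
)

print(cascade.invoke("hi"))                         # short draft passes: drafter accepted
print(cascade.invoke("a much longer prompt here"))  # draft fails: escalated to verifier
```

In the real integration the drafter and verifier would be LangChain chat models and the decision would surface in traces rather than the return value.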
Features
- Full LCEL support (pipes, sequences, batch)
- Streaming with pre-routing
- Tool calling and structured output
- LangSmith cost tracking metadata
- Cost tracking callbacks
- Domain policies with `cascadeflow_domain` metadata
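The `cascadeflow_domain` metadata key comes from the feature list above; the policy table and lookup helper below are illustrative assumptions showing how a LangChain-style config dict could carry that key to select a per-domain routing policy.

```python
# Hypothetical per-domain policies; only the metadata key name is from this page.
DOMAIN_POLICIES = {
    "support": {"max_cost_usd": 0.01, "drafter": "small-model"},
    "legal":   {"max_cost_usd": 0.10, "drafter": "large-model"},
}

def policy_for(config: dict) -> dict:
    """Pick a policy from run metadata, the way a LangChain config carries it."""
    domain = config.get("metadata", {}).get("cascadeflow_domain", "support")
    return DOMAIN_POLICIES[domain]

config = {"metadata": {"cascadeflow_domain": "legal"}}
print(policy_for(config))  # -> {'max_cost_usd': 0.1, 'drafter': 'large-model'}
```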
Why This Integration Matters
- Keeps LangChain apps framework-native instead of forcing a proxy hop
- Makes runtime cost, latency, and trace data visible at the chain or agent level
- Lets teams move from observability to governance without rewriting chain logic
Cost Tracking Callback
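The callback snippet itself is missing from this page, so below is a self-contained sketch of what a cost-tracking callback could look like, modeled loosely on LangChain's callback style. The class name, hook name, and price table are all assumptions, not the library's API.

```python
class CostTrackingCallbackSketch:
    """Accumulates token spend per model; hypothetical prices and hook name."""
    PRICE_PER_1K = {"drafter": 0.0005, "verifier": 0.0150}  # assumed USD per 1K tokens

    def __init__(self):
        self.total_cost = 0.0
        self.calls = []

    def on_llm_end(self, model: str, total_tokens: int) -> None:
        # Record the cost of one completed LLM call.
        cost = total_tokens / 1000 * self.PRICE_PER_1K[model]
        self.total_cost += cost
        self.calls.append({"model": model, "tokens": total_tokens, "cost": cost})

cb = CostTrackingCallbackSketch()
cb.on_llm_end("drafter", 2000)   # 2000 tokens on the cheap model
cb.on_llm_end("verifier", 1000)  # 1000 tokens on the fallback
print(round(cb.total_cost, 4))   # -> 0.016
```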
LangSmith Integration
When LangSmith tracing is enabled, cascadeflow adds metadata to runs:
- `cascade_decision`: whether the drafter was accepted
- `modelUsed`: which model produced the final response
- `drafterQuality`: quality score from validation
- `savingsPercentage`: cost savings achieved
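The four metadata keys above are from this page; the helper below is an illustrative sketch of how such a record might be assembled. The per-call costs and the savings formula (measured against always calling the expensive model) are assumptions.

```python
def run_metadata(accepted: bool, drafter_quality: float,
                 drafter_cost: float, verifier_cost: float) -> dict:
    """Build a LangSmith-style metadata dict using the keys this page lists.

    Costs are hypothetical per-call USD figures; an escalated run pays for
    both the rejected draft and the fallback call.
    """
    actual = drafter_cost if accepted else drafter_cost + verifier_cost
    savings = (1 - actual / verifier_cost) * 100  # vs. always using the big model
    return {
        "cascade_decision": "accepted" if accepted else "escalated",
        "modelUsed": "drafter" if accepted else "verifier",
        "drafterQuality": drafter_quality,
        "savingsPercentage": round(savings, 1),
    }

print(run_metadata(True, 0.92, drafter_cost=0.001, verifier_cost=0.015))
```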