MCP Native · Linux Foundation Project

Open Context Engine
for AI Agents

From raw operational data to actionable context. The missing layer that tells AI how your systems actually work.

AI APPLICATIONS — Agents │ Copilots │ Assistants │ RAG
  ↓ TRANSPORT: MCP
SODA CONTEXTURE — OCS: Operational Context for AI Agents
  ↓
OPERATIONAL SYSTEMS — Prometheus │ Kubernetes │ S3 │ PostgreSQL │ SAP

AI can query your systems.
But at what cost?

MCP gives AI transport to your operational systems. But transport without context leads to guesswork, retries, and unreliable results.

Accuracy

AI guesses instead of knowing. Wrong table names, incorrect query syntax, missed relationships.

→ Queries `order` table instead of `orders`. Returns nothing.

Consistency

Different tools interpret the same data differently. No shared definitions or thresholds.

→ Tool A says CPU is "high" at 80%. Tool B says 60%.

Latency

Rediscovers schema on every request. Metadata fetch plus retries on every query.

→ 2-5 seconds added to every query.

Cost

Tokens spent on trial-and-error. Embedding entire schemas in prompts. Retry storms.

→ Embedding the full schema in every prompt. Costs add up fast.

Scale

Custom context wrappers per system. Maintenance burden grows linearly with the number of systems.

→ Works for 1 system, breaks at 50.

Reliability

AI can't be trusted in production without your definitions, and those definitions are invisible to it.

→ "Critical" means SLA < 99.9% to you. AI doesn't know.

Root cause: No standard tells AI how your systems work.

Context that makes AI work

Contexture provides operational context to AI agents. MCP provides transport. Together: AI agents that actually work.

deploy.sh
# Deploy Contexture
docker run -d \
  -e SOURCES="http://prometheus:9090,postgres://db:5432/orders" \
  -p 8080:8080 \
  sodafoundation/contexture

# Auto-detects source types
# Extracts context automatically
agent.py
import requests

# Get a ready-to-use context prompt
response = requests.get(
    "http://contexture:8080/prompt/prometheus"
)
context_prompt = response.text

# Append to your agent's system prompt
agent.system_prompt += context_prompt

# That's it. Your agent now understands
# your Prometheus instance.
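The deploy step above configures two sources. Assuming the `/prompt/<source>` endpoint pattern generalizes to each configured source (an assumption; only `/prompt/prometheus` is shown on this page), a sketch of wiring every source into one system prompt:

```python
import requests

CONTEXTURE_URL = "http://contexture:8080"


def prompt_url(source: str) -> str:
    # Assumed endpoint pattern, generalizing the /prompt/prometheus example
    return f"{CONTEXTURE_URL}/prompt/{source}"


def fetch_context(sources: list[str]) -> str:
    """Fetch one context prompt per source and join them for the agent."""
    prompts = []
    for source in sources:
        resp = requests.get(prompt_url(source))
        resp.raise_for_status()
        prompts.append(resp.text)
    return "\n\n".join(prompts)
```

The source names passed to `fetch_context` are whatever identifiers your Contexture deployment assigns to the URLs in `SOURCES`; the names here are illustrative.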

OCS: A standard format for
operational context

Define once. Query consistently. Four primitives that capture everything AI needs to understand your systems.

Entity

What exists in your system

Tables, pods, buckets, metrics, services

Relationship

How things connect

Foreign keys, pod→deployment, cross-system joins

Semantics

What things mean and how to query

"Use rate() for counters" · "Critical = SLA < 99.9%"

Policy

Constraints AI should know

PII fields · GDPR scope · Retention rules · Lineage
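The concrete OCS schema isn't reproduced on this page. Purely as an illustration, the four primitives for an orders database might be modeled like this (all field names are hypothetical):

```python
# Hypothetical illustration of the four OCS primitives.
# The actual OCS schema and field names are not shown on this page.
ocs_context = {
    "entities": [
        {"name": "orders", "kind": "table",
         "description": "One row per customer order"},
    ],
    "relationships": [
        {"from": "orders.customer_id", "to": "customers.id",
         "kind": "foreign_key"},
    ],
    "semantics": [
        {"term": "critical", "meaning": "SLA < 99.9%"},
        {"rule": "Use rate() when querying Prometheus counters"},
    ],
    "policies": [
        {"field": "customers.email", "constraint": "PII: do not expose"},
    ],
}
```

The point of the structure: an agent reading this knows the table is `orders` (not `order`), how to join it, what "critical" means in your organization, and which fields it must not surface.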

Works with your stack

Native adapters for the operational systems you already use. Auto-extract what they can. Add your knowledge on top.

Observability

  • Prometheus
  • Grafana
  • Datadog
  • OpenTelemetry

Infrastructure

  • Kubernetes
  • Docker
  • Terraform
  • AWS / GCP / Azure

Databases

  • PostgreSQL
  • MySQL
  • MongoDB
  • InfluxDB

Storage

  • S3 / GCS
  • MinIO
  • NetApp
  • Pure Storage

Enterprise

  • SAP
  • ServiceNow
  • Salesforce
  • Workday

Data Platforms

  • Snowflake
  • Databricks
  • BigQuery
  • Airflow
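"Auto-extract what they can, add your knowledge on top" suggests layering hand-written context over what an adapter discovers. A minimal sketch of that merge, assuming context is grouped by primitive as in the section above (the function and field names are illustrative, not the Contexture API):

```python
# Sketch: layering organizational knowledge over auto-extracted context.
def merge_context(auto_extracted: dict, overrides: dict) -> dict:
    """Append hand-written items to each primitive's auto-extracted list."""
    merged = dict(auto_extracted)
    for key, items in overrides.items():
        merged[key] = list(auto_extracted.get(key, [])) + list(items)
    return merged


# What an adapter might discover on its own...
auto = {"semantics": [{"rule": "Use rate() for counters"}]}
# ...plus a definition only your team knows.
ours = {"semantics": [{"term": "critical", "meaning": "SLA < 99.9%"}]}

combined = merge_context(auto, ours)
```

Appending rather than replacing keeps the adapter's discoveries intact while letting your definitions travel with them.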

Ready to give your agents context?

Join the community building the standard for operational context. Adapters, schemas, and production deployments welcome.