Your agents. Your infrastructure.
Your data.

Deployment Architecture

Runs Where Your Data Lives

Deploy the full Introspection stack inside your VPC with Terraform and Helm. Customer-managed encryption keys, private networking, and zero-secrets architecture — Workload Identity, IRSA, or Managed Identity instead of passwords.
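As a sketch of what this looks like in practice, the Helm values below are illustrative only: every key name is a hypothetical placeholder, not the actual Introspection chart schema.

```yaml
# Illustrative values.yaml for a VPC deployment. All key names here are
# hypothetical stand-ins — consult the real chart for the actual schema.
network:
  privateEndpointsOnly: true      # no public endpoints on databases or caches
encryption:
  provider: aws-kms               # or gcp-cloud-kms / azure-key-vault
  keyId: "EXAMPLE-CMEK-KEY-ID"    # customer-managed key (placeholder value)
auth:
  workloadIdentity: true          # IRSA / Workload Identity / Managed Identity
  staticSecrets: false            # zero-secrets: no passwords in the cluster
```

A file like this would be applied with a standard `helm upgrade --install -f values.yaml` against a cluster that Terraform has already provisioned in your account.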

Your VPC / Private Cloud
  • Data Plane API: task orchestration, trace ingestion, agent management
  • Agent Sandboxes: ephemeral containers, egress control, domain whitelisting
  • LLM Gateway: bring your own keys, per-org budget caps, usage tracking
  • Storage: ClickHouse (BYO or managed), PostgreSQL (CMEK), Redis/Valkey (TLS)

Connection to the control plane: TLS encrypted, outbound only, no data egress

Introspection Cloud
  • Dashboard & Analytics
  • Auth & Access Control
  • Deployment Management
  • Enterprise Features

Built for Enterprise Requirements

Every feature designed for teams that need security, compliance, and operational excellence when running AI agents at scale.

Data Sovereignty

All trace data, agent telemetry, and conversation logs stay within your cloud account. Private networking on all databases and caches — no public endpoints, ever.

Bring Your Own Keys

Use your own API keys for Anthropic, OpenAI, Gemini, Azure OpenAI, Bedrock, and Vertex AI. Traffic routes through your accounts — full cost control and compliance.
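A hedged sketch of what per-provider key routing could look like in a gateway config (field names are invented for illustration; secrets stay referenced from your own secret store rather than inlined):

```yaml
# Hypothetical LLM gateway configuration — field names are illustrative.
llmGateway:
  providers:
    anthropic:
      apiKeySecretRef: anthropic-api-key   # reference into your secret manager
    bedrock:
      region: us-east-1                    # routed through your own AWS account
    azureOpenAI:
      endpoint: https://example.openai.azure.com
      apiKeySecretRef: azure-openai-key
```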

Bring Your Own ClickHouse

Point Introspection at your existing ClickHouse cluster, or let us deploy one in your VPC. All OpenTelemetry trace data stays where you need it.

Customer-Managed Encryption

Customer-managed encryption keys (CMEK) on AWS KMS, GCP Cloud KMS, and Azure Key Vault. Covers disks, databases, caches, object storage, and Kubernetes secrets.

Role-Based Access

Three-tier RBAC — Owner, Admin, Member — with 20+ fine-grained scopes. OIDC-based authentication with JWT-signed tokens between control and data planes.
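For illustration, membership under such a model might be declared like this (role and scope names are examples, not the product's actual scope list):

```yaml
# Hypothetical RBAC declaration — scope names are illustrative.
members:
  - email: owner@example.com
    role: owner              # full control of the organization
  - email: dev@example.com
    role: member
    scopes:                  # fine-grained scopes narrow what the role allows
      - traces:read
      - agents:run
```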

Multi-Cloud Deployment

Deploy on AWS, GCP, or Azure with the same Helm charts and identical security posture. Choose from 16+ regions with S/M/L sizing tiers.

Sandbox Isolation

Every agent task runs in an ephemeral container with egress control. Domain-whitelisted HTTP proxy, dedicated node pools, and automatic cleanup.
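An egress policy of this shape can be sketched as follows (key names are hypothetical; deny-by-default with an explicit domain allowlist is the pattern described above):

```yaml
# Illustrative sandbox egress policy — key names are placeholders.
sandbox:
  egress:
    defaultAction: deny           # everything is blocked unless whitelisted
    allowedDomains:
      - api.anthropic.com         # example entries; supply your own list
      - pypi.org
  cleanup:
    ttlSeconds: 3600              # containers are ephemeral and auto-removed
```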

Cost Analytics & Budgets

Track LLM token usage, compute-seconds, and cost per agent through the built-in LLM gateway. Set per-org spending caps with automatic enforcement.
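A per-org cap might be expressed like this (illustrative schema; the enforcement behavior follows the description above):

```yaml
# Hypothetical budget configuration — names and units are illustrative.
budgets:
  orgs:
    acme-research:
      monthlyCapUsd: 5000
      onExceed: block             # gateway rejects new LLM calls over the cap
      alertThresholds: [0.5, 0.9] # notify at 50% and 90% of the cap
```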

Deployment Options

Deploy Your Way

Choose the deployment model that matches your security posture. Migrate between options as your needs evolve — no vendor lock-in.

Managed Cloud
We handle everything
  • Zero infrastructure to provision or maintain
  • Automatic upgrades, patches, and scaling
  • Managed sandbox orchestration with warm pools
  • Built-in LLM gateway with cost tracking
  • Multi-region availability
Hybrid
Data plane in your VPC, managed control plane
  • Data Plane API, sandboxes, LLM gateway, and storage in your account
  • Introspection Cloud handles dashboard, auth, and deployment management
  • Outbound-only TLS connection — no data egress
Self-Hosted
Full ownership
Everything in Hybrid, plus:
  • Control plane in your account
  • Auth, SSO, and member management
  • Billing and usage tracking
  • Deployment provisioning and GitOps
  • Integration credential storage

Ready to start improving your AI systems?

Talk to our team about enterprise deployment options for your organization.