
LangSmith Integration

Overview

Import traces and feedback from your LangSmith projects. Converra extracts the system prompt and a readable transcript so you can optimize based on real production behavior.

What Gets Imported

Converra imports multi-turn conversational traces from LangSmith:

  • Traces: Agent execution traces with 2+ user messages (deduped by trace ID)
  • Prompt: The first system prompt found in the trace (used to create/dedupe a Converra Prompt)
  • Transcript: A flattened, readable conversation built from LLM inputs/outputs
  • Root feedback: LangSmith feedback attached to the root trace (if present)
  • Basic LLM params: model + sampling params when available

Note

Tool calls are captured and displayed in conversation details. Full tool schema preservation for replay is on the roadmap.

What Gets Skipped

Converra automatically filters out traces that aren't conversational:

| Trace Type                     | Example Use Case                           | Action   |
|--------------------------------|--------------------------------------------|----------|
| No system prompt               | Embeddings, simple completions             | Skipped  |
| Single-turn (1 user message)   | Summarization, classification, extraction  | Skipped  |
| Multi-turn (2+ user messages)  | Chatbots, support agents, copilots         | Imported |
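
The rule above boils down to two checks on the flattened conversation. Here is a minimal sketch, assuming the trace has already been flattened into role-tagged messages; the should_import helper and field names are illustrative, not Converra's internal API:

```python
def should_import(messages: list[dict]) -> bool:
    """Apply the documented rule: a system prompt plus at least two user turns."""
    has_system_prompt = any(m["role"] == "system" for m in messages)
    user_turns = sum(1 for m in messages if m["role"] == "user")
    return has_system_prompt and user_turns >= 2

# A summarization call (one user message) is skipped; a support conversation is imported.
single_turn = [
    {"role": "system", "content": "Summarize the document."},
    {"role": "user", "content": "<document text>"},
]
multi_turn = single_turn + [
    {"role": "assistant", "content": "Here is a summary..."},
    {"role": "user", "content": "Can you make it shorter?"},
]
assert not should_import(single_turn)
assert should_import(multi_turn)
```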

Why? Converra optimizes conversational AI through multi-turn simulation testing. Single Q&A API calls (like "summarize this document") don't benefit from this approach because:

  • Our simulation has personas ask follow-up questions — irrelevant for single-turn tasks
  • Our evaluation scores conversation quality, not task accuracy
  • These traces create noise without providing optimization value

If you primarily have single-turn use cases, contact us — task-based evaluation is on our roadmap.

How It Works

  1. Connect - Paste your LangSmith API key
  2. Select project - Choose which LangSmith project to import from
  3. Enable sync - Turn on continuous sync or import manually
  4. Review prompts - Prompts are deduped by content hash and shown in Converra for optimization
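
For step 4, the content-hash dedupe can be pictured like this. It is a rough sketch only; the exact normalization Converra applies is an assumption here:

```python
import hashlib

def prompt_hash(system_prompt: str) -> str:
    """Hash a normalized system prompt so identical prompts map to one Converra Prompt."""
    normalized = " ".join(system_prompt.split())  # collapse whitespace (assumed normalization)
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

seen: dict[str, str] = {}  # content hash -> prompt text
for prompt in ["You are a helpful support agent.", "You are a  helpful support agent."]:
    seen.setdefault(prompt_hash(prompt), prompt)

print(len(seen))  # 1: whitespace variants collapse to the same prompt
```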

Continuous Sync

Once connected, Converra can automatically import new traces from your LangSmith project.

Configuration

| Setting        | Options                       | Default      |
|----------------|-------------------------------|--------------|
| Sync Frequency | Hourly, Every 6 hours, Daily  | Daily        |
| Preferred Time | Any hour (UTC)                | Midnight UTC |

How It Works

  • Converra tracks the last imported trace timestamp (watermark)
  • Each sync fetches only new traces since the last import
  • Duplicates are automatically skipped (by trace ID)
  • Insights are generated in the background after import
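
The watermark pattern above can be illustrated with the LangSmith Python SDK. Treat this as a sketch of the idea rather than Converra's implementation; the list_runs parameters (project_name, is_root, start_time) follow the SDK but may vary by version:

```python
from datetime import datetime, timezone
from langsmith import Client

client = Client(api_key="lsv2_...")  # your LangSmith API key

def sync_new_traces(project_name: str, watermark: datetime, seen_trace_ids: set) -> datetime:
    """Fetch only root runs newer than the watermark; skip trace IDs already imported."""
    newest = watermark
    for run in client.list_runs(project_name=project_name, is_root=True, start_time=watermark):
        if str(run.trace_id) in seen_trace_ids:
            continue  # duplicate trace, already imported
        seen_trace_ids.add(str(run.trace_id))
        # ... hand the run off to the importer here ...
        if run.start_time and run.start_time > newest:
            newest = run.start_time
    return newest  # store as the watermark for the next sync

# Usage on each scheduled sync:
# watermark = sync_new_traces("my-project", watermark, seen_trace_ids)
```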

Manual Import

You can also trigger imports manually at any time from the Integrations page. This is useful for:

  • Initial bulk import before enabling continuous sync
  • Importing after a sync was paused
  • Testing the connection

Compatibility

Converra auto-detects your tracing architecture and applies the right import strategy.

Supported Architectures

| Architecture                | Description                                                     | Status       |
|-----------------------------|-----------------------------------------------------------------|--------------|
| Simple chatbot              | Single agent with native messages                               | ✅ Supported |
| Multi-agent (per-trace)     | Multiple LLM runs in a single trace                             | ✅ Supported |
| Multi-agent (session-based) | Traces grouped by inputs.session_id                             | ✅ Supported |
| Embedded history            | Conversation in inputs.history or <conversation_history> tags   | ✅ Supported |
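
For the session-based architecture, grouping amounts to bucketing root runs by inputs.session_id. A simplified sketch with illustrative data shapes, not Converra's importer:

```python
from collections import defaultdict

def group_by_session(root_runs: list[dict]) -> dict[str, list[dict]]:
    """Bucket traces that share inputs.session_id into a single conversation."""
    sessions: dict[str, list[dict]] = defaultdict(list)
    for run in root_runs:
        session_id = (run.get("inputs") or {}).get("session_id")
        key = session_id or run["trace_id"]  # no session_id: the trace stands alone
        sessions[key].append(run)
    return dict(sessions)

runs = [
    {"trace_id": "t1", "inputs": {"session_id": "s-42"}},
    {"trace_id": "t2", "inputs": {"session_id": "s-42"}},
    {"trace_id": "t3", "inputs": {}},
]
print({k: len(v) for k, v in group_by_session(runs).items()})  # {'s-42': 2, 't3': 1}
```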

Supported LLM Formats

Converra parses these input/output formats automatically:

Inputs:

  • OpenAI: inputs.messages[] with role/content
  • Anthropic: inputs.system + inputs.messages
  • LangChain: inputs.human, inputs.input, inputs.question

Outputs:

  • OpenAI: outputs.choices[0].message.content
  • Anthropic: outputs.content[] text blocks
  • LangChain: outputs.output, outputs.text, outputs.generations[]
  • Structured JSON: html_response field extracted automatically
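
Conceptually, all of these output shapes normalize to plain assistant text. A simplified sketch of that normalization (not Converra's actual parser):

```python
def extract_output_text(outputs: dict):
    """Pull assistant text out of the common output shapes listed above."""
    # OpenAI-style: outputs.choices[0].message.content
    if "choices" in outputs:
        return outputs["choices"][0]["message"]["content"]
    # Anthropic-style: outputs.content[] text blocks
    if isinstance(outputs.get("content"), list):
        return "".join(b.get("text", "") for b in outputs["content"] if b.get("type") == "text")
    # LangChain-style fields and structured JSON
    for key in ("output", "text", "html_response"):
        if isinstance(outputs.get(key), str):
            return outputs[key]
    return None

print(extract_output_text({"choices": [{"message": {"content": "Hi there!"}}]}))  # Hi there!
print(extract_output_text({"content": [{"type": "text", "text": "Sure, I can help."}]}))
```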

Agent Role Detection

For multi-agent systems, Converra classifies agents by name:

  • Routers: Names containing "orchestrator", "router", "coordinator", "dispatcher"
  • Responders: All other agents (these are treated as conversational)
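
The name-based classification is simple enough to show directly; this mirrors the keyword list above (the classify_agent function itself is illustrative):

```python
ROUTER_KEYWORDS = ("orchestrator", "router", "coordinator", "dispatcher")

def classify_agent(name: str) -> str:
    """Treat router-like names as routers; everything else is a conversational responder."""
    return "router" if any(k in name.lower() for k in ROUTER_KEYWORDS) else "responder"

print(classify_agent("SupportOrchestrator"))  # router
print(classify_agent("RefundHandler"))        # responder
```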

Not Yet Supported

  • thread_id in metadata (use inputs.session_id instead)
  • Custom session field locations

Have a setup that's not working? Contact us and we'll help.

Multi-Agent Support

LangSmith projects often contain multi-agent traces (Router → SupportBot → RefundHandler).

Converra discovers all LLM runs in a trace and groups them into agent systems:

  • Agent Discovery - See which prompts appear together in traces
  • Path Visualization - View the actual execution paths (e.g., Router → Support → Escalation)
  • Per-Prompt Scores - Each prompt in the system gets its own performance metrics
  • Prompt Type Classification - Conversational prompts vs single-shot (classification, extraction, etc.)

Navigate to the Agents page to see discovered agent systems and their performance.
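
Agent discovery amounts to recording which agents appear together within a trace. A rough sketch with hypothetical data shapes, just to show the grouping idea:

```python
from collections import Counter

def discover_agent_systems(traces: list) -> Counter:
    """Count which sets of agent prompts co-occur across traces."""
    systems = Counter()
    for agent_names in traces:
        systems[tuple(sorted(set(agent_names)))] += 1
    return systems

traces = [
    ["Router", "SupportBot"],
    ["Router", "SupportBot", "RefundHandler"],
    ["Router", "SupportBot"],
]
print(discover_agent_systems(traces).most_common(1))  # [(('Router', 'SupportBot'), 2)]
```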

Polly (LangSmith Assistant)

Polly is great for debugging and prompt editing inside LangSmith. Converra is complementary: we turn traces into measurable, gated improvements you can ship with confidence.

Getting Started

  1. Request early access to enable the connector
  2. Go to Integrations in the Converra app
  3. Click Connect LangSmith
  4. Follow the setup wizard

API Key Types

LangSmith supports two types of API keys:

Personal API Key

The simplest option. Go to Settings → Personal → API Keys in LangSmith and create a new key. This works immediately without additional configuration.

Organization-Scoped API Key

If you need organization-level access control, you can use an org-scoped API key:

  1. Create an org-scoped API key in Settings → Organization → API Keys
  2. Paste the key in Converra

Converra automatically detects your workspace when using org-scoped keys. If you have multiple workspaces, you can manually specify which one to use in Advanced options by entering the Workspace ID (found in Settings → Workspaces).
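
Either key type can be sanity-checked with the LangSmith Python SDK before pasting it into Converra; for example, listing a few projects confirms the key is valid and points at the workspace you expect:

```python
from itertools import islice
from langsmith import Client

client = Client(api_key="lsv2_...")  # personal or org-scoped key

# If this prints project names, the key works and resolves to the expected workspace.
for project in islice(client.list_projects(), 5):
    print(project.name)
```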

FAQ

Do I need to change my LangSmith instrumentation?

No. If you already trace runs and collect feedback in LangSmith, Converra imports what you have. You can optionally add more detail for better replay fidelity.

How often does Converra sync?

You choose: hourly, every 6 hours, or daily (default). Syncs are incremental—only new traces since the last import are fetched. You can also trigger manual imports anytime.

What about Langfuse?

Langfuse is now supported! See the Langfuse Integration guide. The workflow is similar to LangSmith, with a few differences (Basic Auth, project-scoped API keys, region selection).