The LangSmith MCP Server is a Model Context Protocol (MCP) server that integrates with LangSmith. It lets MCP-compatible clients (for example, AI coding assistants) read conversation history, prompts, runs and traces, datasets, experiments, and billing usage from your LangSmith workspace.

Example use cases

  • Conversation history: “Fetch the history of my conversation from thread ‘thread-123’ in project ‘my-chatbot’”
  • Prompt management: “Get all public prompts” or “Pull the template for the ‘legal-case-summarizer’ prompt”
  • Traces and runs: “Fetch the latest 10 root runs from project ‘alpha’” or “Get all runs for a trace by UUID”
  • Datasets: “List datasets of type chat” or “Read examples from dataset ‘customer-support-qa’”
  • Experiments: “List experiments for dataset ‘my-eval-set’ with latency and cost metrics”
  • Billing: “Get billing usage for September 2025”

Quickstart (hosted)

A hosted version of the LangSmith MCP Server is available over HTTP, so you can connect without running the server yourself.
  • URL: https://langsmith-mcp-server.onrender.com/mcp
  • Authentication: Send your LangSmith API key in the LANGSMITH-API-KEY header.
The hosted instance is for LangSmith Cloud. For a self-hosted LangSmith instance, run the server yourself and point it at your endpoint (see Docker deployment).
Example (Cursor mcp.json):
{
  "mcpServers": {
    "LangSmith MCP (Hosted)": {
      "url": "https://langsmith-mcp-server.onrender.com/mcp",
      "headers": {
        "LANGSMITH-API-KEY": "lsv2_pt_your_api_key_here"
      }
    }
  }
}
Optional headers: LANGSMITH-WORKSPACE-ID, LANGSMITH-ENDPOINT (same as in Environment variables).
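Beyond the client config above, any HTTP client can talk to the hosted server directly via MCP's streamable-HTTP transport. The following is a minimal sketch of the first message such a client would send, assuming the standard JSON-RPC 2.0 "initialize" handshake from the MCP specification (the API key and client name are placeholders; no request is actually sent here):

```python
import json

# Headers for the hosted server; the key value is a placeholder.
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
    "LANGSMITH-API-KEY": "lsv2_pt_your_api_key_here",
}

# JSON-RPC 2.0 "initialize" request, the first message an MCP client
# sends over the streamable-HTTP transport.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}

body = json.dumps(initialize)
print(body[:24])
```

In practice an MCP client library handles this handshake for you; the sketch only shows where the LANGSMITH-API-KEY header fits in.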

Available tools

Conversation and threads

  • get_thread_history: Get message history for a conversation thread. Uses character-based pagination: pass page_number (1-based) and use the returned total_pages to request more pages. Optional: max_chars_per_page, preview_chars.

Prompt management

  • list_prompts: List prompts with optional filtering by visibility (public/private) and limit.
  • get_prompt_by_name: Get a single prompt by exact name (details and template).
  • push_prompt: Documentation-only: how to create and push prompts to LangSmith.

Traces and runs

  • fetch_runs: Fetch runs (traces, tools, chains, etc.) from one or more projects. Supports filters (run_type, error, is_root), FQL (filter, trace_filter, tree_filter), and ordering. When trace_id is set, results are paginated by character budget; otherwise a single batch of up to limit runs is returned. Always pass limit and page_number.
  • list_projects: List projects with optional filtering by name, dataset, and detail level.
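As a concrete illustration, a client request for "the latest 10 root runs from project 'alpha'" might pass arguments like the following (the parameter names for limit, page_number, and is_root come from the description above; the project-filter key and project name are assumptions for illustration):

```python
# Arguments for a fetch_runs tool call: latest 10 root runs from one
# project. "project_names" is an assumed parameter name and "alpha" a
# placeholder project.
fetch_runs_args = {
    "project_names": ["alpha"],  # which project(s) to read from
    "is_root": True,             # only root runs (one per trace)
    "limit": 10,
    "page_number": 1,            # always pass limit and page_number
}
print(sorted(fetch_runs_args))
```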

Datasets and examples

  • list_datasets: List datasets with filtering by ID, type, name, or metadata.
  • list_examples: List examples from a dataset by dataset ID/name or example IDs; supports filter, metadata, splits, and optional as_of version.
  • read_dataset: Read one dataset by ID or name.
  • read_example: Read one example by ID, with optional as_of version.
  • create_dataset: Documentation-only: how to create datasets.
  • update_examples: Documentation-only: how to update dataset examples.

Experiments and evaluations

  • list_experiments: List experiment (reference) projects for a dataset. Requires reference_dataset_id or reference_dataset_name. Returns metrics (latency, cost, feedback).
  • run_experiment: Documentation-only: how to run experiments and evaluations.

Billing

  • get_billing_usage: Get organization billing usage (e.g. trace counts) for a date range. Optional workspace filter.

Pagination (character-based)

Tools that return large payloads use character-budget pagination so responses stay within a size limit:
  • Used by: get_thread_history and fetch_runs (when trace_id is set).
  • Parameters: Send page_number (1-based) on each request. Optional: max_chars_per_page (default 25000, max 30000) and preview_chars (truncates long strings, appending "… (+N chars)").
  • Response: Includes page_number, total_pages, and the page payload. Request more by calling again with page_number = 2, then 3, up to total_pages.
  • Benefits: Pages are built by character count, not item count; no cursor or server-side state—just page numbers.
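A client-side paging loop over these tools can be sketched as follows. Here call_tool is a stand-in for whatever your MCP client library uses to invoke a tool; it returns canned pages so the loop can run without a server, but the page_number/total_pages handling mirrors the scheme described above:

```python
def call_tool(name, arguments):
    # Stand-in for a real MCP tool invocation: returns canned pages so
    # the paging loop below is demonstrable without a live server.
    pages = {
        1: {"page_number": 1, "total_pages": 2, "messages": ["hi"]},
        2: {"page_number": 2, "total_pages": 2, "messages": ["bye"]},
    }
    return pages[arguments["page_number"]]

def fetch_all_pages(tool, base_args):
    """Request page 1, then keep requesting until total_pages is reached."""
    page, results = 1, []
    while True:
        resp = call_tool(tool, {**base_args, "page_number": page})
        results.append(resp)
        if page >= resp["total_pages"]:
            return results
        page += 1

pages = fetch_all_pages(
    "get_thread_history",
    {"thread_id": "thread-123", "project_name": "my-chatbot"},
)
print(len(pages))  # 2 canned pages
```

Because there is no cursor or server-side state, a loop like this can restart at any page_number after a failure.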

Installation (run locally)

If you prefer to run the server locally (or use a self-hosted LangSmith endpoint), install it and configure your MCP client.

Prerequisites

  1. Install uv (Python package installer):
    curl -LsSf https://astral.sh/uv/install.sh | sh
    
  2. Install the package:
    uv run pip install --upgrade langsmith-mcp-server
    

MCP client configuration

Add the server to your MCP client config. For the command value, use the full path printed by which uvx. PyPI / uvx:
{
  "mcpServers": {
    "LangSmith API MCP Server": {
      "command": "/path/to/uvx",
      "args": ["langsmith-mcp-server"],
      "env": {
        "LANGSMITH_API_KEY": "your_langsmith_api_key",
        "LANGSMITH_WORKSPACE_ID": "your_workspace_id",
        "LANGSMITH_ENDPOINT": "https://api.smith.langchain.com"
      }
    }
  }
}
From source (clone langsmith-mcp-server first):
{
  "mcpServers": {
    "LangSmith API MCP Server": {
      "command": "/path/to/uv",
      "args": [
        "--directory",
        "/path/to/langsmith-mcp-server",
        "run",
        "langsmith_mcp_server/server.py"
      ],
      "env": {
        "LANGSMITH_API_KEY": "your_langsmith_api_key",
        "LANGSMITH_WORKSPACE_ID": "your_workspace_id",
        "LANGSMITH_ENDPOINT": "https://api.smith.langchain.com"
      }
    }
  }
}
Replace /path/to/uv, /path/to/uvx, and /path/to/langsmith-mcp-server with your actual paths.

Docker deployment (HTTP-streamable)

You can run the server as an HTTP service with Docker so clients connect via the HTTP-streamable protocol.
  1. Build and run:
    docker build -t langsmith-mcp-server .
    docker run -p 8000:8000 langsmith-mcp-server
    
    Use the langsmith-mcp-server repository for the Dockerfile and context.
  2. Connect your MCP client to http://localhost:8000/mcp with the LANGSMITH-API-KEY header (and optional LANGSMITH-WORKSPACE-ID, LANGSMITH-ENDPOINT).
  3. Health check (no auth):
    curl http://localhost:8000/health
    
For full Docker and HTTP-streamable details, see the LangSmith MCP Server repository.

Deployment overview

Use the hosted MCP server to connect to LangSmith Cloud (smith.langchain.com or eu.smith.langchain.com). To connect to a self-hosted LangSmith instance (or to Cloud from your own infrastructure), run the server locally and set LANGSMITH_ENDPOINT accordingly. For self-hosted deployments, you can also run the server via the Docker image inside your VPC.

Environment variables

  • LANGSMITH_API_KEY (required): Your LangSmith API key for authentication.
  • LANGSMITH_WORKSPACE_ID (optional): Workspace ID when your API key has access to multiple workspaces.
  • LANGSMITH_ENDPOINT (optional): API endpoint URL (for self-hosted or custom regions). Default: https://api.smith.langchain.com.
For the hosted server, use the same names as headers: LANGSMITH-API-KEY, LANGSMITH-WORKSPACE-ID, LANGSMITH-ENDPOINT.
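The header names are just the environment-variable names with underscores replaced by hyphens. A small helper (illustrative only, not part of the server) makes the mapping explicit:

```python
import os

def langsmith_headers(environ=os.environ):
    """Build hosted-server headers from LANGSMITH_* environment
    variables, swapping underscores for hyphens in the names."""
    names = ["LANGSMITH_API_KEY", "LANGSMITH_WORKSPACE_ID", "LANGSMITH_ENDPOINT"]
    return {
        name.replace("_", "-"): environ[name]
        for name in names
        if name in environ
    }

# Example with an explicit dict instead of the real environment.
headers = langsmith_headers({"LANGSMITH_API_KEY": "lsv2_pt_example"})
print(headers)  # {'LANGSMITH-API-KEY': 'lsv2_pt_example'}
```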

TypeScript implementation

A community-maintained TypeScript/Node.js port of the official Python server is available. To run it: LANGSMITH_API_KEY=your-key npx langsmith-mcp-server. Source and package: GitHub · npm. Maintained by amitrechavia.