The following sections help you set up and use tracing, monitoring, and observability features:
LangSmith works with many frameworks and providers, including OpenAI, Anthropic, CrewAI, the Vercel AI SDK, and Pydantic AI. Browse the available integrations to connect your stack.

Set up tracing

Configure tracing with basic options, framework integrations, or advanced settings for full control.
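As a minimal sketch of the basic setup, tracing can typically be enabled through environment variables before running your application. The exact variable names below (`LANGSMITH_TRACING`, `LANGSMITH_API_KEY`, `LANGSMITH_PROJECT`) are the commonly documented ones; check the setup guide for the names supported by your SDK version.

```shell
# Enable tracing for the LangSmith SDK (assumed variable names; see the setup guide).
export LANGSMITH_TRACING=true

# Authenticate with your LangSmith API key (placeholder value).
export LANGSMITH_API_KEY="<your-api-key>"

# Optional: route traces to a named project instead of the default one.
export LANGSMITH_PROJECT="my-project"
```

With these variables set, instrumented code (for example, functions wrapped with the Python SDK's tracing decorator, or supported framework integrations) sends traces to the configured project without further code changes.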

View traces

Access and manage traces through the UI or API, with tools for filtering, exporting, sharing, and comparison.

Monitor performance

Create dashboards and set alerts to track performance and get notified when issues arise.

Configure automations

Use rules, webhooks, and online evaluations to streamline observability workflows.

Collect feedback

Gather and manage annotations on outputs using annotation queues and inline annotation.

Trace a RAG app

Follow a step-by-step tutorial to trace a Retrieval-Augmented Generation application from start to finish.
For terminology definitions and core concepts, refer to Observability concepts.
Use Polly, LangSmith’s AI assistant, to analyze traces and get AI-powered insights into your application’s performance.
To set up a LangSmith instance, visit the platform setup section to choose a cloud, hybrid, or self-hosted option. All options include observability, evaluation, prompt engineering, and deployment.