LLM Observability is currently in public beta, and its API may change.

With Datadog LLM Observability, you can monitor, troubleshoot, and evaluate your LLM-powered applications, such as chatbots. You can investigate the root cause of issues, monitor operational performance, and evaluate the quality, privacy, and safety of your LLM applications.

This is an experimental community implementation and is not officially supported by Datadog. It is built on the Datadog LLM Observability API.

Setup

See this section for general instructions on installing LangChain packages.
npm
npm install @langchain/community @langchain/core

Usage

import { OpenAI } from "@langchain/openai";
import { DatadogLLMObsTracer } from "@langchain/community/experimental/callbacks/handlers/datadog";

/**
 * This example demonstrates how to use the DatadogLLMObsTracer with the OpenAI model.
 * It will produce a "llm" span with the input and output of the model inside the meta field.
 *
 * To run this example, you need to have a valid Datadog API key and OpenAI API key.
 */
export const run = async () => {
  const model = new OpenAI({
    model: "gpt-4",
    temperature: 0.7,
    maxTokens: 1000,
    maxRetries: 5,
  });

  const res = await model.invoke(
    "Question: What would be a good company name for a company that makes colorful socks?\nAnswer:",
    {
      callbacks: [
        new DatadogLLMObsTracer({
          mlApp: "my-ml-app",
        }),
      ],
    }
  );

  console.log({ res });
};
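As the comment in the example notes, running it requires a valid Datadog API key and OpenAI API key. One common way to supply them is through environment variables set before running the script. The variable names below are assumptions for illustration: DD_API_KEY is Datadog's standard API key variable and OPENAI_API_KEY is the variable read by @langchain/openai, but confirm the exact names the tracer expects against the Datadog LLM Observability documentation.

```shell
# Assumed variable names — verify against the Datadog and LangChain docs.
export DD_API_KEY="<your-datadog-api-key>"     # Datadog API key
export OPENAI_API_KEY="<your-openai-api-key>"  # OpenAI API key
```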