This guide will help you get started with DigitalOcean Gradient chat models.

Overview

Integration details

Class: ChatGradient
Package: langchain-gradient (PyPI downloads and version badges)

Setup

langchain-gradient uses the DigitalOcean Gradient platform. Create an account on DigitalOcean, get a DIGITALOCEAN_INFERENCE_KEY API key from the Gradient platform, and install the langchain-gradient integration package.

Credentials

Head to DigitalOcean Gradient:
  1. Sign up for or log in to the DigitalOcean Cloud Console.
  2. Go to the Gradient Platform and navigate to Serverless Inference.
  3. Click "Create model access key", enter a name, and create the key.
Once the key is created, set the DIGITALOCEAN_INFERENCE_KEY environment variable:
import os
os.environ["DIGITALOCEAN_INFERENCE_KEY"] = "your-api-key"
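If you prefer not to hard-code the key in source, a common pattern (a sketch, assuming interactive use) is to prompt for it with the standard-library getpass module:

```python
import getpass
import os

# Prompt for the key only when it is not already set in the environment,
# so the value never appears in the script itself.
if not os.environ.get("DIGITALOCEAN_INFERENCE_KEY"):
    os.environ["DIGITALOCEAN_INFERENCE_KEY"] = getpass.getpass(
        "Enter your DigitalOcean Gradient API key: "
    )
```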

Installation

The LangChain Gradient integration lives in the langchain-gradient package:
pip install -qU langchain-gradient

Instantiation

from langchain_gradient import ChatGradient

llm = ChatGradient(
    model="llama3.3-70b-instruct",
    api_key=os.environ.get("DIGITALOCEAN_INFERENCE_KEY")
)

Invocation

messages = [
    (
        "system",
        "You are a creative storyteller. Continue any story prompt you receive in an engaging and imaginative way.",
    ),
    ("human", "Once upon a time, in a village at the edge of a mysterious forest, a young girl named Mira found a glowing stone..."),
]
ai_msg = llm.invoke(messages)
print(ai_msg.content)

Chaining

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate(
    [
        (
            "system",
            "You are a knowledgeable assistant. Carefully read the provided context and answer the user's question. If the answer is present in the context, cite the relevant sentence. If not, reply with \"Not found in context.\"",
        ),
        ("human", "Context: {context}\nQuestion: {question}"),
    ]
)

chain = prompt | llm
chain.invoke(
    {
        "context": (
            "The Eiffel Tower is located in Paris and was completed in 1889. "
            "It was designed by Gustave Eiffel's engineering company. "
            "The tower is one of the most recognizable structures in the world. "
            "The Statue of Liberty was a gift from France to the United States."
        ),
        "question": "Who designed the Eiffel Tower and when was it completed?"
    }
)