DeepInfra is a serverless inference-as-a-service platform that provides access to a variety of LLMs and embedding models. This notebook goes over how to use LangChain with DeepInfra for language model inference.

Set the Environment API Key

Make sure to get your API key from DeepInfra. You have to log in and get a new token. You are given 1 hour of free serverless GPU compute to test different models (see here). You can print your token with deepctl auth token.
# get a new token: https://deepinfra.com/login?from=%2Fdash

from getpass import getpass

DEEPINFRA_API_TOKEN = getpass()
 ········
import os

os.environ["DEEPINFRA_API_TOKEN"] = DEEPINFRA_API_TOKEN

Create the DeepInfra instance

You can also use our open-source deepctl tool to manage your model deployments. You can find a list of available parameters here.
from langchain_community.llms import DeepInfra

llm = DeepInfra(model_id="meta-llama/Llama-2-70b-chat-hf")
llm.model_kwargs = {
    "temperature": 0.7,
    "repetition_penalty": 1.2,
    "max_new_tokens": 250,
    "top_p": 0.9,
}
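The sampling parameters above shape decoding: temperature flattens or sharpens the token distribution, repetition_penalty discourages repeated tokens, max_new_tokens caps output length, and top_p restricts sampling to the most probable tokens. As an illustration only (not DeepInfra's actual implementation), here is a minimal pure-Python sketch of nucleus (top_p) filtering over a toy distribution:

```python
# Toy illustration of the top_p (nucleus) parameter: keep the smallest
# set of highest-probability tokens whose cumulative probability >= top_p.
def nucleus(probs, top_p):
    """probs: dict of token -> probability. Returns the allowed token set."""
    kept, total = [], 0.0
    for tok, p in sorted(probs.items(), key=lambda kv: -kv[1]):
        kept.append(tok)
        total += p
        if total >= top_p:
            break
    return set(kept)

dist = {"dog": 0.5, "cat": 0.3, "fish": 0.15, "axolotl": 0.05}
print(nucleus(dist, 0.9))  # {'dog', 'cat', 'fish'}
```

With top_p=0.9, the lowest-probability token is excluded from sampling; lowering top_p narrows the candidate set further.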
# run inference directly via the wrapper
llm.invoke("Who let the dogs out?")
'This is a question that has puzzled many people'
# run streaming inference
for chunk in llm.stream("Who let the dogs out?"):
    print(chunk)
 Will
 Smith
.
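llm.stream yields the completion incrementally as chunks rather than waiting for the full response. A stdlib sketch of consuming such a generator (fake_stream is a hypothetical stand-in replaying the chunks shown above):

```python
def fake_stream():
    # Stand-in for llm.stream(...): yields the completion piece by piece.
    yield " Will"
    yield " Smith"
    yield "."

# Collect chunks as they arrive, then reassemble the full completion.
chunks = []
for chunk in fake_stream():
    chunks.append(chunk)
print("".join(chunks))  # " Will Smith."
```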

Create a Prompt Template

We will create a prompt template for question and answer.
from langchain_core.prompts import PromptTemplate

template = """Question: {question}

Answer: Let's think step by step."""

prompt = PromptTemplate.from_template(template)
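By default, PromptTemplate.from_template uses Python {name}-style placeholders, so the rendered prompt can be previewed with plain str.format, even without LangChain installed:

```python
# Preview what the model will receive once {question} is filled in.
template = """Question: {question}

Answer: Let's think step by step."""

rendered = template.format(question="Can penguins reach the North pole?")
print(rendered)
```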

Initiate the LLMChain

from langchain_classic.chains import LLMChain

llm_chain = LLMChain(prompt=prompt, llm=llm)
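Conceptually, the chain fills in the prompt template and forwards the rendered text to the model. A stdlib sketch of that flow, where stub_llm is a hypothetical stand-in for the DeepInfra call:

```python
# The chain's core behavior: fill the template, then pass the rendered
# prompt to the model.
template = "Question: {question}\n\nAnswer: Let's think step by step."

def stub_llm(prompt_text):
    # A real chain would send prompt_text to the model and return its reply.
    return f"(completion for a {len(prompt_text)}-character prompt)"

def run_chain(question):
    return stub_llm(template.format(question=question))

print(run_chain("Can penguins reach the North pole?"))
```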

Run the LLMChain

Provide a question and run the LLMChain.
question = "Can penguins reach the North pole?"

llm_chain.run(question)
"Penguins are found in Antarctica and the surrounding islands, which are located at the southernmost tip of the planet. The North Pole is located at the northernmost tip of the planet, and it would be a long journey for penguins to get there. In fact, penguins don't have the ability to fly or migrate over such long distances. So, no, penguins cannot reach the North Pole. "