```
INFO:langgraph_api.cli:

        Welcome to

╦  ┌─┐┌┐┌┌─┐╔═╗┬─┐┌─┐┌─┐┬ ┬
║  ├─┤││││ ┬║ ╦├┬┘├─┤├─┘├─┤
╩═╝┴ ┴┘└┘└─┘╚═╝┴└─┴ ┴┴  ┴ ┴

- 🚀 API: http://127.0.0.1:2024
- 🎨 Studio UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024
- 📚 API Docs: http://127.0.0.1:2024/docs

This in-memory server is designed for development and testing.
For production use, please use LangSmith Deployment.
```
The `langgraph dev` command starts the Agent Server in in-memory mode. This mode is suitable for development and testing. For production use, deploy an Agent Server with a persistent storage backend. For more information, see the platform setup overview.
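For reference, the server output shown above is produced by running the CLI from the project directory (a minimal sketch; it assumes the LangGraph CLI is installed and a `langgraph.json` exists in the current directory):

```shell
# Install the LangGraph CLI with the in-memory dev-server extras,
# then start the Agent Server in development mode.
pip install -U "langgraph-cli[inmem]"
langgraph dev
```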
```python
from langgraph_sdk import get_client
import asyncio

client = get_client(url="http://localhost:2024")

async def main():
    async for chunk in client.runs.stream(
        None,  # Threadless run
        "agent",  # Name of assistant. Defined in langgraph.json.
        input={
            "messages": [{
                "role": "human",
                "content": "What is LangGraph?",
            }],
        },
    ):
        print(f"Receiving new event of type: {chunk.event}...")
        print(chunk.data)
        print("\n\n")

asyncio.run(main())
```
Install the LangGraph Python SDK:
```shell
pip install langgraph-sdk
```
Send a message to the assistant (a threadless run):
```python
from langgraph_sdk import get_sync_client

client = get_sync_client(url="http://localhost:2024")

for chunk in client.runs.stream(
    None,  # Threadless run
    "agent",  # Name of assistant. Defined in langgraph.json.
    input={
        "messages": [{
            "role": "human",
            "content": "What is LangGraph?",
        }],
    },
    stream_mode="messages-tuple",
):
    print(f"Receiving new event of type: {chunk.event}...")
    print(chunk.data)
    print("\n\n")
```
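With `stream_mode="messages-tuple"`, each `messages` event delivers its data as a pair of a message chunk and its metadata. A minimal sketch of pulling just the text out of such a payload follows; the `extract_text` helper and the sample payload are illustrative assumptions, not part of the SDK:

```python
# Hypothetical helper: assumes a messages-tuple payload is a
# [message_chunk, metadata] pair where the chunk carries a
# "content" field with the streamed token text.
def extract_text(data):
    message_chunk, metadata = data
    return message_chunk.get("content", "")

# Illustrative sample shaped like one streamed token event.
sample = [
    {"type": "AIMessageChunk", "content": "LangGraph is"},
    {"langgraph_node": "agent"},
]

print(extract_text(sample))  # prints "LangGraph is"
```

In a real run you would call `extract_text(chunk.data)` inside the streaming loop above, keeping only the token text and discarding the per-node metadata.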