Important prerequisite
Installing AI Skills requires a working network proxy with TUN mode enabled; this directly determines whether the installation completes. See the full installation guide.
letta-api-client by letta-ai/skills

```shell
npx skills add https://github.com/letta-ai/skills --skill letta-api-client
```

Build applications on top of the Letta API, a model-agnostic, stateful API for building persistent agents with memory and long-term learning. The Letta API powers Letta Code and the Learning SDK. This skill covers the core patterns for creating agents, managing memory, building custom tools, and handling multi-user scenarios.
See getting-started.md for first-time setup and common onboarding issues.
Examples last tested with:

- `letta-client==1.7.1`
- `@letta-ai/letta-client@1.7.1`

See client-setup.md for initialization patterns:
See memory-architecture.md for memory patterns:

See custom-tools.md for tool creation:

See client-side-tools.md for local tool execution:

- Approval flow for `type: "tool"` responses

See client-injection.md for server-side tool patterns:

- The `client` variable on Letta Cloud
- `os.getenv()` for agent secrets
- `LETTA_AGENT_ID` for self-referential tools

See multi-user.md for scaling:

See streaming.md for real-time responses:

- `include_pings` for long-running operations

Conversations support parallel sessions with shared memory:

See sleeptime.md for background memory processing:

- Enable with `enable_sleeptime=True`

See agent-files.md for portability and file access:

- Export/import agents as `.af` files

See tool-rules.md for constraining tool execution:

- `InitToolRule` - force a tool to run first
- `ChildToolRule` - control which tools can follow
- `TerminalToolRule` - end the agent turn after the tool

```shell
pip install letta-client
```
```python
from letta_client import Letta

# Cloud
client = Letta(api_key="LETTA_API_KEY")

# Self-hosted
client = Letta(base_url="http://localhost:8283")
```
```shell
npm install @letta-ai/letta-client
```
```typescript
import { Letta } from "@letta-ai/letta-client";

// Cloud
const client = new Letta({ apiKey: process.env.LETTA_API_KEY });

// Self-hosted
const client = new Letta({ baseUrl: "http://localhost:8283" });
```
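The snippets above hard-code the choice between Cloud and self-hosted. A small helper can make that choice from the environment instead; note that `LETTA_BASE_URL` is a variable name assumed for this sketch, not something the SDK reads on its own.

```python
import os

def letta_client_kwargs() -> dict:
    """Pick Letta() constructor kwargs from the environment.

    LETTA_BASE_URL is a hypothetical variable used only in this sketch.
    """
    base_url = os.getenv("LETTA_BASE_URL")
    if base_url:
        # Self-hosted server: point the client at the local endpoint
        return {"base_url": base_url}
    # Letta Cloud: authenticate with an API key
    return {"api_key": os.environ["LETTA_API_KEY"]}

# Usage (with letta-client installed):
# client = Letta(**letta_client_kwargs())
```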
See the examples/ directory for runnable code:

Python:

- 01_basic_client.py - client initialization
- 02_create_agent.py - agent creation with memory blocks
- 03_custom_tool_simple.py - basic custom tool
- 04_custom_tool_secrets.py - tool with environment variables
- 05_send_message.py - basic messaging
- 06_send_message_stream.py - streaming responses
- 07_multi_user.py - multi-user patterns
- 08_archival_memory.py - archival memory operations
- 09_shared_blocks.py - multi-agent shared memory
- 10_conversations.py - parallel sessions with conversations
- 11_client_injection.py - custom memory tools with an injected client
- 12_tool_rules.py - constraining tool execution order
- 13_client_side_tools.py - execute tools locally (like Letta Code)

TypeScript:

- 01_basic_client.ts - client initialization
- 02_create_agent.ts - agent creation
- 03_send_message.ts - basic messaging
- 04_send_message_stream.ts - streaming
- 05_nextjs_singleton.ts - Next.js pattern
- 06_multi_user.ts - multi-user patterns
- 07_conversations.ts - parallel sessions
- 08_custom_tool.ts - custom tools with secrets
- 09_archival_memory.ts - long-term storage
- 10_shared_blocks.ts - multi-agent shared memory
- 11_client_injection.ts - custom memory tools
- 12_tool_rules.ts - tool execution order
- 13_client_side_tools.ts - execute tools locally (like Letta Code)

| Error | Cause | Fix |
|---|---|---|
| 401 Unauthorized | Invalid or missing API key | Check the LETTA_API_KEY env var |
| 422 Validation Error | Missing required field | Add model, embedding, or memory_blocks |
| Tool not found | Tool not attached to agent | client.agents.tools.attach(agent_id, tool_id) |
| os.getenv() returns None | Secret not configured | Add it to the agent via the secrets parameter |
| 524 Timeout | Long operation without pings | Add include_pings=True to streaming |
| Agent not responding | Model issue or empty response | Check for the assistant_message type in the response |
| Memory block not updating | Looking at the wrong agent | Verify the agent_id matches |
| Import error in tool | Top-level import | Move imports inside the function body |
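The last two rows (secrets via the `secrets` parameter, imports inside the function body) combine in a typical custom tool. Everything here is a hypothetical sketch: `fetch_weather` and `WEATHER_API_KEY` are illustrative names, and the lookup is stubbed rather than calling a real API.

```python
def fetch_weather(city: str) -> str:
    """Return a JSON string with (stubbed) weather for a city."""
    # Imports live inside the function body: tool source runs in a
    # sandbox where top-level imports cause "Import error in tool".
    import json
    import os

    # Secrets come from the environment (set via the agent's `secrets`
    # parameter), never from function arguments visible to the model.
    api_key = os.getenv("WEATHER_API_KEY")
    if api_key is None:
        return json.dumps({"error": "WEATHER_API_KEY not configured"})
    # A real tool would call a weather API here; this returns a stub.
    return json.dumps({"city": city, "temp_c": 21})
```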
- `os.getenv()` for secrets - don't pass sensitive data as function arguments
- Use the pre-injected `client` - don't instantiate `Letta()` inside tools
- `include_pings=True` for long operations - prevents timeouts on Cloud
- `.update()` not `.modify()` - the method was renamed
- `LETTA_AGENT_ID` is always available - use it in tools to reference the current agent
- `include_base_tools=True` - base tools are not attached by default
- `memory_insert` for shared blocks - safest for concurrent writes (append-only)

```typescript
// Client initialization uses baseURL (not baseUrl)
const client = new Letta({ apiKey: "...", baseURL: "http://localhost:8283" });

// Block API: positional args changed
client.agents.blocks.attach(blockId, { agent_id }); // blockId is first
client.agents.blocks.retrieve(blockLabel, { agent_id }); // label is first

// Passages.create returns an array
const passages = await client.agents.passages.create(agentId, { text: "..." });
const passage = passages[0];

// Content can be string | array - use a type guard
const content = typeof msg.content === "string" ? msg.content : JSON.stringify(msg.content);

// Conversations API returns streams by default
const stream = await client.conversations.messages.create(convId, { messages: [...] });
for await (const chunk of stream) { ... }

// Tool rule types
{ type: "run_first", tool_name: "..." } // InitToolRule
{ type: "constrain_child_tools", tool_name: "...", children: [...] } // ChildToolRule
{ type: "exit_loop", tool_name: "..." } // TerminalToolRule
```
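The rule shapes above carry over to the Python SDK as plain dicts in a `tool_rules` list. This is a sketch: the tool names (`load_profile`, `search_docs`, `send_reply`) are hypothetical, and the dict form mirrors the TypeScript shapes rather than the SDK's typed rule classes.

```python
# Hypothetical tool names; each dict mirrors one rule type listed above.
tool_rules = [
    {"type": "run_first", "tool_name": "load_profile"},    # InitToolRule
    {"type": "constrain_child_tools",                      # ChildToolRule
     "tool_name": "load_profile",
     "children": ["search_docs", "send_reply"]},
    {"type": "exit_loop", "tool_name": "send_reply"},      # TerminalToolRule
]

# Usage (with letta-client installed):
# agent = client.agents.create(..., tool_rules=tool_rules)
```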
```python
# Client
client = Letta(api_key=os.getenv("LETTA_API_KEY"))

# Create agent
agent = client.agents.create(
    model="anthropic/claude-sonnet-4-5-20250929",
    embedding="openai/text-embedding-3-small",
    memory_blocks=[{"label": "persona", "value": "..."}],
    include_base_tools=True,  # archival memory tools
    enable_sleeptime=True,  # background memory processing
)

# Send message
response = client.agents.messages.create(
    agent_id=agent.id,
    messages=[{"role": "user", "content": "Hello"}]
)

# Stream response
stream = client.agents.messages.stream(
    agent_id=agent.id,
    messages=[{"role": "user", "content": "Hello"}],
    stream_tokens=True,
    include_pings=True,  # prevent timeout
)

# Create tool
tool = client.tools.create(source_code="def my_tool(x: str) -> str: ...")
client.agents.tools.attach(agent_id=agent.id, tool_id=tool.id)

# Memory blocks
client.agents.blocks.retrieve(agent_id=agent.id, block_label="persona")
client.agents.blocks.update(agent_id=agent.id, block_label="persona", value="...")

# Folders
folder = client.folders.create(name="docs")
client.folders.files.upload(file=f, folder_id=folder.id)
client.agents.folders.attach(agent_id=agent.id, folder_id=folder.id)

# Conversations (parallel sessions)
conv = client.conversations.create(agent_id=agent.id)
stream = client.conversations.messages.create(conv.id, messages=[...])

# Agent secrets (for tools)
client.agents.update(agent_id=agent.id, secrets={"API_KEY": "..."})
```
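When consuming a stream like the one above, each chunk carries a message type, and the "Agent not responding" fix amounts to filtering for `assistant_message` chunks. The helper below runs offline against a stubbed stream; the chunk attribute names (`message_type`, `content`) are assumptions about the stream shape, and real chunks come from `client.agents.messages.stream()`.

```python
from types import SimpleNamespace

def collect_assistant_text(stream):
    """Join the content of assistant_message chunks from a message stream.

    Works on any iterable of objects with message_type/content attributes;
    real streams also yield other chunk types (reasoning, tool calls, pings),
    which are skipped here.
    """
    parts = []
    for chunk in stream:
        if getattr(chunk, "message_type", None) == "assistant_message":
            parts.append(chunk.content)
    return "".join(parts)

# Stubbed stream standing in for a real streaming response:
fake_stream = [
    SimpleNamespace(message_type="reasoning_message", content="thinking..."),
    SimpleNamespace(message_type="assistant_message", content="Hello"),
    SimpleNamespace(message_type="assistant_message", content=", world"),
]
print(collect_assistant_text(fake_stream))  # Hello, world
```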
Platform:

Documentation:

SDKs:

- `pip install letta-client`
- `npm install @letta-ai/letta-client`

Examples:
Weekly Installs: 65
GitHub Stars: 78
First Seen: Feb 4, 2026
Security Audits: Gen Agent Trust Hub: Warn, Socket: Pass, Snyk: Warn
Installed on: codex (61), opencode (61), gemini-cli (59), github-copilot (58), kimi-cli (57), amp (57)