npx skills add https://github.com/akillness/oh-my-gods --skill opencontext
Give your AI assistant persistent memory. Stop repeating explanations and build a smarter assistant.
When working with an AI assistant, context gets lost (across sessions, repos, and dates). You end up repeating background, re-explaining decisions, and sometimes the assistant continues with incorrect assumptions.
OpenContext is a lightweight personal context/knowledge store for AI assistants.
[Load context] → [Do work] → [Store conclusions]
| Item | Path |
|---|---|
| Contexts | ~/.opencontext/contexts |
| Database | ~/.opencontext/opencontext.db |
npm install -g @aicontextlab/cli
# Or use npx
npx @aicontextlab/cli <command>
cd your-project
oc init
What `oc init` does:
| Command | Purpose |
|---|---|
| /opencontext-help | When you don't know where to start |
| /opencontext-context | (Recommended default) Load background before work |
| /opencontext-search | Search existing documents |
| /opencontext-create | Create a new document/idea |
| /opencontext-iterate | Store conclusions and citations |
# Slash Commands
Cursor: ~/.cursor/commands
Claude Code: ~/.claude/commands
# Skills
Cursor: ~/.cursor/skills/opencontext-*/SKILL.md
Claude Code: ~/.claude/skills/opencontext-*/SKILL.md
Codex: ~/.codex/skills/opencontext-*/SKILL.md
# MCP Config
Cursor: ~/.cursor/mcp.json
Claude Code: ~/.claude/mcp.json
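Both Cursor and Claude Code use the same `mcpServers` shape in their `mcp.json`. As a sketch only, an entry might look like the following — the `mcp` subcommand in `args` is an assumption for illustration, not a documented flag of `@aicontextlab/cli`; check the package's own MCP instructions for the real command:

```json
{
  "mcpServers": {
    "opencontext": {
      "command": "npx",
      "args": ["@aicontextlab/cli", "mcp"]
    }
  }
}
```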
# List folders
oc folder ls --all
# Create folder
oc folder create project-a -d "My project"
# Create document
oc doc create project-a design.md -d "Design doc"
# List documents
oc doc ls project-a
# Search (keyword/hybrid/vector)
oc search "your query" --mode keyword --format json
# Generate a manifest (list of files the AI should read)
oc context manifest project-a --limit 10
| Mode | Description | Requirements |
|---|---|---|
| --mode keyword | Keyword-based search | No embeddings required |
| --mode vector | Vector search | Embeddings + index required |
| --mode hybrid | Hybrid (default) | Embeddings + index required |
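Because vector and hybrid modes only work once embeddings are configured and indexed, a small wrapper can fall back to keyword search. This is an illustrative sketch: the `pick_mode` helper and the `EMBEDDING_API_KEY` environment check are assumptions for demonstration — OpenContext itself stores the key via `oc config set`, not necessarily in the environment.

```shell
#!/bin/sh
# Illustrative helper (assumption: key mirrored into the environment;
# OpenContext actually stores it via `oc config set EMBEDDING_API_KEY ...`).
pick_mode() {
  if [ -n "${EMBEDDING_API_KEY:-}" ]; then
    echo hybrid    # embeddings available: use the default hybrid mode
  else
    echo keyword   # no embeddings: keyword mode needs no index
  fi
}

# Only run the search if the oc CLI is actually installed.
if command -v oc >/dev/null 2>&1; then
  oc search "your query" --mode "$(pick_mode)"
fi
```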
# Set API key
oc config set EMBEDDING_API_KEY "<<your_key>>"
# (Optional) Set base URL
oc config set EMBEDDING_API_BASE "https://api.openai.com/v1"
# (Optional) Set model
oc config set EMBEDDING_MODEL "text-embedding-3-small"
# Build index
oc index build
LM Studio exposes an OpenAI-compatible /v1/embeddings endpoint, so OpenContext can use it as a local embedding provider.
Step 1 — Load an embedding model in LM Studio
In the LM Studio app, go to Discover and download an embedding model:
| Model | Size | Notes |
|---|---|---|
| nomic-ai/nomic-embed-text-v1.5-GGUF | ~90 MB | General purpose, recommended |
| CompendiumLabs/bge-large-en-v1.5-gguf | ~330 MB | Higher accuracy |
| second-state/All-MiniLM-L6-v2-Embedding-GGUF | ~23 MB | Lightweight |
Step 2 — Start the LM Studio server
lms server start
# Verify the embeddings endpoint is available
curl http://localhost:1234/v1/models
Step 3 — Configure OpenContext
# Point to LM Studio's local server
oc config set EMBEDDING_API_BASE "http://localhost:1234/v1"
# LM Studio requires any non-empty value for the key field
oc config set EMBEDDING_API_KEY "lm-studio"
# Set the embedding model ID (must match what is loaded in LM Studio)
oc config set EMBEDDING_MODEL "nomic-ai/nomic-embed-text-v1.5-GGUF"
# Build the index
oc index build
Verify
# Quick curl test
curl -s http://localhost:1234/v1/embeddings \
-H "Content-Type: application/json" \
-d '{"model":"nomic-ai/nomic-embed-text-v1.5-GGUF","input":"hello"}' \
| jq '.data[0].embedding | length'
# → should print a non-zero dimension (e.g. 768)
# Then test OpenContext search
oc search "your query" --mode vector
Note: LM Studio must be running whenever you run `oc index build` or search with `--mode vector`/`--mode hybrid`. `--mode keyword` works without a running server.
oc_list_folders # List folders
oc_list_docs # List documents
oc_manifest # Generate manifest
oc_search # Search documents
oc_create_doc # Create document
oc_get_link # Generate stable link
# Gemini: large-scale analysis
ask-gemini "Analyze the structure of the entire codebase"
# Codex: run commands
shell "docker-compose up -d"
# OpenContext: store results
oc doc create project-a conclusions.md -d "Analysis conclusions"
[Claude] Plan
↓
[Gemini] Analysis/research + OpenContext search
↓
[Claude] Write code
↓
[Codex] Run/test
↓
[Claude] Synthesize results + store in OpenContext
# 1. [Claude] Design API spec using the skill
/opencontext-context # Load project background
# 2. [Gemini] Analyze a large codebase
ask-gemini "@src/ Analyze existing API patterns"
# 3. [Claude] Implement code based on the analysis
# (Use context loaded from OpenContext)
# 4. [Codex] Test and build
shell "npm test && npm run build"
# 5. [Claude] Create final report + store conclusions
/opencontext-iterate # Store decisions and lessons learned
/opencontext-context
/opencontext-search
/opencontext-iterate
Keep links stable across renames/moves by referencing document IDs:
[label](oc://doc/<stable_id>)
oc doc link <doc_path>
oc_get_link doc_path="Product/api-spec"
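For instance, a note stored in OpenContext could cite another document like this (the `<stable_id>` is a placeholder, not a real ID):

```markdown
See the [API spec](oc://doc/<stable_id>) for the agreed request shapes;
the link keeps working even if the document is renamed or moved.
```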
Citation features:
| Action | How | Result |
|---|---|---|
| Cite text snippet | Select text → right-click → "Copy Citation" | Agent reads the snippet + source |
| Cite document | Click the citation icon next to the document title | Agent reads the full document + obtains stable_id |
| Cite folder | Right-click folder → "Copy Folder Citation" | Agent bulk-reads all docs in the folder |
oc ui
# Default URL: http://127.0.0.1:4321
Before: /opencontext-context (load background)
During: /opencontext-search (search)
After: /opencontext-iterate (store)
oc init # Initialize project
oc folder ls --all # List folders
oc doc ls <folder> # List documents
oc search "query" # Search
oc doc create ... # Create document
oc_list_folders list folders
oc_list_docs list documents
oc_search search
oc_manifest manifest
oc_create_doc create document
oc_get_link generate link
~/.opencontext/contexts context store
~/.opencontext/opencontext.db database