openmaic-classroom by aradotso/trending-skills
npx skills add https://github.com/aradotso/trending-skills --skill openmaic-classroom
Skill by ara.so — Daily 2026 Skills collection.
OpenMAIC (Open Multi-Agent Interactive Classroom) is a Next.js 16 / React 19 / TypeScript platform that converts any topic or document into a full interactive lesson. A multi-agent pipeline (LangGraph 1.1) generates slides, quizzes, HTML simulations, and project-based learning activities delivered by AI teachers and AI classmates with voice (TTS) and whiteboard support.
| Layer | Technology |
|---|---|
| Framework | Next.js 16 (App Router) |
| UI | React 19, Tailwind CSS 4 |
| Agent orchestration | LangGraph 1.1 |
| Language | TypeScript 5 |
| Package manager | pnpm >= 10 |
| Runtime | Node.js >= 20 |
git clone https://github.com/THU-MAIC/OpenMAIC.git
cd OpenMAIC
pnpm install
cp .env.example .env.local
Edit .env.local — at minimum one LLM provider key is required:
# LLM Providers (configure at least one)
OPENAI_API_KEY=$OPENAI_API_KEY
ANTHROPIC_API_KEY=$ANTHROPIC_API_KEY
GOOGLE_API_KEY=$GOOGLE_API_KEY
# Recommended default model (Gemini 3 Flash = best speed/quality balance)
DEFAULT_MODEL=google:gemini-3-flash-preview
# Optional: MinerU for advanced PDF/table/formula parsing
PDF_MINERU_BASE_URL=https://mineru.net
PDF_MINERU_API_KEY=$MINERU_API_KEY
# Optional: access code for hosted mode
ACCESS_CODE=$OPENMAIC_ACCESS_CODE
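The "at least one key" requirement can be enforced with a small startup guard. The check below is a sketch, not code from the OpenMAIC repo; only the env var names come from the template above:

```typescript
// Hypothetical startup guard: the env var names mirror .env.example above,
// but this function is an illustration, not part of the OpenMAIC codebase.
const PROVIDER_KEYS = ['OPENAI_API_KEY', 'ANTHROPIC_API_KEY', 'GOOGLE_API_KEY'] as const;

function hasProviderKey(env: Record<string, string | undefined>): boolean {
  // At least one provider key must be present and non-empty.
  return PROVIDER_KEYS.some((key) => (env[key] ?? '').length > 0);
}

// Usage sketch: if (!hasProviderKey(process.env)) fail fast with a clear message.
```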
Create server-providers.yml in the project root:
providers:
openai:
apiKey: $OPENAI_API_KEY
anthropic:
apiKey: $ANTHROPIC_API_KEY
google:
apiKey: $GOOGLE_API_KEY
deepseek:
apiKey: $DEEPSEEK_API_KEY
# Any OpenAI-compatible endpoint
custom:
baseURL: https://your-proxy.example.com/v1
apiKey: $CUSTOM_API_KEY
# Development
pnpm dev
# → http://localhost:3000
# Production build
pnpm build && pnpm start
# Type checking
pnpm tsc --noEmit
# Linting
pnpm lint
cp .env.example .env.local
# Edit .env.local with your API keys
docker compose up --build
# → http://localhost:3000
# Fork the repo, then import at https://vercel.com/new
# Set env vars in Vercel dashboard:
# OPENAI_API_KEY or ANTHROPIC_API_KEY or GOOGLE_API_KEY
# DEFAULT_MODEL (optional, e.g. google:gemini-3-flash-preview)
A one-click deploy button is available in the README; it pre-fills the environment variable descriptions automatically.
OpenMAIC uses a two-stage pipeline:
| Stage | Description |
|---|---|
| Outline | AI analyzes topic/document and produces a structured lesson outline |
| Scenes | Each outline item is expanded into a typed scene: slides, quiz, interactive, or pbl |
| Type | Description |
|---|---|
| slides | AI teacher lectures with TTS narration, spotlight, and laser pointer |
| quiz | Single/multiple choice or short-answer questions with AI grading |
| interactive | HTML-based simulation (physics, flowcharts, etc.) |
| pbl | Project-based learning: choose a role and collaborate with agents |
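As a sketch, the four scene types can be modeled as a discriminated union; the field names below are assumptions for illustration (the real definitions live in types/scene.ts), but the `type` tags match the table above:

```typescript
// Illustrative shapes only — not the actual OpenMAIC type definitions.
interface SlidesScene { type: 'slides'; title: string; narration: string[] }
interface QuizScene { type: 'quiz'; title: string; questions: string[] }
interface InteractiveScene { type: 'interactive'; title: string; html: string }
interface PblScene { type: 'pbl'; title: string; roles: string[] }

type Scene = SlidesScene | QuizScene | InteractiveScene | PblScene;

// Narrowing on the `type` tag lets a renderer dispatch per scene kind.
function sceneKind(scene: Scene): string {
  switch (scene.type) {
    case 'slides': return 'lecture';
    case 'quiz': return 'assessment';
    case 'interactive': return 'simulation';
    case 'pbl': return 'project';
  }
}
```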
// POST /api/generate
const response = await fetch('/api/generate', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
topic: 'Quantum Entanglement',
// Optional: attach document content
document: markdownString,
// Optional: model override
model: 'google:gemini-3-flash-preview',
}),
});
const { jobId } = await response.json();
// GET /api/generate/status?jobId=<jobId>
const poll = async (jobId: string) => {
while (true) {
const res = await fetch(`/api/generate/status?jobId=${jobId}`);
const data = await res.json();
if (data.status === 'completed') {
console.log('Classroom URL:', data.classroomUrl);
break;
}
if (data.status === 'failed') {
throw new Error(data.error);
}
// status === 'pending' | 'running'
await new Promise(r => setTimeout(r, 3000));
}
};
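The loop above polls indefinitely; a bounded variant with an injectable status fetcher (a sketch, not OpenMAIC API) avoids hanging on a stuck job and is easy to test without a server:

```typescript
// Sketch of a bounded poller. The JobStatus shape is assumed from the
// status values used in the example above ('pending' | 'running' | 'completed' | 'failed').
type JobStatus = {
  status: 'pending' | 'running' | 'completed' | 'failed';
  classroomUrl?: string;
  error?: string;
};

// getStatus is injected so the helper can be exercised without a live server.
async function pollBounded(
  getStatus: () => Promise<JobStatus>,
  maxAttempts = 100,
  intervalMs = 3000,
): Promise<string> {
  for (let i = 0; i < maxAttempts; i++) {
    const data = await getStatus();
    if (data.status === 'completed') return data.classroomUrl!;
    if (data.status === 'failed') throw new Error(data.error);
    // status === 'pending' | 'running' — wait and retry
    await new Promise((r) => setTimeout(r, intervalMs));
  }
  throw new Error('Timed out waiting for generation job');
}
```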
// GET /api/export/pptx?classroomId=<id>
const exportPptx = async (classroomId: string) => {
const res = await fetch(`/api/export/pptx?classroomId=${classroomId}`);
const blob = await res.blob();
  const url = URL.createObjectURL(blob);
  // trigger download, then release the object URL to avoid leaking memory
  const a = document.createElement('a');
  a.href = url;
  a.download = 'lesson.pptx';
  a.click();
  URL.revokeObjectURL(url);
};
// GET /api/export/html?classroomId=<id>
const exportHtml = async (classroomId: string) => {
const res = await fetch(`/api/export/html?classroomId=${classroomId}`);
const html = await res.text();
return html;
};
OpenMAIC ships a skill for OpenClaw, enabling classroom generation from Feishu, Slack, Discord, Telegram, etc.
# Via ClawHub (recommended)
clawhub install openmaic
# Manual install
mkdir -p ~/.openclaw/skills
cp -R /path/to/OpenMAIC/skills/openmaic ~/.openclaw/skills/openmaic
Edit ~/.openclaw/openclaw.json:
{
"skills": {
"entries": {
"openmaic": {
"config": {
// Hosted mode — get access code from https://open.maic.chat/
"accessCode": "$OPENMAIC_ACCESS_CODE",
// Self-hosted mode — local repo + server URL
"repoDir": "/path/to/OpenMAIC",
"url": "http://localhost:3000"
}
}
}
}
}
| Phase | What Happens |
|---|---|
| Clone | Detect existing checkout or clone fresh |
| Startup | Choose pnpm dev, pnpm build && pnpm start, or Docker |
| Provider Keys | Guide user to edit .env.local |
| Generation | Submit async job, poll, return classroom link |
Scenes are typed React components. To add a new scene type:
// types/scene.ts
export type SceneType = 'slides' | 'quiz' | 'interactive' | 'pbl' | 'custom';
export interface CustomScene {
type: 'custom';
title: string;
content: string;
// your fields
metadata: Record<string, unknown>;
}
// components/scenes/CustomScene.tsx
'use client';
import { type CustomScene } from '@/types/scene';
interface Props {
scene: CustomScene;
onComplete: () => void;
}
export function CustomSceneComponent({ scene, onComplete }: Props) {
return (
<div className="flex flex-col gap-4 p-6">
<h2 className="text-2xl font-bold">{scene.title}</h2>
<div dangerouslySetInnerHTML={{ __html: scene.content }} />
<button
className="mt-4 rounded-lg bg-blue-600 px-6 py-2 text-white"
onClick={onComplete}
>
Continue
</button>
</div>
);
}
| Mode | Trigger | Description |
|---|---|---|
| Classroom Discussion | Automatic | Agents proactively start discussions; user can jump in or get called on |
| Roundtable Debate | Scene config | Multiple agent personas debate a topic with whiteboard illustrations |
| Q&A | User asks question | AI teacher responds with slides, diagrams, or whiteboard drawings |
| Whiteboard | During any scene | Agents draw equations, flowcharts, or concept diagrams in real time |
For complex PDFs with tables, formulas, or scanned images:
# Use MinerU hosted API
PDF_MINERU_BASE_URL=https://mineru.net
PDF_MINERU_API_KEY=$MINERU_API_KEY
# Or self-hosted MinerU instance (Docker)
PDF_MINERU_BASE_URL=http://localhost:8888
Without MinerU, OpenMAIC falls back to standard PDF text extraction.
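That fallback decision can be expressed as a tiny selector; the function below is a sketch assuming the env var names above, not code from the repo:

```typescript
// Illustrative parser selection: MinerU whenever a base URL is configured,
// otherwise standard PDF text extraction.
type PdfParser = 'mineru' | 'basic';

function choosePdfParser(env: Record<string, string | undefined>): PdfParser {
  return (env.PDF_MINERU_BASE_URL ?? '').length > 0 ? 'mineru' : 'basic';
}
```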
// Model string format: "provider:model-name"
const models = {
// Google (recommended)
geminiFlash: 'google:gemini-3-flash-preview', // best speed/quality
geminiPro: 'google:gemini-3.1-pro', // highest quality
// OpenAI
gpt4o: 'openai:gpt-4o',
gpt4oMini: 'openai:gpt-4o-mini',
// Anthropic
claude4Sonnet: 'anthropic:claude-sonnet-4-5',
claude4Haiku: 'anthropic:claude-haiku-4-5',
// DeepSeek
deepseekChat: 'deepseek:deepseek-chat',
// OpenAI-compatible (custom base URL)
custom: 'custom:your-model-name',
};
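A parser for the provider:model format might look like the following (a sketch; OpenMAIC's actual helper, if any, may differ). Splitting at the first colon keeps model names that themselves contain colons intact:

```typescript
// Hypothetical helper for "provider:model-name" strings.
interface ModelRef { provider: string; model: string }

function parseModel(spec: string): ModelRef {
  const i = spec.indexOf(':');
  if (i < 0) throw new Error(`Invalid model spec: ${spec}`);
  // Split at the first ':' only.
  return { provider: spec.slice(0, i), model: spec.slice(i + 1) };
}
```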
| Format | Endpoint | Notes |
|---|---|---|
| PowerPoint .pptx | GET /api/export/pptx?classroomId=<id> | Fully editable slides |
| Interactive .html | GET /api/export/html?classroomId=<id> | Self-contained HTML page |
const generateFromDocument = async (markdownContent: string, topic: string) => {
const res = await fetch('/api/generate', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
topic,
document: markdownContent,
model: process.env.DEFAULT_MODEL ?? 'google:gemini-3-flash-preview',
}),
});
const { jobId } = await res.json();
// Poll until done
let classroomUrl: string | null = null;
while (!classroomUrl) {
await new Promise(r => setTimeout(r, 4000));
const status = await fetch(`/api/generate/status?jobId=${jobId}`).then(r => r.json());
if (status.status === 'completed') classroomUrl = status.classroomUrl;
if (status.status === 'failed') throw new Error(status.error);
}
return classroomUrl;
};
// GET /api/providers
const checkProviders = async () => {
const res = await fetch('/api/providers');
const { providers } = await res.json();
// providers: Array<{ name: string; available: boolean; models: string[] }>
return providers.filter((p: { available: boolean }) => p.available);
};
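Combining the health check with a preference list gives a simple default-model picker. This is illustrative only; the provider shape comes from the comment above, and the helper itself is not part of OpenMAIC:

```typescript
// Provider shape as returned by GET /api/providers (per the example above).
interface ProviderInfo { name: string; available: boolean; models: string[] }

// Return the first preferred "provider:model" spec whose provider is
// currently available, or null if none are.
function pickModel(providers: ProviderInfo[], preferred: string[]): string | null {
  const up = new Set(providers.filter((p) => p.available).map((p) => p.name));
  for (const spec of preferred) {
    const provider = spec.split(':')[0];
    if (up.has(provider)) return spec;
  }
  return null;
}
```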
| Problem | Solution |
|---|---|
| No LLM provider configured | Set at least one of OPENAI_API_KEY, ANTHROPIC_API_KEY, or GOOGLE_API_KEY in .env.local |
| Generation hangs at outline stage | Check API key quota; try switching to google:gemini-3-flash-preview for higher rate limits |
| TTS not working | TTS requires a browser with Web Speech API support; check browser console for errors |
| PDF parsing produces garbled text | Enable MinerU by setting PDF_MINERU_BASE_URL in .env.local |
| Vercel timeout during generation | Increase the function timeout in vercel.json; generation is async, so the API should return a jobId immediately |
| Docker build fails | Ensure DOCKER_BUILDKIT=1 and that .env.local exists before running docker compose up --build |
| OpenClaw skill not found | Run clawhub install openmaic or manually copy skills/openmaic to ~/.openclaw/skills/ |
| pnpm install fails on Node < 20 | Upgrade Node.js to >= 20 (nvm use 20) |
| Port 3000 already in use | Set PORT=3001 in .env.local or run PORT=3001 pnpm dev |

OpenMAIC/
├── app/ # Next.js App Router pages & API routes
│ ├── api/
│ │ ├── generate/ # POST lesson generation, GET status
│ │ ├── export/ # pptx / html export endpoints
│ │ └── providers/ # LLM provider health check
│ └── classroom/ # Classroom viewer pages
├── components/
│ ├── scenes/ # Slide, Quiz, Interactive, PBL components
│ ├── whiteboard/ # Real-time whiteboard rendering
│ └── agents/ # Agent avatar & TTS components
├── lib/
│ ├── agents/ # LangGraph agent graph definitions
│ ├── providers/ # LLM provider abstractions
│ └── generation/ # Outline + scene generation pipeline
├── skills/
│ └── openmaic/ # OpenClaw skill definition
├── server-providers.yml # Optional YAML provider config
├── .env.example # Environment variable template
└── docker-compose.yml # Docker deployment config

Weekly Installs: 341
GitHub Stars: 10
First Seen: 7 days ago
Security Audits: Gen Agent Trust Hub (Pass), Socket (Pass), Snyk (Warn)
Installed on: gemini-cli (340), github-copilot (340), amp (340), cline (340), codex (340), kimi-cli (340)