# module-spec-generator by rysweet/amplihack

Install: `npx skills add https://github.com/rysweet/amplihack --skill module-spec-generator`
This skill automatically generates comprehensive module specifications from code analysis, ensuring adherence to amplihack's brick philosophy and enabling effective module regeneration without breaking system connections.
Brick = Self-contained module with ONE clear responsibility Stud = Public contract (functions, API, data models) others connect to Regeneratable = Can be rebuilt from specification without breaking connections
A good spec enables rebuilding ANY module independently while preserving its connection points.
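To make the brick/stud vocabulary concrete, here is a minimal sketch (the module and names are hypothetical, not part of the skill itself): a brick keeps one responsibility and exposes its studs through `__all__`.

```python
# cache.py -- a hypothetical "brick" with one responsibility: in-memory caching.
# Its "studs" are the names in __all__; _store is an internal detail that can
# change freely on regeneration, as long as get/put keep their contract.

__all__ = ["get", "put"]

_store: dict = {}  # internal, not part of the public contract

def get(key: str, default=None):
    """Return the cached value for key, or default if absent."""
    return _store.get(key, default)

def put(key: str, value) -> None:
    """Store value under key."""
    _store[key] = value
```

Any rewrite that preserves the behavior of `get` and `put` can replace this file without breaking callers.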
Every module specification includes these sections:
# [Module Name] Specification
## Purpose
One-sentence description of the module's core responsibility.
## Scope
What this module handles | What it explicitly does NOT handle
## Philosophy Alignment
How this module embodies brick principles and simplicity.
## Public Interface
### Functions
- `function_name(param: Type) -> ReturnType`
Brief description of what it does.
### Classes/Data Models
- `ClassName`
- Fields: list with types
- Key methods: list
### Constants/Enums
Important module-level constants and their purposes.
## Dependencies
### External Dependencies
- `library_name` (version): What it's used for
### Internal Dependencies
- `module_path`: How this module depends on it
### NO External Dependencies (Best Case)
Pure Python, standard library only.
## Module Structure
```
module_name/
├── __init__.py            # Public interface via __all__
├── core.py                # Main implementation
├── models.py              # Data models (if needed)
├── utils.py               # Internal utilities
├── tests/
│   ├── __init__.py
│   ├── test_core.py       # Main functionality tests
│   ├── test_models.py     # Data model tests (if needed)
│   └── fixtures/
│       └── sample_data.json
└── examples/
    └── basic_usage.py     # Usage examples
```
## Test Requirements
### Unit Tests
- Test 1: Purpose and what it verifies
- Test 2: ...
### Integration Tests (if applicable)
- Test 1: ...
### Coverage Goal
Target test coverage percentage (typically 85%+)
## Example Usage
```python
from module_name import PublicFunction, DataModel

# Usage example 1
result = PublicFunction(input_data)

# Usage example 2
model = DataModel(field1="value", field2=123)
```
## Step-by-Step Analysis Process
### Step 1: Understand the Module
1. Read all module files (focus on `__init__.py` and core implementations)
2. Identify the single core responsibility
3. Note architectural patterns used (classes, functions, mixins, etc.)
### Step 2: Extract Public Contract
1. List all exports in `__all__` or equivalent
2. Document function signatures with full type hints
3. Identify data structures (classes, NamedTuple, dataclass)
4. Extract constants and their meanings
5. Include docstrings for each public item
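As a rough sketch, Step 2 could be automated with the standard-library `ast` module (the helper name and exact behavior here are illustrative assumptions, not the skill's actual implementation):

```python
import ast

def public_functions(source: str) -> list[str]:
    """List 'name(args)' for top-level functions in the public contract:
    those named in __all__, or all non-underscore functions if __all__
    is absent. Illustrative sketch only."""
    tree = ast.parse(source)
    exported = None
    for node in tree.body:
        # Look for a top-level assignment to __all__.
        if isinstance(node, ast.Assign) and any(
                getattr(t, "id", None) == "__all__" for t in node.targets):
            exported = set(ast.literal_eval(node.value))
    sigs = []
    for node in tree.body:
        if not isinstance(node, ast.FunctionDef):
            continue
        if exported is not None and node.name not in exported:
            continue
        if exported is None and node.name.startswith("_"):
            continue
        args = ", ".join(a.arg for a in node.args.args)
        sigs.append(f"{node.name}({args})")
    return sigs
```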
### Step 3: Map Dependencies
1. Scan imports at module level
2. Categorize:
- Standard library (good - include version constraints)
- External packages (list version requirements)
- Internal modules (note the module path)
3. Identify circular dependencies (red flag)
### Step 4: Analyze Module Structure
1. Map file organization
2. Identify what goes in each file
3. Note test fixtures and examples
### Step 5: Identify Test Requirements
1. What behaviors MUST be tested
2. What edge cases exist
3. What integration points need coverage
4. Suggest coverage target
### Step 6: Generate Spec Document
1. Create Specs/[module-name].md
2. Fill in all sections using analysis
3. Include example code
4. Verify spec allows module regeneration
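Step 6's output could be assembled along these lines (a minimal sketch; the parameters and section set are assumptions, while the real skill fills in every section of the full template):

```python
from pathlib import Path

def write_spec(module_name: str, purpose: str, functions: list[str],
               out_dir: str = "Specs") -> Path:
    """Write a minimal Specs/<module_name>.md and return its path.
    Illustrative sketch covering only Purpose and Public Interface."""
    path = Path(out_dir) / f"{module_name}.md"
    path.parent.mkdir(parents=True, exist_ok=True)
    lines = [f"# {module_name} Specification", "",
             "## Purpose", purpose, "",
             "## Public Interface"]
    lines += [f"- `{sig}`" for sig in functions]
    path.write_text("\n".join(lines) + "\n", encoding="utf-8")
    return path
```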
## Usage Examples
### Example 1: Generate Spec for New Module
User: I'm creating a new authentication module. Generate a spec that ensures it follows brick philosophy.
Claude:
### Example 2: Document Existing Module
User: Generate a spec for the existing caching module.
Claude:
### Example 3: Verify Module Spec Accuracy
User: Check if the existing session management spec accurately describes the implementation.
Claude:
## Analysis Checklist
### Code Analysis
- [ ] Read all Python files in module
- [ ] Identify `__all__` or equivalent public interface
- [ ] Extract all public function signatures
- [ ] Document all public classes with fields and methods
- [ ] List module-level constants
- [ ] Map all imports (external and internal)
### Philosophy Verification
- [ ] Single clear responsibility
- [ ] No unnecessary abstractions
- [ ] Public interface clear and minimal
- [ ] Dependencies are justified
- [ ] No external dependencies (if possible)
- [ ] Patterns align with amplihack principles
### Specification Quality
- [ ] Spec is complete and precise
- [ ] Code examples are accurate and working
- [ ] Test requirements are realistic
- [ ] Module structure is clear
- [ ] Someone could rebuild module from spec
- [ ] Regeneration preserves all connections
## Template for Module Specs
````markdown
# [Module Name] Specification

## Purpose
[Single sentence describing core responsibility]

## Scope
**Handles**: [What this module does]
**Does NOT handle**: [What is explicitly out of scope]

## Philosophy Alignment
- ✅ Ruthless Simplicity: [How it embodies this]
- ✅ Single Responsibility: [Core job]
- ✅ No External Dependencies: [True/False with reason]
- ✅ Regeneratable: [Yes, module can be rebuilt from this spec]

## Public Interface (The "Studs")

### Functions
```python
def primary_function(param: Type) -> ReturnType:
    """Brief description.

    Args:
        param: Description with constraints

    Returns:
        Description of return value
    """
```

### Classes/Data Models
```python
class DataModel:
    """Brief description of responsibility.

    Attributes:
        field1 (Type): Description
        field2 (Type): Description
    """
```

### Constants/Enums
- `CONSTANT_NAME`: Description and usage

## Dependencies

### External Dependencies
None - pure Python standard library

### Internal Dependencies
- `.models`: Data structures
- `.utils`: Shared utilities

## Module Structure
```
module_name/
├── __init__.py        # Exports via __all__
├── core.py            # Implementation
├── models.py          # Data models
├── utils.py           # Utilities
├── tests/
│   └── test_core.py
└── examples/
    └── usage.py
```

## Test Requirements

### Coverage Goal
85%+ line coverage

## Example Usage
```python
from module_name import primary_function, DataModel

# Basic usage
result = primary_function(input_data)

# Data model usage
model = DataModel(field1="value", field2=123)
print(model.field1)
```
````
This module can be rebuilt from this specification while maintaining:
✅ Public contract (all "studs" preserved)
✅ Dependencies (same external/internal deps)
✅ Test interface (same test requirements)
✅ Module structure (same file organization)
Specifications are saved to: Specs/[module-name].md
This keeps all module specifications in a central, discoverable location.
After spec generation, the Builder Agent can:
After generating a spec, verify:
- Can someone rebuild the module from this spec?
- Does every exported function have a clear purpose?
- Are all dependencies justified?
- Would this prevent breaking other modules?
- Weekly Installs: 84
- Repository: rysweet/amplihack
- GitHub Stars: 45
- First Seen: Jan 23, 2026
- Security Audits: Gen Agent Trust Hub (Pass), Socket (Pass), Snyk (Pass)
- Installed on: opencode (76), claude-code (72), codex (71), cursor (70), gemini-cli (69), github-copilot (68)