diagnose-seo by calm-north/seojuice-skills
npx skills add https://github.com/calm-north/seojuice-skills --skill diagnose-seo
Structured diagnostic framework for crawl issues, canonicalization errors, indexation problems, and rendering failures.
Technical SEO problems fall into four categories: crawlability, indexability, canonicalization, and rendering. Diagnose in this order — each layer depends on the previous one working correctly.
Check these in order:

- Fetch `[domain]/robots.txt` and review the rules
- `Disallow` rules blocking important paths
- `Sitemap:` directive points to the correct sitemap URL

Common mistakes:

- `Disallow: /` blocking the entire site (often left from staging)
- `<lastmod>` dates not accurate and recent
- `<meta name="robots" content="noindex">` on pages that should be indexed
- Unintended `X-Robots-Tag: noindex` response headers
- `<link rel="canonical">` not pointing to itself (self-referencing canonical)

Canonical diagnosis checklist:

- Trailing-slash variants (`/page` vs `/page/`)
- www vs non-www (`example.com` vs `www.example.com`)
- URL parameters (`?sort=price`, `?page=1`)
- Duplicate `<title>` tags
- `hreflang` tags linking language variants
- `x-default` tag

Summary
Findings by Layer
For each issue found:
| Layer | Issue | Severity | Affected Pages | Fix |
|---|---|---|---|---|
| Crawlability | robots.txt blocks /blog/ | Critical | All blog pages | Remove Disallow: /blog/ from robots.txt |
| Indexability | Missing canonical tags | Warning | 15 pages | Add self-referencing canonicals |
| ... | ... | ... | ... | ... |
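A finding like the robots.txt row in the table can be verified programmatically. Below is a minimal sketch using Python's standard-library `urllib.robotparser`; the robots.txt content and URLs are hypothetical examples, not part of the skill itself:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt illustrating the "robots.txt blocks /blog/" finding.
ROBOTS_TXT = """\
User-agent: *
Disallow: /blog/
Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Verify which important paths a generic crawler may fetch.
for path in ("/", "/blog/post-1", "/pricing"):
    allowed = rp.can_fetch("*", f"https://example.com{path}")
    print(f"{path}: {'allowed' if allowed else 'BLOCKED'}")

# The Sitemap: directive is also exposed (Python 3.8+).
print(rp.site_maps())
```

Running this against the real `[domain]/robots.txt` (via `rp.set_url(...)` and `rp.read()`) gives the same checks over live rules.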
Priority Fix List
Ordered by impact:
Pro Tip: Run the free SEO Audit for a quick technical check, the Broken Link Checker to find dead links, and the Robots.txt Generator to fix crawl directives. SEOJuice MCP users can run `/seojuice:site-health` for a full technical report and `/seojuice:page-audit [domain] [url]` to drill into specific pages.
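The duplicate-URL variants from the canonical diagnosis checklist (trailing slash, www, query parameters) can be collapsed into a single comparison key to spot pages that need a canonical tag. This is an illustrative sketch, not part of the SEOJuice tooling; the `canonical_key` helper and the ignored-parameter list are assumptions:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters assumed to change presentation only, not content.
IGNORED_PARAMS = {"sort", "page", "utm_source", "utm_medium", "utm_campaign"}

def canonical_key(url: str) -> str:
    """Collapse the common duplicate-URL variants from the checklist:
    trailing slash, www prefix, and presentation-only query parameters."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    netloc = netloc.lower().removeprefix("www.")
    path = path.rstrip("/") or "/"
    kept = [(k, v) for k, v in parse_qsl(query) if k not in IGNORED_PARAMS]
    return urlunsplit((scheme.lower(), netloc, path, urlencode(sorted(kept)), ""))

variants = [
    "https://example.com/page",
    "https://www.example.com/page/",
    "https://example.com/page?sort=price",
]
print({canonical_key(u) for u in variants})  # all three collapse to one key
```

URLs that share a key but lack a canonical tag pointing at one preferred variant are candidates for the "Missing canonical tags" finding above.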
Weekly Installs
1.1K
Repository
GitHub Stars
1
First Seen
11 days ago
Security Audits
Gen Agent Trust Hub: Pass
Socket: Pass
Snyk: Warn
Installed on
claude-code: 150
codex: 128
antigravity: 110
kimi-cli: 55
opencode: 55
cline: 53