# news-aggregator-skill

by cclank/news-aggregator-skill

```shell
npx skills add https://github.com/cclank/news-aggregator-skill --skill news-aggregator-skill
```

Fetch real-time hot news from 28 sources and generate in-depth analysis reports in Chinese.
Every news request follows the same workflow, regardless of source or combination:
```shell
# Single source
python3 scripts/fetch_news.py --source <source_key> --no-save

# Multiple sources (comma-separated)
python3 scripts/fetch_news.py --source hackernews,github,wallstreetcn --no-save

# All sources (broad scan)
python3 scripts/fetch_news.py --source all --limit 15 --deep --no-save

# With keyword filter (auto-expanded: "AI" → "AI,LLM,GPT,Claude,Agent,RAG")
python3 scripts/fetch_news.py --source hackernews --keyword "AI,LLM,GPT" --deep --no-save
```
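The keyword auto-expansion mentioned in the last command above could be implemented with a small alias table. This is a minimal sketch under the assumption that expansion is a simple lookup; the actual logic lives inside `scripts/fetch_news.py`, and only the "AI" alias set is taken from this document:

```python
# Hypothetical keyword auto-expansion: each seed keyword maps to an
# alias set; unknown keywords pass through unchanged.
KEYWORD_ALIASES = {
    "AI": ["AI", "LLM", "GPT", "Claude", "Agent", "RAG", "DeepSeek"],
}

def expand_keywords(raw: str) -> list[str]:
    """Expand a comma-separated --keyword value, deduplicating in order."""
    out: list[str] = []
    for kw in (k.strip() for k in raw.split(",")):
        for alias in KEYWORD_ALIASES.get(kw, [kw]):
            if alias not in out:
                out.append(alias)
    return out
```

Keywords outside the alias table are kept verbatim, so mixed filters like `"AI,Rust"` still work.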
Read the output JSON and format every item with the Unified Report Template below, translating all content into Simplified Chinese.
Save the report to reports/YYYY-MM-DD/<source>_report.md, then display the full content to the user.
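The save path above can be built like this; a minimal sketch assuming today's date names the directory (`fetch_news.py` presumably does the equivalent internally):

```python
from datetime import date
from pathlib import Path

def report_path(source: str, base: str = "reports") -> Path:
    """Return reports/YYYY-MM-DD/<source>_report.md for today's date."""
    return Path(base) / date.today().isoformat() / f"{source}_report.md"
```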
All sources use this single template; show or hide the optional fields based on data availability.
#### N. [标题 (中文翻译)](https://original-url.com)
- **Source**: 源名 | **Time**: 时间 | **Heat**: 🔥 热度值
- **Links**: [Discussion](hn_url) | [GitHub](gh_url) ← only when the data exists
- **Summary**: 一句话中文摘要。
- **Deep Dive**: 💡 **Insight**: 深度分析(背景、影响、技术价值)。
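The template above, including the conditional Links line, can be sketched as a small formatter. The field names (`title_zh`, `url`, `source`, `time`, `heat`, `hn_url`, `gh_url`, `summary`, `insight`) are illustrative assumptions; match them to the actual JSON keys emitted by `fetch_news.py`:

```python
def format_item(n: int, item: dict) -> str:
    """Render one news item using the unified report template."""
    lines = [
        f"#### {n}. [{item['title_zh']}]({item['url']})",
        f"- **Source**: {item['source']} | **Time**: {item['time']} | **Heat**: 🔥 {item['heat']}",
    ]
    # Optional Links line: shown only when the data exists.
    links = []
    if item.get("hn_url"):
        links.append(f"[Discussion]({item['hn_url']})")
    if item.get("gh_url"):
        links.append(f"[GitHub]({item['gh_url']})")
    if links:
        lines.append("- **Links**: " + " | ".join(links))
    lines.append(f"- **Summary**: {item['summary']}")
    if item.get("insight"):
        lines.append(f"- **Deep Dive**: 💡 **Insight**: {item['insight']}")
    return "\n".join(lines)
```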
Only the differences from the universal template:
| Source | Adaptation |
|---|---|
| Hacker News | MUST include the [Discussion](hn_url) link |
| GitHub | Use 🌟 Stars for Heat, add a Lang field, add #Tags in the Deep Dive |
| Hugging Face | Use 🔥 +N upvotes for Heat, include the [GitHub](url) link if present, write a deep interpretation (not just a translated abstract) |

Preserve exact heat text (e.g. "108万").
| Arg | Description | Default |
|---|---|---|
| --source | Source key(s), one or more, comma-separated. See the table below. | all |
| --limit | Max items per source | 15 |
| --keyword | Comma-separated keyword filter | None |
| --deep | Download article text for richer analysis | Off |
| --save | Force save to the reports directory | Automatic for a single source |
| --outdir | Custom output directory | reports/YYYY-MM-DD/ |
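The argument table above maps naturally onto an argparse setup. A sketch only; the real CLI in `scripts/fetch_news.py` may differ in details such as flag handling for `--no-save`:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Parser mirroring the documented fetch_news.py arguments."""
    p = argparse.ArgumentParser(description="Fetch hot news from configured sources.")
    p.add_argument("--source", default="all", help="Source key(s), comma-separated")
    p.add_argument("--limit", type=int, default=15, help="Max items per source")
    p.add_argument("--keyword", default=None, help="Comma-separated keyword filter")
    p.add_argument("--deep", action="store_true", help="Download article text")
    p.add_argument("--save", action="store_true", help="Force save to reports dir")
    p.add_argument("--no-save", action="store_true", help="Skip saving")
    p.add_argument("--outdir", default=None, help="Custom output directory")
    return p
```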
| Category | Key | Name |
|---|---|---|
| Global News | hackernews | Hacker News |
| | 36kr | 36氪 |
| | wallstreetcn | 华尔街见闻 |
| | tencent | 腾讯新闻 |
| | weibo | 微博热搜 |
| | v2ex | V2EX |
| | producthunt | Product Hunt |
| | github | GitHub Trending |
| AI/Tech | huggingface | HF Daily Papers |
| | ai_newsletters | All AI Newsletters (aggregate) |
| | bensbites | Ben's Bites |
| | interconnects | Interconnects (Nathan Lambert) |
| | oneusefulthing | One Useful Thing (Ethan Mollick) |
| | chinai | ChinAI (Jeffrey Ding) |
| | memia | Memia |
| | aitoroi | AI to ROI |
| | kdnuggets | KDnuggets |
| Podcasts | podcasts | All Podcasts (aggregate) |
| | lexfridman | Lex Fridman |
| | 80000hours | 80,000 Hours |
| | latentspace | Latent Space |
| Essays | essays | All Essays (aggregate) |
| | paulgraham | Paul Graham |
| | waitbutwhy | Wait But Why |
| | jamesclear | James Clear |
| | farnamstreet | Farnam Street |
| | scottyoung | Scott Young |
| | dankoe | Dan Koe |
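Programmatically, the source table above amounts to a key → name registry plus validation of the `--source` value. A hypothetical, abridged sketch (the real registry lives in `scripts/fetch_news.py`):

```python
# Abridged key → name registry; the remaining keys follow the table above.
SOURCES = {
    # Global news
    "hackernews": "Hacker News",
    "36kr": "36氪",
    "wallstreetcn": "华尔街见闻",
    "tencent": "腾讯新闻",
    "weibo": "微博热搜",
    "v2ex": "V2EX",
    "producthunt": "Product Hunt",
    "github": "GitHub Trending",
    # AI/Tech
    "huggingface": "HF Daily Papers",
    "ai_newsletters": "All AI Newsletters (aggregate)",
}

def resolve_sources(arg: str) -> list[str]:
    """Turn a --source value into concrete keys, validating each one."""
    if arg == "all":
        return list(SOURCES)
    keys = [k.strip() for k in arg.split(",")]
    unknown = [k for k in keys if k not in SOURCES]
    if unknown:
        raise ValueError(f"Unknown source key(s): {unknown}")
    return keys
```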
Pre-configured multi-source briefing profiles:
```shell
python3 scripts/daily_briefing.py --profile <profile>
```
| Profile | Sources | Instruction File |
|---|---|---|
| general | HN, 36Kr, GitHub, Weibo, PH, WallStreetCN | instructions/briefing_general.md |
| finance | WallStreetCN, 36Kr, Tencent | instructions/briefing_finance.md |
| tech | GitHub, HN, Product Hunt | instructions/briefing_tech.md |
| social | Weibo, V2EX, Tencent | instructions/briefing_social.md |
| ai_daily | HF Papers, AI Newsletters | instructions/briefing_ai_daily.md |
| reading_list | Essays, Podcasts | (uses the universal template) |
Workflow: execute the script → read the corresponding instruction file → generate the report following both the instruction file and the universal template.
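The profile table above can be mirrored as a plain mapping; a hypothetical sketch (the actual definitions live in `scripts/daily_briefing.py`, and the source keys are assumptions based on the sources table):

```python
# Hypothetical mirror of the briefing-profile table.
PROFILES = {
    "general": ["hackernews", "36kr", "github", "weibo", "producthunt", "wallstreetcn"],
    "finance": ["wallstreetcn", "36kr", "tencent"],
    "tech": ["github", "hackernews", "producthunt"],
    "social": ["weibo", "v2ex", "tencent"],
    "ai_daily": ["huggingface", "ai_newsletters"],
    "reading_list": ["essays", "podcasts"],
}

def instruction_file(profile: str):
    """Per-profile instruction file; reading_list uses the universal template."""
    if profile == "reading_list":
        return None
    return f"instructions/briefing_{profile}.md"
```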
"AI,LLM,GPT,Claude,Agent,RAG,DeepSeek"。其他领域类似扩展。reports/YYYY-MM-DD/。当用户说出 "如意如意" 或询问 "menu/help" 时:
templates.mdpip install -r requirements.txtplaywright install chromium每周安装量
Weekly installs: 2.1K · GitHub stars: 745 · First seen: Jan 20, 2026 · Security audits: Gen Agent Trust Hub: Fail, Socket: Pass, Snyk: Warn