local-llm-ops by bobmatnyc/claude-mpm-skills
npx skills add https://github.com/bobmatnyc/claude-mpm-skills --skill local-llm-ops
Your localLLM repo provides a full local LLM toolchain on Apple Silicon: setup scripts, a rich CLI chat launcher, benchmarks, and diagnostics. The operational path is: install Ollama, ensure the service is running, initialize the venv, pull models, then launch chat or benchmarks.
./setup_chatbot.sh
./chatllm
If no models are present:
ollama pull mistral
brew install ollama
brew services start ollama
./setup_chatbot.sh
curl http://localhost:11434/api/version

./chatllm (primary launcher)
./chat or ./chat.py (alternate launchers)
./install_aliases.sh then llm, llm-code, llm-fast

Task modes:
./chat -t coding -m codellama:70b
./chat -t creative -m llama3.1:70b
./chat -t analytical
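The task modes above pair a mode flag with an optional explicit model. As a hypothetical sketch of how a launcher might map a mode to a default model (the real mapping lives inside chat.py; the fallback choice of mistral here is an assumption, while the coding and creative names come from the examples above):

```shell
# Hypothetical mode-to-model mapping, not the actual chat.py logic.
task_default_model() {
  case "$1" in
    coding)   echo "codellama:70b" ;;
    creative) echo "llama3.1:70b" ;;
    *)        echo "mistral" ;;      # assumed fallback, e.g. analytical
  esac
}

task_default_model coding   # → codellama:70b
```

Passing -m overrides whatever default the mode would pick.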
Benchmarks are scripted in scripts/run_benchmarks.sh:
./scripts/run_benchmarks.sh
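As a rough sketch of the kind of throughput figure such a run produces, assuming the benchmark reads Ollama's documented eval_count (tokens generated) and eval_duration (nanoseconds) fields from a non-streaming /api/generate response (the sample values below are illustrative, not measured):

```shell
# Tokens/sec from Ollama's /api/generate response fields.
# eval_count and eval_duration are documented Ollama API fields;
# the numbers here are made up for illustration.
eval_count=128            # tokens generated
eval_duration=6400000000  # 6.4 s, in nanoseconds
awk -v c="$eval_count" -v d="$eval_duration" \
  'BEGIN { printf "%.1f tok/s\n", c / d * 1e9 }'
# → 20.0 tok/s
```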
This runs bench_ollama.py with:
benchmarks/prompts.yaml
benchmarks/models.yaml

Run the built-in diagnostic script when setup fails:
./diagnose.sh
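The kinds of prerequisite checks a diagnostic like this typically covers can be sketched as follows; the specific checks are assumptions, not a readout of diagnose.sh:

```shell
# Hypothetical prerequisite checks (what diagnose.sh plausibly verifies).
status() {  # print ok/missing based on whether the given test succeeds
  if "$@" >/dev/null 2>&1; then echo "ok"; else echo "missing"; fi
}
echo "ollama binary: $(status command -v ollama)"
echo "venv:          $(status test -d .venv)"
echo "config dir:    $(status test -d "$HOME/.localllm")"
# A live-service check would query: curl -s http://localhost:11434/api/version
```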
Common fixes:
Re-run ./setup_chatbot.sh
Make sure ollama is in PATH
ollama pull mistral
Recreate the .venv
Check ~/.localllm/
Verify the service at http://localhost:11434

toolchains/universal/infrastructure/docker
Weekly Installs: 80
Repository: https://github.com/bobmatnyc/claude-mpm-skills
GitHub Stars: 18
First Seen: Jan 23, 2026
Security Audits
Gen Agent Trust Hub: Warn
Socket: Pass
Snyk: Pass
Installed on
gemini-cli: 61
codex: 60
opencode: 60
claude-code: 59
cursor: 57
github-copilot: 55