setting-up-astro-project by astronomer/agents
npx skills add https://github.com/astronomer/agents --skill setting-up-astro-project
This skill helps you initialize and configure Airflow projects using the Astro CLI.
To run the local environment, see the managing-astro-local-env skill. To write DAGs, see the authoring-dags skill. Open-source alternative: If the user isn't on Astro, guide them to Apache Airflow's Docker Compose quickstart for local development and the Helm chart for production. For deployment strategies, use the deploying-airflow skill.
astro dev init
Creates this structure:
project/
├── dags/ # DAG files
├── include/ # SQL, configs, supporting files
├── plugins/ # Custom Airflow plugins
├── tests/ # Unit tests
├── Dockerfile # Image customization
├── packages.txt # OS-level packages
├── requirements.txt # Python packages
└── airflow_settings.yaml # Connections, variables, pools
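The scaffold above can be sanity-checked from plain shell. A minimal sketch (the `project` directory is created manually here so the check is self-contained; in practice it is whatever directory you ran `astro dev init` in):

```shell
# Sketch: verify a project directory contains the expected Astro scaffold.
# "project" stands in for the directory astro dev init populated.
mkdir -p project/dags project/include project/plugins project/tests
touch project/Dockerfile project/packages.txt project/requirements.txt project/airflow_settings.yaml

for path in dags include plugins tests Dockerfile packages.txt requirements.txt airflow_settings.yaml; do
  if [ -e "project/$path" ]; then
    echo "ok: $path"
  else
    echo "missing: $path"
  fi
done
```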
requirements.txt — Python packages, one per line (pin versions for reproducible builds):
apache-airflow-providers-snowflake==5.3.0
pandas==2.1.0
requests>=2.28.0
packages.txt — OS-level packages, one per line:
gcc
libpq-dev
For complex setups (private PyPI, custom scripts), customize the Dockerfile:
FROM quay.io/astronomer/astro-runtime:12.4.0
RUN pip install --extra-index-url https://pypi.example.com/simple my-package
After modifying dependencies: Run astro dev restart
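A typical edit-and-rebuild loop, as a sketch (the provider package and version are illustrative; pin whatever your DAGs import):

```shell
# Sketch: pin a new provider in requirements.txt, then rebuild the image.
# Package name and version are illustrative.
echo 'apache-airflow-providers-postgres==5.10.0' >> requirements.txt
sort -u requirements.txt -o requirements.txt   # keep entries unique and ordered
# astro dev restart   # rebuilds the image so the new package is baked in
```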
airflow_settings.yaml — loaded automatically on environment start:
airflow:
connections:
- conn_id: my_postgres
conn_type: postgres
host: host.docker.internal
port: 5432
login: user
password: pass
schema: mydb
variables:
- variable_name: env
variable_value: dev
pools:
- pool_name: limited_pool
pool_slot: 5
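The file above can be generated and sanity-checked from the shell before starting the environment. A sketch using the illustrative values from the example (replace them with real ones):

```shell
# Sketch: write airflow_settings.yaml and confirm the connection is declared.
# Values mirror the example above and are placeholders.
cat > airflow_settings.yaml <<'EOF'
airflow:
  connections:
    - conn_id: my_postgres
      conn_type: postgres
      host: host.docker.internal
      port: 5432
      login: user
      password: pass
      schema: mydb
  variables:
    - variable_name: env
      variable_value: dev
  pools:
    - pool_name: limited_pool
      pool_slot: 5
EOF

grep -q 'conn_id: my_postgres' airflow_settings.yaml && echo "connection declared"
```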
# Export from running environment
astro dev object export --connections --file connections.yaml
# Import to environment
astro dev object import --connections --file connections.yaml
Parse DAGs to catch errors without starting the full environment:
astro dev parse
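Because astro dev parse needs only Docker, it slots neatly into CI. A minimal sketch, assuming GitHub Actions and the curl installer from the Astro CLI docs (verify the install command against current documentation):

```yaml
# Sketch: fail the build when any DAG fails to parse.
name: validate-dags
on: [push]
jobs:
  parse:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Install command per Astro CLI docs; confirm against current documentation.
      - run: curl -sSL https://install.astronomer.io | sudo bash -s
      - run: astro dev parse
```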