cosmos-dbt-fusion by astronomer/agents
npx skills add https://github.com/astronomer/agents --skill cosmos-dbt-fusion
Execute steps in order. This skill covers Fusion-specific constraints only.
Version note: dbt Fusion support was introduced in Cosmos 1.11.0. Requires Cosmos ≥ 1.11.
Reference: See reference/cosmos-config.md for ProfileConfig, operator_args, and Airflow 3 compatibility details.
Before starting, confirm: (1) dbt engine = Fusion (not Core → use cosmos-dbt-core), (2) warehouse = Snowflake, Databricks, BigQuery, and Redshift only.
| Constraint | Details |
|---|---|
| No async | AIRFLOW_ASYNC not supported |
| No virtualenv | Fusion is a binary, not a Python package |
| Warehouse support | Snowflake, Databricks, BigQuery, and Redshift only while in preview |
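The preview constraints above can be checked once at DAG-parse time, before any Cosmos objects are built. A minimal sketch (the `SUPPORTED_WAREHOUSES` set and the `check_fusion_preconditions` helper are illustrative, not part of the Cosmos API):

```python
# Guard against unsupported setups before building any Cosmos objects.
# The supported set reflects the preview-phase warehouses listed above.
SUPPORTED_WAREHOUSES = {"snowflake", "databricks", "bigquery", "redshift"}


def check_fusion_preconditions(warehouse: str) -> None:
    """Raise early if the warehouse is outside the Fusion preview support matrix."""
    if warehouse.lower() not in SUPPORTED_WAREHOUSES:
        raise ValueError(
            f"dbt Fusion preview does not support warehouse '{warehouse}'; "
            f"use one of {sorted(SUPPORTED_WAREHOUSES)} or fall back to dbt Core."
        )


check_fusion_preconditions("Snowflake")  # passes
```

Failing at parse time keeps an unsupported configuration from ever producing half-built DAGs.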
CRITICAL: Cosmos 1.11.0 introduced dbt Fusion compatibility.
```shell
# Check installed version
pip show astronomer-cosmos

# Install/upgrade if needed
pip install "astronomer-cosmos>=1.11.0"
```
Validate: `pip show astronomer-cosmos` reports version ≥ 1.11.0.
dbt Fusion is NOT bundled with Cosmos or dbt Core. Install it into the Airflow runtime/image.
Determine where to install the Fusion binary (Dockerfile / base image / runtime).
```Dockerfile
USER root
RUN apt-get update && apt-get install -y curl
ENV SHELL=/bin/bash
RUN curl -fsSL https://public.cdn.getdbt.com/fs/install/install.sh | sh -s -- --update
USER astro
```
| Environment | Typical path |
|---|---|
| Astro Runtime | /home/astro/.local/bin/dbt |
| System-wide | /usr/local/bin/dbt |
Validate: The dbt binary exists at the chosen path and `dbt --version` succeeds.
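This validation can also run at DAG-parse time so a missing binary fails fast. A minimal standard-library sketch (the candidate paths mirror the table above; `find_dbt` is an illustrative helper, not a Cosmos API):

```python
import os
import shutil

# Typical Fusion install locations from the table above.
DBT_CANDIDATES = ("/home/astro/.local/bin/dbt", "/usr/local/bin/dbt")


def find_dbt(candidates=DBT_CANDIDATES):
    """Return the first existing executable dbt path, falling back to a PATH lookup."""
    for path in candidates:
        if os.path.isfile(path) and os.access(path, os.X_OK):
            return path
    return shutil.which("dbt")  # None if dbt is not on PATH either


# Usage sketch: fail fast when the image was built without the binary.
# if find_dbt() is None:
#     raise RuntimeError("dbt Fusion binary not found; install it into the image first.")
```

The resolved path is what `dbt_executable_path` should point to in the ExecutionConfig below.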
Parsing strategy is the same as dbt Core. Pick ONE:
| Load mode | When to use | Required inputs |
|---|---|---|
| dbt_manifest | Large projects; fastest parsing | ProjectConfig.manifest_path |
| dbt_ls | Complex selectors; need dbt-native selection | Fusion binary accessible to the scheduler |
| automatic | Simple setups; let Cosmos pick | (none) |
```python
from cosmos import RenderConfig, LoadMode

_render_config = RenderConfig(
    load_method=LoadMode.AUTOMATIC,  # or DBT_MANIFEST, DBT_LS
)
```
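When the image may or may not ship a prebuilt manifest, the load mode can be chosen at parse time. A hedged sketch (`pick_load_mode` is an illustrative helper; the returned names match the load-mode table above and would be mapped onto `LoadMode.DBT_MANIFEST` / `LoadMode.AUTOMATIC` when building the RenderConfig):

```python
from pathlib import Path


def pick_load_mode(manifest_path: str) -> str:
    """Prefer manifest-based parsing when a manifest exists; otherwise let Cosmos decide."""
    if manifest_path and Path(manifest_path).is_file():
        return "dbt_manifest"  # fastest: no dbt invocation during DAG parsing
    return "automatic"  # Cosmos falls back to its own selection strategy
```

This keeps one DAG file working both in CI images that precompute the manifest and in local setups that do not.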
Reference: See reference/cosmos-config.md for full ProfileConfig options and examples.
```python
from cosmos import ProfileConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping

_profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    profile_mapping=SnowflakeUserPasswordProfileMapping(
        conn_id="snowflake_default",
    ),
)
```
CRITICAL: dbt Fusion with Cosmos requires `ExecutionMode.LOCAL`, with `dbt_executable_path` pointing to the Fusion binary.
```python
from cosmos import ExecutionConfig
from cosmos.constants import InvocationMode

_execution_config = ExecutionConfig(
    invocation_mode=InvocationMode.SUBPROCESS,
    dbt_executable_path="/home/astro/.local/bin/dbt",  # REQUIRED: path to the Fusion binary
    # execution_mode is LOCAL by default - do not change it
)
```
```python
from cosmos import ProjectConfig

_project_config = ProjectConfig(
    dbt_project_path="/path/to/dbt/project",
    # manifest_path="/path/to/manifest.json",  # for the dbt_manifest load mode
    # install_dbt_deps=False,  # if deps are precomputed in CI
)
```
```python
from cosmos import DbtDag, ProjectConfig, ProfileConfig, ExecutionConfig, RenderConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping
from pendulum import datetime

_project_config = ProjectConfig(
    dbt_project_path="/usr/local/airflow/dbt/my_project",
)

_profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    profile_mapping=SnowflakeUserPasswordProfileMapping(
        conn_id="snowflake_default",
    ),
)

_execution_config = ExecutionConfig(
    dbt_executable_path="/home/astro/.local/bin/dbt",  # Fusion binary
)

_render_config = RenderConfig()

my_fusion_dag = DbtDag(
    dag_id="my_fusion_cosmos_dag",
    project_config=_project_config,
    profile_config=_profile_config,
    execution_config=_execution_config,
    render_config=_render_config,
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
)
```
```python
from airflow.sdk import dag, task  # Airflow 3.x
# from airflow.decorators import dag, task  # Airflow 2.x
from airflow.models.baseoperator import chain
from cosmos import DbtTaskGroup, ProjectConfig, ProfileConfig, ExecutionConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping
from pendulum import datetime

_project_config = ProjectConfig(dbt_project_path="/usr/local/airflow/dbt/my_project")
_profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    # ProfileConfig needs either a profile_mapping or a profiles_yml_filepath:
    profile_mapping=SnowflakeUserPasswordProfileMapping(conn_id="snowflake_default"),
)
_execution_config = ExecutionConfig(dbt_executable_path="/home/astro/.local/bin/dbt")


@dag(start_date=datetime(2025, 1, 1), schedule="@daily")
def my_dag():
    @task
    def pre_dbt():
        return "some_value"

    dbt = DbtTaskGroup(
        group_id="dbt_fusion_project",
        project_config=_project_config,
        profile_config=_profile_config,
        execution_config=_execution_config,
    )

    @task
    def post_dbt():
        pass

    chain(pre_dbt(), dbt, post_dbt())


my_dag()
```
Before finalizing, verify:
If the user reports dbt Core regressions after enabling Fusion, set the following environment variable:

`AIRFLOW__COSMOS__PRE_DBT_FUSION=1`
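The flag can be set wherever the Airflow environment is configured, e.g. in the same Dockerfile used to install the binary (shown as an illustration; confirm the flag's semantics against the Cosmos release notes before relying on it):

```Dockerfile
# Opt back into the pre-Fusion dbt code path in Cosmos.
ENV AIRFLOW__COSMOS__PRE_DBT_FUSION=1
```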