README.md: 4 additions & 4 deletions
@@ -80,7 +80,7 @@ As this integration was completed, several features were developed to **extend t
## Independent task execution
-Airflow executes [Tasks](https://airflow.apache.org/docs/apache-airflow/stable/concepts/tasks.html) independent of one another: even though downstream and upstream dependencies between tasks exist, the execution of an individual task happens entirely independently of any other task execution (see: [Tasks Relationships](https://airflow.apache.org/docs/apache-airflow/stable/concepts/tasks.html#relationships).
+Airflow executes [Tasks](https://airflow.apache.org/docs/apache-airflow/stable/concepts/tasks.html) independent of one another: even though downstream and upstream dependencies between tasks exist, the execution of an individual task happens entirely independently of any other task execution (see: [Tasks Relationships](https://airflow.apache.org/docs/apache-airflow/stable/concepts/tasks.html#relationships)).
In order to work with this constraint, *airflow-dbt-python* runs each dbt command in a **temporary and isolated directory**. Before execution, all the relevant dbt files are copied from supported backends, and after executing the command any artifacts are exported. This ensures dbt can work with any Airflow deployment, including most production deployments as they are usually running [Remote Executors](https://airflow.apache.org/docs/apache-airflow/stable/executor/index.html#executor-types) and do not guarantee any files will be shared by default between tasks, since each task may run in a completely different environment.
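For illustration, a minimal sketch of a task that runs dbt from files stored in S3 (the bucket and paths are hypothetical, and exact operator arguments may differ between versions):

```python
import datetime as dt

from airflow import DAG
from airflow_dbt_python.operators.dbt import DbtRunOperator

with DAG(
    dag_id="example_dbt_run_from_s3",
    start_date=dt.datetime(2022, 1, 1),
    catchup=False,
) as dag:
    # Both directories are copied from the backend (here S3) into a
    # temporary, isolated directory before `dbt run` executes; artifacts
    # produced by the run are exported back once the command finishes.
    dbt_run = DbtRunOperator(
        task_id="dbt_run",
        project_dir="s3://my-bucket/dbt/project/",    # hypothetical bucket/path
        profiles_dir="s3://my-bucket/dbt/profiles/",  # hypothetical bucket/path
    )
```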
@@ -109,7 +109,7 @@ Each dbt execution produces one or more [JSON artifacts](https://docs.getdbt.com
## Use Airflow connections as dbt targets (without a profiles.yml)
-[Airflow connections](https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html) allow users to manage and store connection information, such as hostname, port, user name, and password, for operators to use when accessing certain applications, like databases. Similarly, a *dbt*`profiles.yml` file stores connection information under each target key. *airflow-dbt-python* bridges the gap between the two and allows you to use connection information stored as an Airflow connection by specifying the connection id as the `target` parameter of any of the *dbt* operators it provides. What's more, if using an Airflow connection, the `profiles.yml` file may be entirely omitted (although keep in mind a `profiles.yml` file contains a configuration block besides target connection information).
+[Airflow connections](https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html) allow users to manage and store connection information, such as hostname, port, username, and password, for operators to use when accessing certain applications, like databases. Similarly, a *dbt* `profiles.yml` file stores connection information under each target key. *airflow-dbt-python* bridges the gap between the two and allows you to use connection information stored as an Airflow connection by specifying the connection id as the `target` parameter of any of the *dbt* operators it provides. What's more, if using an Airflow connection, the `profiles.yml` file may be entirely omitted (although keep in mind a `profiles.yml` file contains a configuration block besides target connection information).
See an example DAG [here](examples/airflow_connection_target_dag.py).
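As a minimal sketch, assuming an Airflow connection with id `my_db_connection` has already been created (the connection id and project path here are hypothetical):

```python
from airflow_dbt_python.operators.dbt import DbtRunOperator

# Setting `target` to an Airflow connection id means dbt reads its
# connection details from Airflow, so a profiles.yml may be omitted.
dbt_run = DbtRunOperator(
    task_id="dbt_run",
    project_dir="/path/to/dbt/project/",  # hypothetical local path
    target="my_db_connection",            # id of an existing Airflow connection
)
```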
@@ -164,7 +164,7 @@ Currently, the following *dbt* commands are supported:
## Examples
-All example DAGs are tested against against the latest Airflow version. Some changes, like modifying `import` statements or changing types, may be required for them to work in other versions.
+All example DAGs are tested against the latest Airflow version. Some changes, like modifying `import` statements or changing types, may be required for them to work in other versions.
```python
import datetime as dt
@@ -227,4 +227,4 @@ poetry run pytest tests/ -vv
# License
-This project is licensed under the MIT license. See .
+This project is licensed under the MIT license. See [LICENSE](LICENSE).