
Commit 0153f4a

blackbass64 authored and tomasfarias committed
fix: Typo and code format
1 parent 2aabf0c commit 0153f4a

File tree

1 file changed: +4 −4 lines changed


README.md

Lines changed: 4 additions & 4 deletions
@@ -80,7 +80,7 @@ As this integration was completed, several features were developed to **extend t
 
 ## Independent task execution
 
-Airflow executes [Tasks](https://airflow.apache.org/docs/apache-airflow/stable/concepts/tasks.html) independent of one another: even though downstream and upstream dependencies between tasks exist, the execution of an individual task happens entirely independently of any other task execution (see: [Tasks Relationships](https://airflow.apache.org/docs/apache-airflow/stable/concepts/tasks.html#relationships).
+Airflow executes [Tasks](https://airflow.apache.org/docs/apache-airflow/stable/concepts/tasks.html) independent of one another: even though downstream and upstream dependencies between tasks exist, the execution of an individual task happens entirely independently of any other task execution (see: [Tasks Relationships](https://airflow.apache.org/docs/apache-airflow/stable/concepts/tasks.html#relationships)).
 
 In order to work with this constraint, *airflow-dbt-python* runs each dbt command in a **temporary and isolated directory**. Before execution, all the relevant dbt files are copied from supported backends, and after executing the command any artifacts are exported. This ensures dbt can work with any Airflow deployment, including most production deployments as they are usually running [Remote Executors](https://airflow.apache.org/docs/apache-airflow/stable/executor/index.html#executor-types) and do not guarantee any files will be shared by default between tasks, since each task may run in a completely different environment.
 
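The copy-in / execute / export flow that this hunk describes can be sketched in plain Python. This is an illustrative pattern only, not *airflow-dbt-python*'s actual implementation; the `run_command` callable and the `target/*.json` artifact location are assumptions standing in for a real dbt invocation:

```python
import shutil
import tempfile
from pathlib import Path


def run_in_isolated_dir(project_dir: str, run_command) -> dict:
    """Copy a dbt project into a temporary directory, run a command
    there, and export any JSON artifacts produced under target/.

    Illustrative sketch of the isolated-execution pattern; not the
    library's real code.
    """
    with tempfile.TemporaryDirectory() as tmp:
        tmp_project = Path(tmp) / "project"
        # Pull the relevant dbt files in from the source location.
        shutil.copytree(project_dir, tmp_project)

        # Execute the command against the isolated copy, never the
        # original project, so concurrent tasks cannot interfere.
        run_command(tmp_project)

        # Export artifacts before the temporary directory is removed.
        artifacts = {
            p.name: p.read_text()
            for p in (tmp_project / "target").glob("*.json")
        }
    return artifacts
```

Because everything dbt reads or writes lives inside the temporary directory, the pattern works even when each task runs on a different worker with no shared filesystem.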
@@ -109,7 +109,7 @@ Each dbt execution produces one or more [JSON artifacts](https://docs.getdbt.com
 
 ## Use Airflow connections as dbt targets (without a profiles.yml)
 
-[Airflow connections](https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html) allow users to manage and store connection information, such as hostname, port, user name, and password, for operators to use when accessing certain applications, like databases. Similarly, a *dbt* `profiles.yml` file stores connection information under each target key. *airflow-dbt-python* bridges the gap between the two and allows you to use connection information stored as an Airflow connection by specifying the connection id as the `target` parameter of any of the *dbt* operators it provides. What's more, if using an Airflow connection, the `profiles.yml` file may be entirely omitted (although keep in mind a `profiles.yml` file contains a configuration block besides target connection information).
+[Airflow connections](https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html) allow users to manage and store connection information, such as hostname, port, username, and password, for operators to use when accessing certain applications, like databases. Similarly, a *dbt* `profiles.yml` file stores connection information under each target key. *airflow-dbt-python* bridges the gap between the two and allows you to use connection information stored as an Airflow connection by specifying the connection id as the `target` parameter of any of the *dbt* operators it provides. What's more, if using an Airflow connection, the `profiles.yml` file may be entirely omitted (although keep in mind a `profiles.yml` file contains a configuration block besides target connection information).
 
 See an example DAG [here](examples/airflow_connection_target_dag.py).
 
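Conceptually, the bridge maps Airflow connection fields onto the keys a dbt target carries in `profiles.yml`. A minimal sketch of that mapping (illustrative only: the input keys follow Airflow's connection model, the output keys follow a typical Postgres target, and the library's real translation logic may differ):

```python
def connection_to_dbt_target(conn: dict) -> dict:
    """Map an Airflow-style connection dict onto the keys a dbt
    Postgres target would hold in profiles.yml.

    Hypothetical helper for illustration; key names are assumptions.
    """
    return {
        "type": conn["conn_type"],       # e.g. "postgres"
        "host": conn["host"],
        "port": conn.get("port", 5432),  # fall back to the default port
        "user": conn["login"],
        "password": conn["password"],
        "dbname": conn["schema"],        # Airflow's "schema" field often
                                         # holds the database name
    }
```

The point of the feature is that this translation happens for you: passing the connection id as `target` lets the operator resolve credentials at runtime, so secrets stay in Airflow's connection store instead of a `profiles.yml` checked into the project.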
@@ -164,7 +164,7 @@ Currently, the following *dbt* commands are supported:
 
 ## Examples
 
-All example DAGs are tested against against the latest Airflow version. Some changes, like modifying `import` statements or changing types, may be required for them to work in other versions.
+All example DAGs are tested against the latest Airflow version. Some changes, like modifying `import` statements or changing types, may be required for them to work in other versions.
 
 ``` python
 import datetime as dt
@@ -227,4 +227,4 @@ poetry run pytest tests/ -vv
 
 # License
 
-This project is licensed under the MIT license. See ![LICENSE](LICENSE).
+This project is licensed under the MIT license. See [LICENSE](LICENSE).

0 commit comments
