-->

# Apache Airflow Python Client

# Overview

To facilitate management, Apache Airflow supports a range of REST API endpoints across its objects.

Most of the endpoints accept `JSON` as input and return `JSON` responses.
This means that you must usually add the following headers to your request:

```
Content-type: application/json
Accept: application/json
```
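
For illustration, here is a minimal sketch of sending these headers with the `requests` library; the
host and basic-auth credentials below are placeholders, not defaults:

```python
import requests

# Placeholder host and credentials - adjust for your deployment.
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json",
}
response = requests.get(
    "http://localhost:8080/api/v1/connections",
    headers=headers,
    auth=("admin", "admin"),
)
print(response.status_code)
print(response.json())
```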

## CRUD Operations

The platform supports **Create**, **Read**, **Update**, and **Delete** operations on most resources.
You can review the standards for these operations and their standard parameters below.

Some endpoints have special behavior as exceptions.

The response usually returns a `200 OK` response code upon success, with an object containing a list
of resources' metadata in the response body.

When reading resources, some common query parameters are usually available, e.g.:

```
v1/connections?limit=25&offset=25
```
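
For example, a rough sketch of paging through connections with placeholder host and credentials:

```python
import requests

# Placeholder host and credentials - adjust for your deployment.
BASE_URL = "http://localhost:8080/api/v1"
AUTH = ("admin", "admin")

limit, offset = 25, 0
while True:
    # Fetch one page of connections using the limit/offset query parameters.
    page = requests.get(
        f"{BASE_URL}/connections",
        params={"limit": limit, "offset": offset},
        auth=AUTH,
    ).json()
    connections = page.get("connections", [])
    for conn in connections:
        print(conn["connection_id"])
    if len(connections) < limit:
        break
    offset += limit
```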

### Delete

Deleting a resource requires the resource `id` and is typically executed via an HTTP `DELETE` request.
The response usually returns a `204 No Content` response code upon success.
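
As a rough illustration with placeholder values, deleting a connection and checking for the expected
status code could look like this:

```python
import requests

# Placeholder host, credentials and connection id - adjust for your deployment.
response = requests.delete(
    "http://localhost:8080/api/v1/connections/my_connection_id",
    auth=("admin", "admin"),
)
# A successful delete returns 204 No Content with an empty body.
assert response.status_code == 204
```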

## Conventions

- Names are consistent between URL parameter name and field name.

- Field names are in snake_case.

```json
{
  "name": "string",
  "slots": 0,
  "occupied_slots": 0,
  "used_slots": 0,
  "queued_slots": 0,
  "open_slots": 0
}
```

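As a small illustration with placeholder host and credentials, these snake_case field names are what
you read directly from a decoded response, for example for the default pool:

```python
import requests

# Placeholder host and credentials - adjust for your deployment.
pool = requests.get(
    "http://localhost:8080/api/v1/pools/default_pool",
    auth=("admin", "admin"),
    headers={"Accept": "application/json"},
).json()

# Field names in the payload are snake_case, exactly as shown above.
print(pool["name"], pool["slots"], pool["open_slots"])
```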

### Update Mask

Update endpoints accept an `update_mask` query parameter that specifies which fields should be changed.
The update request ignores any fields that aren't specified in the field mask, leaving them with
their current values.

Example:

```python
import json

import requests

resource = requests.get("/resource/my-id").json()
resource["my_field"] = "new-value"
requests.patch("/resource/my-id?update_mask=my_field", data=json.dumps(resource))
```

## Versioning and Endpoint Lifecycle

## Trying the API

You can use a third party client such as [curl](https://curl.haxx.se/), [Postman](https://www.postman.com/)
or [Insomnia](https://insomnia.rest/) to test the Apache Airflow API.

Note that you will need to pass credentials data.

For example, here is how to pause a DAG with [curl](https://curl.haxx.se/), when basic authorization is used:

```bash
curl -X PATCH 'https://example.com/api/v1/dags/{dag_id}?update_mask=is_paused' \
-H 'Content-Type: application/json' \
--user "username:password" \
-d '{"is_paused": true}'
```

Using a graphical tool such as [Postman](https://www.postman.com/) or [Insomnia](https://insomnia.rest/),
it is possible to import the API specifications directly:

1. Download the API specification by clicking the **Download** button at the top of this document.
2. Import the JSON specification in the graphical tool of your choice.

   - In *Postman*, you can click the **import** button at the top
   - With *Insomnia*, you can just drag-and-drop the file on the UI

If you want to check which auth backend is currently set, you can use the
`airflow config get-value api auth_backends` command as in the example below.

```bash
$ airflow config get-value api auth_backends
airflow.api.auth.backend.basic_auth
```

The default is to deny all requests.

For details on configuring the authentication, see
[API Authorization](https://airflow.apache.org/docs/apache-airflow/stable/security/api.html).

A `500 Internal Server Error` response means that the server encountered an unexpected condition
that prevented it from fulfilling the request.

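When you use the generated Python client described below, error responses like these surface as
`ApiException`. A minimal sketch of handling one, assuming a locally running webserver with basic
auth enabled and placeholder credentials:

```python
import airflow_client.client as client
from airflow_client.client.api import dag_api

# Placeholder host and credentials - adjust for your deployment.
configuration = client.Configuration(
    host="http://localhost:8080/api/v1", username="admin", password="admin"
)

with client.ApiClient(configuration) as api_client:
    api_instance = dag_api.DAGApi(api_client)
    try:
        api_instance.get_dag("a_dag_id_that_does_not_exist")
    except client.ApiException as e:
        # The exception carries the HTTP status code, reason and response body.
        print(e.status, e.reason)
```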

This Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:

- API version: 2.9.0
- Package version: 2.9.0
- Build package: org.openapitools.codegen.languages.PythonClientCodegen

For more information, please visit [https://airflow.apache.org](https://airflow.apache.org)

## Requirements

Python >=3.8

## Installation & Usage

### pip install

You can install the client using standard Python installation tools. It is hosted
on PyPI under the `apache-airflow-client` package name, so the easiest way to get the latest
version is to run:

```bash
pip install apache-airflow-client
```

If the Python package is hosted on a repository, you can install it directly using:

```bash
pip install git+https://github.com/apache/airflow-client-python.git
```

### Import check

Then import the package:

```python
import airflow_client.client
```
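
If you want a quick check of the installed client version, the generated package also exposes a
version attribute (this assumes the generator's standard `__version__` attribute is present):

```python
import airflow_client.client

# The OpenAPI generator normally writes the package version into the client module.
print(airflow_client.client.__version__)
```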

Please follow the [installation procedure](#installation--usage) and then run the following:

```python
import time
import airflow_client.client as client
from pprint import pprint
from airflow_client.client.api import config_api
from airflow_client.client.model.config import Config
from airflow_client.client.model.error import Error

# Defining the host is optional and defaults to /api/v1
# See configuration.py for a list of all supported configuration parameters.
configuration = client.Configuration(host="/api/v1")

# The client must configure the authentication and authorization parameters
# in accordance with the API server security policy.
# Examples for each auth method are provided below, use the example that
# satisfies your auth use case.

# Configure HTTP basic authorization: Basic
configuration = client.Configuration(username="YOUR_USERNAME", password="YOUR_PASSWORD")


# Enter a context with an instance of the API client
with client.ApiClient(configuration) as api_client:
    # Create an instance of the API class
    api_instance = config_api.ConfigApi(api_client)

    try:
        # Get current configuration
        api_response = api_instance.get_config()
        pprint(api_response)
    except client.ApiException as e:
        print("Exception when calling ConfigApi->get_config: %s\n" % e)
```

## Documentation for API Endpoints

All URIs are relative to */api/v1*

Class | Method | HTTP request | Description
------------ | ------------- | ------------- | -------------
*ConfigApi* | [**get_config**](docs/ConfigApi.md#get_config) | **GET** /config | Get current configuration
*ConnectionApi* | [**delete_connection**](docs/ConnectionApi.md#delete_connection) | **DELETE** /connections/{connection_id} | Delete a connection
*ConnectionApi* | [**get_connection**](docs/ConnectionApi.md#get_connection) | **GET** /connections/{connection_id} | Get a connection
*ConnectionApi* | [**get_connections**](docs/ConnectionApi.md#get_connections) | **GET** /connections | List connections
*DAGRunApi* | [**get_dag_runs**](docs/DAGRunApi.md#get_dag_runs) | **GET** /dags/{dag_id}/dagRuns | List DAG runs
*DAGRunApi* | [**get_dag_runs_batch**](docs/DAGRunApi.md#get_dag_runs_batch) | **POST** /dags/~/dagRuns/list | List DAG runs (batch)
*DAGRunApi* | [**get_upstream_dataset_events**](docs/DAGRunApi.md#get_upstream_dataset_events) | **GET** /dags/{dag_id}/dagRuns/{dag_run_id}/upstreamDatasetEvents | Get dataset events for a DAG run
*DAGRunApi* | [**post_dag_run**](docs/DAGRunApi.md#post_dag_run) | **POST** /dags/{dag_id}/dagRuns | Trigger a new DAG run
*DAGRunApi* | [**set_dag_run_note**](docs/DAGRunApi.md#set_dag_run_note) | **PATCH** /dags/{dag_id}/dagRuns/{dag_run_id}/setNote | Update the DagRun note.
*DAGRunApi* | [**update_dag_run_state**](docs/DAGRunApi.md#update_dag_run_state) | **PATCH** /dags/{dag_id}/dagRuns/{dag_run_id} | Modify a DAG run
*DagWarningApi* | [**get_dag_warnings**](docs/DagWarningApi.md#get_dag_warnings) | **GET** /dagWarnings | List dag warnings

## Documentation For Models

 - [DAGRun](docs/DAGRun.md)
 - [DAGRunCollection](docs/DAGRunCollection.md)
 - [DAGRunCollectionAllOf](docs/DAGRunCollectionAllOf.md)
 - [DagScheduleDatasetReference](docs/DagScheduleDatasetReference.md)
 - [DagState](docs/DagState.md)
 - [DagWarning](docs/DagWarning.md)
 - [TimeDelta](docs/TimeDelta.md)
 - [Trigger](docs/Trigger.md)
 - [TriggerRule](docs/TriggerRule.md)
 - [UpdateDagRunState](docs/UpdateDagRunState.md)
 - [UpdateTaskInstance](docs/UpdateTaskInstance.md)
 - [UpdateTaskInstancesState](docs/UpdateTaskInstancesState.md)
 - [User](docs/User.md)
 - [UserAllOf](docs/UserAllOf.md)
 - [UserCollection](docs/UserCollection.md)
 - [XComCollectionAllOf](docs/XComCollectionAllOf.md)
 - [XComCollectionItem](docs/XComCollectionItem.md)

## Documentation For Authorization

By default the generated client supports three authentication schemes:

* Basic
* GoogleOpenID
* Kerberos

However, you can generate the client and documentation with your own schemes by adding them in
the security section of the OpenAPI specification. You can do it with the Breeze CLI by adding the
`--security-schemes` option to the `breeze release-management prepare-python-client` command.

## Basic "smoke" tests

You can run basic smoke tests to check that the client is working properly; we provide a simple test script
that uses the API to run the tests. To do that, you need to:

* install the `apache-airflow-client` package as described above
* install the `rich` Python package
* download the [test_python_client.py](test_python_client.py) file
* make sure you have a test Airflow installation running. Do not experiment with your production deployment
* configure your Airflow webserver to enable basic authentication.
  In the `[api]` section of your `airflow.cfg` set:

```ini
[api]
auth_backends = airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth
```

You can also set it by env variable:
`export AIRFLOW__API__AUTH_BACKENDS=airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth`

* configure your Airflow webserver to load example DAGs.
  In the `[core]` section of your `airflow.cfg` set:

```ini
[core]
load_examples = True
```

You can also set it by env variable: `export AIRFLOW__CORE__LOAD_EXAMPLES=True`

* optionally expose the configuration (NOTE: this is a dangerous setting). The script will happily run with
  the default setting, but if you want to see the configuration, you need to expose it.
  In the `[webserver]` section of your `airflow.cfg` set:

```ini
[webserver]
expose_config = True
```

You can also set it by env variable: `export AIRFLOW__WEBSERVER__EXPOSE_CONFIG=True`

* configure your host/ip/user/password in the `test_python_client.py` file

```python
import airflow_client.client

# Configure HTTP basic authorization: Basic
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1", username="admin", password="admin"
)
```

* run the scheduler (or the dag file processor, if you have set up a standalone dag file processor) for a few
  parsing loops (you can pass the `--num-runs` parameter to it or keep it running in the background). The script
  relies on example DAGs being serialized to the DB, and this only happens when the scheduler runs with
  `core/load_examples` set to True.

* run the webserver, reachable at the host/port configured for the test script. Make sure it has had enough
  time to initialize.

Run `python test_python_client.py` and you should see colored output showing attempts to connect and status.

## Notes for Large OpenAPI documents

If the OpenAPI document is large, imports in client.apis and client.models may fail with a
RecursionError indicating the maximum recursion limit has been exceeded. In that case, there are a couple of solutions:

Solution 1:
Use specific imports for apis and models like:

- `from airflow_client.client.api.default_api import DefaultApi`
- `from airflow_client.client.model.pet import Pet`

Solution 2:
Before importing the package, adjust the maximum recursion limit as shown below:

```python
import sys

sys.setrecursionlimit(1500)
import airflow_client.client
from airflow_client.client.apis import *
from airflow_client.client.models import *
```

## Authors

dev@airflow.apache.org