Currently loads the following data:
| Table | Contains |
|---|---|
| `users` | Items of the users model that are users |
| `service_accounts` | Items of the users model that are service accounts |
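
Since these tables come from individual dlt resources, a single one can be loaded on its own. Below is a minimal sketch using dlt's generic `with_resources()` selection; the resource names are an assumption based on the table names above, and the pipeline setup mirrors the usage example further down in this README:

```python
import dlt

from dlt_source_airtable import source as airtable_source

# Load only the `users` table; the resource name is assumed to match the table name above.
pipeline = dlt.pipeline(pipeline_name="airtable_users_only", destination="duckdb")
pipeline.run(airtable_source("ent...").with_resources("users"))
```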
The official verified source has a few drawbacks:
- On usage of the verified source, a copy of the current state of the `dlt-hub/verified-sources` repository is copied into your project. Once you make changes to it, it effectively becomes a fork, making it hard to update after the fact.
- It makes use of a preexisting client implementation which uses Pydantic models.
Create a `.dlt/secrets.toml` with your API token:

```toml
airtable_token = "pat..."
```

You can create this token here or as a Service Admin Account (preferred).

The scopes needed are as follows:

- `enterprise.user:read`
- `enterprise.account:read`
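
If you prefer not to keep the token in a file, dlt also resolves the same secret from an environment variable. A minimal sketch; the `AIRTABLE_TOKEN` variable name is the one used in the devenv section at the end of this README:

```python
import os

# Equivalent to setting `airtable_token` in .dlt/secrets.toml:
# dlt's environment provider resolves the top-level secret `airtable_token`
# from the AIRTABLE_TOKEN environment variable.
os.environ["AIRTABLE_TOKEN"] = "pat..."
```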
and then run the default source with optional list references:

```python
import dlt

from dlt_source_airtable import source as airtable_source

pipeline = dlt.pipeline(
    pipeline_name="airtable_pipeline",
    destination="duckdb",
    dev_mode=True,
)

# Your enterprise account ID (see below for where to find it)
enterprise_id = "ent..."

airtable_data = airtable_source(enterprise_id)
pipeline.run(airtable_data)
```

Navigate to your admin view and you will see `Account ID: ent...` in the sidebar and/or URL bar of your browser.
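
To sanity-check what the run produced, you can query the DuckDB destination straight from the pipeline. A minimal sketch, assuming the run above completed and that the table names match the ones listed at the top of this README:

```python
import dlt

# Re-attach to the pipeline created above (or simply reuse the `pipeline` variable).
pipeline = dlt.attach(pipeline_name="airtable_pipeline")

# Count rows per table created by the source.
with pipeline.sql_client() as client:
    for table in ("users", "service_accounts"):
        with client.execute_query(f"SELECT count(*) FROM {table}") as cursor:
            print(table, cursor.fetchall()[0][0])
```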
This project is using devenv.
Commands:
| Command | What does it do? |
|---|---|
| `format` | Formats & lints all code |
| `sample-pipeline-run` | Runs the sample pipeline. |
| `sample-pipeline-show` | Starts the streamlit-based dlt hub |
To run the sample pipeline, pass your token via the environment:

```shell
AIRTABLE_TOKEN=[pat...] \
sample-pipeline-run
```

Alternatively, you can also create a `.dlt/secrets.toml` (excluded from git) with the following content:

```toml
airtable_token = "pat..."
```