
[DOCS-13621] Rename Amazon S3 destination to Datadog Archives#35121

Open
maycmlee wants to merge 2 commits into master from
may/amazon-s3-dd-archives-generic
Conversation

Contributor

@maycmlee maycmlee commented Mar 9, 2026

What does this PR do? What is the motivation?

Renames the Amazon S3 destination page to Datadog Archives destination. Updates all references, nav, and links.

Merge instructions

Merge readiness:

Do not merge.
Merge when generic S3 is available.

Additional notes

Image updates (amazon_s3_destination.png, amazon_s3_archive.png, amazon_s3_prefix_20250709.png) are a follow-up once updated screenshots are available.

@maycmlee maycmlee requested a review from a team as a code owner March 9, 2026 19:11
@maycmlee maycmlee added the WORK IN PROGRESS No review needed, it's a wip ;) label Mar 9, 2026
@github-actions github-actions bot added the Architecture Everything related to the Doc backend label Mar 9, 2026
{{< product-availability >}}

- Use the Amazon S3 destination to send logs to Amazon S3. If you want to send logs to Amazon S3 for [archiving][1] and [rehydration][2], you must [configure Log Archives](#configure-log-archives). If you don't want to rehydrate your logs in Datadog, skip to [Set up the destination for your pipeline](#set-up-the-destination-for-your-pipeline).
+ Use the Datadog Archives destination to send logs to Amazon S3. If you want to send logs to Amazon S3 for [archiving][1] and [rehydration][2], you must [configure Log Archives](#configure-log-archives). If you don't want to rehydrate your logs in Datadog, skip to [Set up the destination for your pipeline](#set-up-the-destination-for-your-pipeline).
Contributor Author

If users do not want to rehydrate logs, we can update this to point them to the new generic S3 docs when they're available.


- You can route logs from Observability Pipelines to Snowflake using the Amazon S3 destination by configuring Snowpipe in Snowflake to automatically ingest those logs. To set this up:
+ You can route logs from Observability Pipelines to Snowflake using the Datadog Archives destination by configuring Snowpipe in Snowflake to automatically ingest those logs. To set this up:
1. Configure [Log Archives](#configure-log-archives) if you want to [archive][1] and [rehydrate][2] your logs. If you only want to send logs to Amazon S3, skip to step 2.
Contributor Author

Same here: if users don't want to archive and rehydrate, they can use the generic S3 destination when it's available.
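
For context on the Snowpipe step referenced above, the auto-ingest setup on the Snowflake side could look roughly like the following sketch. All names here (bucket, prefix, role ARN, integration, stage, table, and pipe names) are hypothetical placeholders, not values from this PR or the docs:

```sql
-- Allow Snowflake to read from the S3 bucket the destination writes to
-- (role ARN and bucket are placeholders).
CREATE STORAGE INTEGRATION op_logs_s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-s3-role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://example-op-logs-bucket/logs/');

-- Stage pointing at the bucket/prefix the pipeline delivers logs to.
CREATE STAGE op_logs_stage
  URL = 's3://example-op-logs-bucket/logs/'
  STORAGE_INTEGRATION = op_logs_s3_int;

-- Landing table for raw log records.
CREATE TABLE raw_logs (record VARIANT);

-- Pipe that auto-ingests new objects as they arrive in the stage.
CREATE PIPE op_logs_pipe
  AUTO_INGEST = TRUE
  AS COPY INTO raw_logs
     FROM @op_logs_stage
     FILE_FORMAT = (TYPE = 'JSON');
```

With AUTO_INGEST enabled, Snowpipe relies on S3 event notifications to pick up new log files, so no manual COPY commands are needed after setup.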

@maycmlee maycmlee removed the WORK IN PROGRESS No review needed, it's a wip ;) label Mar 9, 2026