diff --git a/docs/add-datastores/overview-of-a-dfs-datastore.md b/docs/add-datastores/overview-of-a-dfs-datastore.md
index 5f57964e4f..d324845b4b 100644
--- a/docs/add-datastores/overview-of-a-dfs-datastore.md
+++ b/docs/add-datastores/overview-of-a-dfs-datastore.md
@@ -47,7 +47,7 @@ DFS Datastores support the initiation of Profile Operations, allowing users to u
## Containers Overview
-For a more detailed understanding of how Qualytics manages and interacts with containers in DFS Datastores, please refer to the [Containers](../container/container.md) section in our comprehensive user guide.
+For a detailed look at how Qualytics manages and interacts with containers in DFS Datastores, refer to the [Containers](../container/overview.md) section of the user guide.
This section covers topics such as container deletion, field deletion, and the initial profile of a Datastore's containers.
diff --git a/docs/add-datastores/overview-of-a-jdbc-datastore.md b/docs/add-datastores/overview-of-a-jdbc-datastore.md
index d385764cc4..5be07f6dc3 100644
--- a/docs/add-datastores/overview-of-a-jdbc-datastore.md
+++ b/docs/add-datastores/overview-of-a-jdbc-datastore.md
@@ -54,4 +54,4 @@ Qualytics employs weighted histogram analysis during the Catalog operation to in
## Containers Overview
-Containers are fundamental entities representing structured data sets. These containers could manifest as tables in JDBC datastores or as files within DFS datastores. They play a pivotal role in data organization, profiling, and quality checks within the Qualytics application. For a more detailed understanding of how Qualytics manages and interacts with containers in JDBC Datastores, please refer to the [**Containers overview**](../container/container.md) documentation.
\ No newline at end of file
+Containers are fundamental entities representing structured data sets: tables in JDBC datastores or files in DFS datastores. They play a pivotal role in data organization, profiling, and quality checks within the Qualytics application. For a detailed look at how Qualytics manages and interacts with containers in JDBC Datastores, refer to the [**Containers overview**](../container/overview.md) documentation.
\ No newline at end of file
diff --git a/docs/assets/container/containers/actions-light.png b/docs/assets/container/actions-on-container/actions-light.png
similarity index 100%
rename from docs/assets/container/containers/actions-light.png
rename to docs/assets/container/actions-on-container/actions-light.png
diff --git a/docs/assets/container/containers/add-light.png b/docs/assets/container/actions-on-container/add-light.png
similarity index 100%
rename from docs/assets/container/containers/add-light.png
rename to docs/assets/container/actions-on-container/add-light.png
diff --git a/docs/assets/container/containers/run-light.png b/docs/assets/container/actions-on-container/run-light.png
similarity index 100%
rename from docs/assets/container/containers/run-light.png
rename to docs/assets/container/actions-on-container/run-light.png
diff --git a/docs/assets/container/containers/settings-light.png b/docs/assets/container/actions-on-container/settings-light.png
similarity index 100%
rename from docs/assets/container/containers/settings-light.png
rename to docs/assets/container/actions-on-container/settings-light.png
diff --git a/docs/assets/container/computed-field/add-field-dark-4.png b/docs/assets/container/computed-field/add-field-dark-4.png
deleted file mode 100644
index 14ace7c4b8..0000000000
Binary files a/docs/assets/container/computed-field/add-field-dark-4.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/caret-dark-14.png b/docs/assets/container/computed-field/caret-dark-14.png
deleted file mode 100644
index 339eb2015b..0000000000
Binary files a/docs/assets/container/computed-field/caret-dark-14.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/caret-light-14.png b/docs/assets/container/computed-field/caret-light-14.png
deleted file mode 100644
index ee89714dd4..0000000000
Binary files a/docs/assets/container/computed-field/caret-light-14.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/computed-field-dark-3.png b/docs/assets/container/computed-field/computed-field-dark-3.png
deleted file mode 100644
index 1f6096b358..0000000000
Binary files a/docs/assets/container/computed-field/computed-field-dark-3.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/container-dark-2.png b/docs/assets/container/computed-field/container-dark-2.png
deleted file mode 100644
index 02f630cff8..0000000000
Binary files a/docs/assets/container/computed-field/container-dark-2.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/datastore-dark-1.png b/docs/assets/container/computed-field/datastore-dark-1.png
deleted file mode 100644
index 1137e0641d..0000000000
Binary files a/docs/assets/container/computed-field/datastore-dark-1.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/explore-dark.png b/docs/assets/container/computed-field/explore-dark.png
deleted file mode 100644
index d6d477dd39..0000000000
Binary files a/docs/assets/container/computed-field/explore-dark.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/explore-light.png b/docs/assets/container/computed-field/explore-light.png
deleted file mode 100644
index c52ea10813..0000000000
Binary files a/docs/assets/container/computed-field/explore-light.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/field-created-dark-8.png b/docs/assets/container/computed-field/field-created-dark-8.png
deleted file mode 100644
index 8ee1ea8b46..0000000000
Binary files a/docs/assets/container/computed-field/field-created-dark-8.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/field-created-light-8.png b/docs/assets/container/computed-field/field-created-light-8.png
deleted file mode 100644
index 1deb01178f..0000000000
Binary files a/docs/assets/container/computed-field/field-created-light-8.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/fields-dark-1.png b/docs/assets/container/computed-field/fields-dark-1.png
deleted file mode 100644
index a6d14f93ec..0000000000
Binary files a/docs/assets/container/computed-field/fields-dark-1.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/fields-dark.png b/docs/assets/container/computed-field/fields-dark.png
deleted file mode 100644
index 75f32f74d2..0000000000
Binary files a/docs/assets/container/computed-field/fields-dark.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/fields-light.png b/docs/assets/container/computed-field/fields-light.png
deleted file mode 100644
index 702bfc8746..0000000000
Binary files a/docs/assets/container/computed-field/fields-light.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/filter-dark-15.png b/docs/assets/container/computed-field/filter-dark-15.png
deleted file mode 100644
index 58fb8c938d..0000000000
Binary files a/docs/assets/container/computed-field/filter-dark-15.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/filter-light-15.png b/docs/assets/container/computed-field/filter-light-15.png
deleted file mode 100644
index e12359ed6a..0000000000
Binary files a/docs/assets/container/computed-field/filter-light-15.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/fuzzy-field.png b/docs/assets/container/computed-field/fuzzy-field.png
deleted file mode 100644
index e3f7a77019..0000000000
Binary files a/docs/assets/container/computed-field/fuzzy-field.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/hover-dark-11.png b/docs/assets/container/computed-field/hover-dark-11.png
deleted file mode 100644
index bfb3910131..0000000000
Binary files a/docs/assets/container/computed-field/hover-dark-11.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/last-profile-dark.png b/docs/assets/container/computed-field/last-profile-dark.png
deleted file mode 100644
index 1431d716b4..0000000000
Binary files a/docs/assets/container/computed-field/last-profile-dark.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/modal-dark.png b/docs/assets/container/computed-field/modal-dark.png
deleted file mode 100644
index ccd2e06e69..0000000000
Binary files a/docs/assets/container/computed-field/modal-dark.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/modal-light.png b/docs/assets/container/computed-field/modal-light.png
deleted file mode 100644
index a15a637eb1..0000000000
Binary files a/docs/assets/container/computed-field/modal-light.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/plus-dark.png b/docs/assets/container/computed-field/plus-dark.png
deleted file mode 100644
index 7c9f741b5e..0000000000
Binary files a/docs/assets/container/computed-field/plus-dark.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/plus-light.png b/docs/assets/container/computed-field/plus-light.png
deleted file mode 100644
index 420b2ce163..0000000000
Binary files a/docs/assets/container/computed-field/plus-light.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/profile-dark-10.png b/docs/assets/container/computed-field/profile-dark-10.png
deleted file mode 100644
index b805d8f6d5..0000000000
Binary files a/docs/assets/container/computed-field/profile-dark-10.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/profiles-dark.png b/docs/assets/container/computed-field/profiles-dark.png
deleted file mode 100644
index b237eac4f4..0000000000
Binary files a/docs/assets/container/computed-field/profiles-dark.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/profiles-light.png b/docs/assets/container/computed-field/profiles-light.png
deleted file mode 100644
index d5e7b402fd..0000000000
Binary files a/docs/assets/container/computed-field/profiles-light.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/save-dark-6.png b/docs/assets/container/computed-field/save-dark-6.png
deleted file mode 100644
index b320257afb..0000000000
Binary files a/docs/assets/container/computed-field/save-dark-6.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/sort-dark-12.png b/docs/assets/container/computed-field/sort-dark-12.png
deleted file mode 100644
index f7a32c7d51..0000000000
Binary files a/docs/assets/container/computed-field/sort-dark-12.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/sort-dark-13.png b/docs/assets/container/computed-field/sort-dark-13.png
deleted file mode 100644
index fbe07b2ef6..0000000000
Binary files a/docs/assets/container/computed-field/sort-dark-13.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/sort-light-12.png b/docs/assets/container/computed-field/sort-light-12.png
deleted file mode 100644
index a83e62c1e5..0000000000
Binary files a/docs/assets/container/computed-field/sort-light-12.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/sort-light-13.png b/docs/assets/container/computed-field/sort-light-13.png
deleted file mode 100644
index ea0b996a58..0000000000
Binary files a/docs/assets/container/computed-field/sort-light-13.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/specific-dark.png b/docs/assets/container/computed-field/specific-dark.png
deleted file mode 100644
index 2616247948..0000000000
Binary files a/docs/assets/container/computed-field/specific-dark.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/specific-light.png b/docs/assets/container/computed-field/specific-light.png
deleted file mode 100644
index 7070063791..0000000000
Binary files a/docs/assets/container/computed-field/specific-light.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/success-dark-7.png b/docs/assets/container/computed-field/success-dark-7.png
deleted file mode 100644
index 81e94936a7..0000000000
Binary files a/docs/assets/container/computed-field/success-dark-7.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/team-dark.png b/docs/assets/container/computed-field/team-dark.png
deleted file mode 100644
index 41f2c78cd7..0000000000
Binary files a/docs/assets/container/computed-field/team-dark.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/team-light.png b/docs/assets/container/computed-field/team-light.png
deleted file mode 100644
index 19c29182d3..0000000000
Binary files a/docs/assets/container/computed-field/team-light.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/totals-dark-9.png b/docs/assets/container/computed-field/totals-dark-9.png
deleted file mode 100644
index c2c71ee558..0000000000
Binary files a/docs/assets/container/computed-field/totals-dark-9.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/transformation-type-dark-5.png b/docs/assets/container/computed-field/transformation-type-dark-5.png
deleted file mode 100644
index bbaa06e4c5..0000000000
Binary files a/docs/assets/container/computed-field/transformation-type-dark-5.png and /dev/null differ
diff --git a/docs/assets/container/computed-field/add-field-light-4.png b/docs/assets/container/computed-fields/add-computed-fields/add-field-light-4.png
similarity index 100%
rename from docs/assets/container/computed-field/add-field-light-4.png
rename to docs/assets/container/computed-fields/add-computed-fields/add-field-light-4.png
diff --git a/docs/assets/container/computed-field/computed-field-light-3.png b/docs/assets/container/computed-fields/add-computed-fields/computed-field-light-3.png
similarity index 100%
rename from docs/assets/container/computed-field/computed-field-light-3.png
rename to docs/assets/container/computed-fields/add-computed-fields/computed-field-light-3.png
diff --git a/docs/assets/container/computed-field/container-light-2.png b/docs/assets/container/computed-fields/add-computed-fields/container-light-2.png
similarity index 100%
rename from docs/assets/container/computed-field/container-light-2.png
rename to docs/assets/container/computed-fields/add-computed-fields/container-light-2.png
diff --git a/docs/assets/container/computed-field/datastore-light-1.png b/docs/assets/container/computed-fields/add-computed-fields/datastore-light-1.png
similarity index 100%
rename from docs/assets/container/computed-field/datastore-light-1.png
rename to docs/assets/container/computed-fields/add-computed-fields/datastore-light-1.png
diff --git a/docs/assets/container/computed-fields/add-computed-fields/field-created-light-8.png b/docs/assets/container/computed-fields/add-computed-fields/field-created-light-8.png
new file mode 100644
index 0000000000..adf4314653
Binary files /dev/null and b/docs/assets/container/computed-fields/add-computed-fields/field-created-light-8.png differ
diff --git a/docs/assets/container/computed-field/fields-light-1.png b/docs/assets/container/computed-fields/add-computed-fields/fields-light-1.png
similarity index 100%
rename from docs/assets/container/computed-field/fields-light-1.png
rename to docs/assets/container/computed-fields/add-computed-fields/fields-light-1.png
diff --git a/docs/assets/container/computed-field/save-light-6.png b/docs/assets/container/computed-fields/add-computed-fields/save-light-6.png
similarity index 100%
rename from docs/assets/container/computed-field/save-light-6.png
rename to docs/assets/container/computed-fields/add-computed-fields/save-light-6.png
diff --git a/docs/assets/container/computed-field/success-light-7.png b/docs/assets/container/computed-fields/add-computed-fields/success-light-7.png
similarity index 100%
rename from docs/assets/container/computed-field/success-light-7.png
rename to docs/assets/container/computed-fields/add-computed-fields/success-light-7.png
diff --git a/docs/assets/container/computed-field/transformation-type-light-5.png b/docs/assets/container/computed-fields/add-computed-fields/transformation-type-light-5.png
similarity index 100%
rename from docs/assets/container/computed-field/transformation-type-light-5.png
rename to docs/assets/container/computed-fields/add-computed-fields/transformation-type-light-5.png
diff --git a/docs/assets/container/computed-field/hover-light-11.png b/docs/assets/container/computed-fields/computed-fields-details/hover-light-11.png
similarity index 100%
rename from docs/assets/container/computed-field/hover-light-11.png
rename to docs/assets/container/computed-fields/computed-fields-details/hover-light-11.png
diff --git a/docs/assets/container/computed-field/last-profile-light.png b/docs/assets/container/computed-fields/computed-fields-details/last-profile-light.png
similarity index 100%
rename from docs/assets/container/computed-field/last-profile-light.png
rename to docs/assets/container/computed-fields/computed-fields-details/last-profile-light.png
diff --git a/docs/assets/container/computed-field/profile-light-10.png b/docs/assets/container/computed-fields/computed-fields-details/profile-light-10.png
similarity index 100%
rename from docs/assets/container/computed-field/profile-light-10.png
rename to docs/assets/container/computed-fields/computed-fields-details/profile-light-10.png
diff --git a/docs/assets/container/computed-field/totals-light-9.png b/docs/assets/container/computed-fields/computed-fields-details/totals-light-9.png
similarity index 100%
rename from docs/assets/container/computed-field/totals-light-9.png
rename to docs/assets/container/computed-fields/computed-fields-details/totals-light-9.png
diff --git a/docs/assets/datastores/add-computed-tables-files/add-compute-file-light.png b/docs/assets/container/computed-tables-and-files/computed-files/add-compute-file-light.png
similarity index 100%
rename from docs/assets/datastores/add-computed-tables-files/add-compute-file-light.png
rename to docs/assets/container/computed-tables-and-files/computed-files/add-compute-file-light.png
diff --git a/docs/assets/datastores/add-computed-tables-files/click-add-file-light.png b/docs/assets/container/computed-tables-and-files/computed-files/click-add-file-light.png
similarity index 100%
rename from docs/assets/datastores/add-computed-tables-files/click-add-file-light.png
rename to docs/assets/container/computed-tables-and-files/computed-files/click-add-file-light.png
diff --git a/docs/assets/datastores/add-computed-tables-files/select-computed-file-light.png b/docs/assets/container/computed-tables-and-files/computed-files/select-computed-file-light.png
similarity index 100%
rename from docs/assets/datastores/add-computed-tables-files/select-computed-file-light.png
rename to docs/assets/container/computed-tables-and-files/computed-files/select-computed-file-light.png
diff --git a/docs/assets/datastores/add-computed-tables-files/select-datastore-light.png b/docs/assets/container/computed-tables-and-files/computed-files/select-datastore-light.png
similarity index 100%
rename from docs/assets/datastores/add-computed-tables-files/select-datastore-light.png
rename to docs/assets/container/computed-tables-and-files/computed-files/select-datastore-light.png
diff --git a/docs/assets/datastores/add-computed-tables-files/validate-file-light.png b/docs/assets/container/computed-tables-and-files/computed-files/validate-file-light.png
similarity index 100%
rename from docs/assets/datastores/add-computed-tables-files/validate-file-light.png
rename to docs/assets/container/computed-tables-and-files/computed-files/validate-file-light.png
diff --git a/docs/assets/datastores/add-computed-tables-files/add-computed-table-light.png b/docs/assets/container/computed-tables-and-files/computed-tables/add-computed-table-light.png
similarity index 100%
rename from docs/assets/datastores/add-computed-tables-files/add-computed-table-light.png
rename to docs/assets/container/computed-tables-and-files/computed-tables/add-computed-table-light.png
diff --git a/docs/assets/datastores/add-computed-tables-files/click-add-light.png b/docs/assets/container/computed-tables-and-files/computed-tables/click-add-light.png
similarity index 100%
rename from docs/assets/datastores/add-computed-tables-files/click-add-light.png
rename to docs/assets/container/computed-tables-and-files/computed-tables/click-add-light.png
diff --git a/docs/assets/datastores/add-computed-tables-files/select-computed-table-light.png b/docs/assets/container/computed-tables-and-files/computed-tables/select-computed-table-light.png
similarity index 100%
rename from docs/assets/datastores/add-computed-tables-files/select-computed-table-light.png
rename to docs/assets/container/computed-tables-and-files/computed-tables/select-computed-table-light.png
diff --git a/docs/assets/container/computed-tables-and-files/computed-tables/select-datastore-light.png b/docs/assets/container/computed-tables-and-files/computed-tables/select-datastore-light.png
new file mode 100644
index 0000000000..043c0f927f
Binary files /dev/null and b/docs/assets/container/computed-tables-and-files/computed-tables/select-datastore-light.png differ
diff --git a/docs/assets/datastores/add-computed-tables-files/validate.png b/docs/assets/container/computed-tables-and-files/computed-tables/validate.png
similarity index 100%
rename from docs/assets/datastores/add-computed-tables-files/validate.png
rename to docs/assets/container/computed-tables-and-files/computed-tables/validate.png
diff --git a/docs/assets/datastores/add-computed-tables-files/team-light.png b/docs/assets/container/computed-tables-and-files/overview/team-light.png
similarity index 100%
rename from docs/assets/datastores/add-computed-tables-files/team-light.png
rename to docs/assets/container/computed-tables-and-files/overview/team-light.png
diff --git a/docs/assets/container/containers/anomalies-light.png b/docs/assets/container/container-attributes/anomalies-light.png
similarity index 100%
rename from docs/assets/container/containers/anomalies-light.png
rename to docs/assets/container/container-attributes/anomalies-light.png
diff --git a/docs/assets/container/containers/select-datastore-light.png b/docs/assets/container/container-attributes/select-datastore-light.png
similarity index 100%
rename from docs/assets/container/containers/select-datastore-light.png
rename to docs/assets/container/container-attributes/select-datastore-light.png
diff --git a/docs/assets/container/containers/totals-light.png b/docs/assets/container/container-attributes/totals-light.png
similarity index 100%
rename from docs/assets/container/containers/totals-light.png
rename to docs/assets/container/container-attributes/totals-light.png
diff --git a/docs/assets/container/containers/totalss-light.png b/docs/assets/container/containers/totalss-light.png
deleted file mode 100644
index c3e4f48fd7..0000000000
Binary files a/docs/assets/container/containers/totalss-light.png and /dev/null differ
diff --git a/docs/assets/container/export-metadata/all-option-dark-8.png b/docs/assets/container/export-metadata/all-option-dark-8.png
deleted file mode 100644
index 87c4ffd782..0000000000
Binary files a/docs/assets/container/export-metadata/all-option-dark-8.png and /dev/null differ
diff --git a/docs/assets/container/export-metadata/covid-19-dark-2.png b/docs/assets/container/export-metadata/covid-19-dark-2.png
deleted file mode 100644
index 880e636acc..0000000000
Binary files a/docs/assets/container/export-metadata/covid-19-dark-2.png and /dev/null differ
diff --git a/docs/assets/container/export-metadata/enrichment-dark-10.png b/docs/assets/container/export-metadata/enrichment-dark-10.png
deleted file mode 100644
index 694e31f64f..0000000000
Binary files a/docs/assets/container/export-metadata/enrichment-dark-10.png and /dev/null differ
diff --git a/docs/assets/container/export-metadata/example-dark-12.png b/docs/assets/container/export-metadata/example-dark-12.png
deleted file mode 100644
index 13394a8119..0000000000
Binary files a/docs/assets/container/export-metadata/example-dark-12.png and /dev/null differ
diff --git a/docs/assets/container/export-metadata/export-button-dark-7.png b/docs/assets/container/export-metadata/export-button-dark-7.png
deleted file mode 100644
index ab4e81b8f8..0000000000
Binary files a/docs/assets/container/export-metadata/export-button-dark-7.png and /dev/null differ
diff --git a/docs/assets/container/export-metadata/export-metadata-dark-3.png b/docs/assets/container/export-metadata/export-metadata-dark-3.png
deleted file mode 100644
index de407b2d99..0000000000
Binary files a/docs/assets/container/export-metadata/export-metadata-dark-3.png and /dev/null differ
diff --git a/docs/assets/container/export-metadata/exported-dark-11.png b/docs/assets/container/export-metadata/exported-dark-11.png
deleted file mode 100644
index 1972895915..0000000000
Binary files a/docs/assets/container/export-metadata/exported-dark-11.png and /dev/null differ
diff --git a/docs/assets/container/export-metadata/message-dark-9.png b/docs/assets/container/export-metadata/message-dark-9.png
deleted file mode 100644
index 0e7a08a299..0000000000
Binary files a/docs/assets/container/export-metadata/message-dark-9.png and /dev/null differ
diff --git a/docs/assets/container/export-metadata/modal-window-dark-4.png b/docs/assets/container/export-metadata/modal-window-dark-4.png
deleted file mode 100644
index c42c712040..0000000000
Binary files a/docs/assets/container/export-metadata/modal-window-dark-4.png and /dev/null differ
diff --git a/docs/assets/container/export-metadata/modal-window-dark-5.png b/docs/assets/container/export-metadata/modal-window-dark-5.png
deleted file mode 100644
index 6116cc8a9f..0000000000
Binary files a/docs/assets/container/export-metadata/modal-window-dark-5.png and /dev/null differ
diff --git a/docs/assets/container/export-metadata/next-button-dark-6.png b/docs/assets/container/export-metadata/next-button-dark-6.png
deleted file mode 100644
index c6d7dd511d..0000000000
Binary files a/docs/assets/container/export-metadata/next-button-dark-6.png and /dev/null differ
diff --git a/docs/assets/container/export-metadata/select-source-dark-1.png b/docs/assets/container/export-metadata/select-source-dark-1.png
deleted file mode 100644
index 13e31179f4..0000000000
Binary files a/docs/assets/container/export-metadata/select-source-dark-1.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/assest-dark.png b/docs/assets/container/export-operation/assest-dark.png
deleted file mode 100644
index 7491225af6..0000000000
Binary files a/docs/assets/container/export-operation/assest-dark.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/export-dark.png b/docs/assets/container/export-operation/export-dark.png
deleted file mode 100644
index 55ea97750d..0000000000
Binary files a/docs/assets/container/export-operation/export-dark.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/msg-dark.png b/docs/assets/container/export-operation/msg-dark.png
deleted file mode 100644
index de0a24cc1a..0000000000
Binary files a/docs/assets/container/export-operation/msg-dark.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/name-dark-1.png b/docs/assets/container/export-operation/name-dark-1.png
deleted file mode 100644
index 01c1d268d5..0000000000
Binary files a/docs/assets/container/export-operation/name-dark-1.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/operation-dark.png b/docs/assets/container/export-operation/operation-dark.png
deleted file mode 100644
index f7dd8ff7a9..0000000000
Binary files a/docs/assets/container/export-operation/operation-dark.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/operation-scheduled-dark.png b/docs/assets/container/export-operation/operation-scheduled-dark.png
deleted file mode 100644
index c7056e9871..0000000000
Binary files a/docs/assets/container/export-operation/operation-scheduled-dark.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/profile-dark.png b/docs/assets/container/export-operation/profile-dark.png
deleted file mode 100644
index ff313d9f57..0000000000
Binary files a/docs/assets/container/export-operation/profile-dark.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/profile2-dark.png b/docs/assets/container/export-operation/profile2-dark.png
deleted file mode 100644
index f985bff0ec..0000000000
Binary files a/docs/assets/container/export-operation/profile2-dark.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/review-dark.png b/docs/assets/container/export-operation/review-dark.png
deleted file mode 100644
index 29773815ea..0000000000
Binary files a/docs/assets/container/export-operation/review-dark.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/schedule-dark-1.png b/docs/assets/container/export-operation/schedule-dark-1.png
deleted file mode 100644
index 0914d22b46..0000000000
Binary files a/docs/assets/container/export-operation/schedule-dark-1.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/schedule-dark.png b/docs/assets/container/export-operation/schedule-dark.png
deleted file mode 100644
index 815a88dc4e..0000000000
Binary files a/docs/assets/container/export-operation/schedule-dark.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/select-dark.png b/docs/assets/container/export-operation/select-dark.png
deleted file mode 100644
index d890954446..0000000000
Binary files a/docs/assets/container/export-operation/select-dark.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/snow-dark.png b/docs/assets/container/export-operation/snow-dark.png
deleted file mode 100644
index 9c0c071ce8..0000000000
Binary files a/docs/assets/container/export-operation/snow-dark.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/time-dark-1.png b/docs/assets/container/export-operation/time-dark-1.png
deleted file mode 100644
index 20145a5b19..0000000000
Binary files a/docs/assets/container/export-operation/time-dark-1.png and /dev/null differ
diff --git a/docs/assets/container/export-operation/visible-dark.png b/docs/assets/container/export-operation/visible-dark.png
deleted file mode 100644
index 3476ca7b40..0000000000
Binary files a/docs/assets/container/export-operation/visible-dark.png and /dev/null differ
diff --git a/docs/assets/container/containers/change-light.png b/docs/assets/container/field-profile/change-light.png
similarity index 100%
rename from docs/assets/container/containers/change-light.png
rename to docs/assets/container/field-profile/change-light.png
diff --git a/docs/assets/container/containers/compare-light.png b/docs/assets/container/field-profile/compare-light.png
similarity index 100%
rename from docs/assets/container/containers/compare-light.png
rename to docs/assets/container/field-profile/compare-light.png
diff --git a/docs/assets/container/containers/last-profiled-light.png b/docs/assets/container/field-profile/last-profiled-light.png
similarity index 100%
rename from docs/assets/container/containers/last-profiled-light.png
rename to docs/assets/container/field-profile/last-profiled-light.png
diff --git a/docs/assets/container/containers/metric-chart-light.png b/docs/assets/container/field-profile/metric-chart-light.png
similarity index 100%
rename from docs/assets/container/containers/metric-chart-light.png
rename to docs/assets/container/field-profile/metric-chart-light.png
diff --git a/docs/assets/container/containers/profile-light.png b/docs/assets/container/field-profile/profile-light.png
similarity index 100%
rename from docs/assets/container/containers/profile-light.png
rename to docs/assets/container/field-profile/profile-light.png
diff --git a/docs/assets/container/field-profile/totals-light.png b/docs/assets/container/field-profile/totals-light.png
new file mode 100644
index 0000000000..cd4db5c5f5
Binary files /dev/null and b/docs/assets/container/field-profile/totals-light.png differ
diff --git a/docs/assets/identifiers/general-overview/datastore-light.png b/docs/assets/container/identifiers/general/datastore-light.png
similarity index 100%
rename from docs/assets/identifiers/general-overview/datastore-light.png
rename to docs/assets/container/identifiers/general/datastore-light.png
diff --git a/docs/assets/identifiers/general-overview/excluding-light.png b/docs/assets/container/identifiers/general/excluding-light.png
similarity index 100%
rename from docs/assets/identifiers/general-overview/excluding-light.png
rename to docs/assets/container/identifiers/general/excluding-light.png
diff --git a/docs/assets/identifiers/general-overview/grouping-light.png b/docs/assets/container/identifiers/general/grouping-light.png
similarity index 100%
rename from docs/assets/identifiers/general-overview/grouping-light.png
rename to docs/assets/container/identifiers/general/grouping-light.png
diff --git a/docs/assets/identifiers/general-overview/list-light.png b/docs/assets/container/identifiers/general/list-light.png
similarity index 100%
rename from docs/assets/identifiers/general-overview/list-light.png
rename to docs/assets/container/identifiers/general/list-light.png
diff --git a/docs/assets/identifiers/general-overview/navigation-light.png b/docs/assets/container/identifiers/general/navigation-light.png
similarity index 100%
rename from docs/assets/identifiers/general-overview/navigation-light.png
rename to docs/assets/container/identifiers/general/navigation-light.png
diff --git a/docs/assets/identifiers/general-overview/settings-light.png b/docs/assets/container/identifiers/general/settings-light.png
similarity index 100%
rename from docs/assets/identifiers/general-overview/settings-light.png
rename to docs/assets/container/identifiers/general/settings-light.png
diff --git a/docs/assets/identifiers/general-overview/window-light.png b/docs/assets/container/identifiers/general/window-light.png
similarity index 100%
rename from docs/assets/identifiers/general-overview/window-light.png
rename to docs/assets/container/identifiers/general/window-light.png
diff --git a/docs/assets/identifiers/grouping-overview/datastore-light.png b/docs/assets/container/identifiers/grouping/datastore-light.png
similarity index 100%
rename from docs/assets/identifiers/grouping-overview/datastore-light.png
rename to docs/assets/container/identifiers/grouping/datastore-light.png
diff --git a/docs/assets/identifiers/grouping-overview/list-light.png b/docs/assets/container/identifiers/grouping/list-light.png
similarity index 100%
rename from docs/assets/identifiers/grouping-overview/list-light.png
rename to docs/assets/container/identifiers/grouping/list-light.png
diff --git a/docs/assets/identifiers/grouping-overview/settings-light.png b/docs/assets/container/identifiers/grouping/settings-light.png
similarity index 100%
rename from docs/assets/identifiers/grouping-overview/settings-light.png
rename to docs/assets/container/identifiers/grouping/settings-light.png
diff --git a/docs/assets/identifiers/grouping-overview/table-light.png b/docs/assets/container/identifiers/grouping/table-light.png
similarity index 100%
rename from docs/assets/identifiers/grouping-overview/table-light.png
rename to docs/assets/container/identifiers/grouping/table-light.png
diff --git a/docs/assets/identifiers/grouping-overview/window-light.png b/docs/assets/container/identifiers/grouping/window-light.png
similarity index 100%
rename from docs/assets/identifiers/grouping-overview/window-light.png
rename to docs/assets/container/identifiers/grouping/window-light.png
diff --git a/docs/assets/identifiers/identifiers-overview/datastore-light.png b/docs/assets/container/identifiers/identifiers/datastore-light.png
similarity index 100%
rename from docs/assets/identifiers/identifiers-overview/datastore-light.png
rename to docs/assets/container/identifiers/identifiers/datastore-light.png
diff --git a/docs/assets/identifiers/identifiers-overview/incremental-light.png b/docs/assets/container/identifiers/identifiers/incremental-light.png
similarity index 100%
rename from docs/assets/identifiers/identifiers-overview/incremental-light.png
rename to docs/assets/container/identifiers/identifiers/incremental-light.png
diff --git a/docs/assets/identifiers/identifiers-overview/list-light.png b/docs/assets/container/identifiers/identifiers/list-light.png
similarity index 100%
rename from docs/assets/identifiers/identifiers-overview/list-light.png
rename to docs/assets/container/identifiers/identifiers/list-light.png
diff --git a/docs/assets/identifiers/identifiers-overview/partition-light.png b/docs/assets/container/identifiers/identifiers/partition-light.png
similarity index 100%
rename from docs/assets/identifiers/identifiers-overview/partition-light.png
rename to docs/assets/container/identifiers/identifiers/partition-light.png
diff --git a/docs/assets/identifiers/identifiers-overview/setting-light.png b/docs/assets/container/identifiers/identifiers/setting-light.png
similarity index 100%
rename from docs/assets/identifiers/identifiers-overview/setting-light.png
rename to docs/assets/container/identifiers/identifiers/setting-light.png
diff --git a/docs/assets/identifiers/identifiers-overview/table-menu-light.png b/docs/assets/container/identifiers/identifiers/table-menu-light.png
similarity index 100%
rename from docs/assets/identifiers/identifiers-overview/table-menu-light.png
rename to docs/assets/container/identifiers/identifiers/table-menu-light.png
diff --git a/docs/assets/identifiers/identifiers-overview/window-light.png b/docs/assets/container/identifiers/identifiers/window-light.png
similarity index 100%
rename from docs/assets/identifiers/identifiers-overview/window-light.png
rename to docs/assets/container/identifiers/identifiers/window-light.png
diff --git a/docs/assets/container/manage-tables-files/check-light-14.png b/docs/assets/container/manage-tables-and-files/add-checks/check-light-14.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/check-light-14.png
rename to docs/assets/container/manage-tables-and-files/add-checks/check-light-14.png
diff --git a/docs/assets/container/manage-tables-files/check-light-15.png b/docs/assets/container/manage-tables-and-files/add-checks/check-light-15.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/check-light-15.png
rename to docs/assets/container/manage-tables-and-files/add-checks/check-light-15.png
diff --git a/docs/assets/container/manage-tables-files/delete-btn-light-26.png b/docs/assets/container/manage-tables-and-files/delete/delete-btn-light-26.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/delete-btn-light-26.png
rename to docs/assets/container/manage-tables-and-files/delete/delete-btn-light-26.png
diff --git a/docs/assets/container/manage-tables-files/delete-light-25.png b/docs/assets/container/manage-tables-and-files/delete/delete-light-25.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/delete-light-25.png
rename to docs/assets/container/manage-tables-and-files/delete/delete-light-25.png
diff --git a/docs/assets/container/manage-tables-files/success-light-27.png b/docs/assets/container/manage-tables-and-files/delete/success-light-27.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/success-light-27.png
rename to docs/assets/container/manage-tables-and-files/delete/success-light-27.png
diff --git a/docs/assets/container/manage-tables-files/export-light-23.png b/docs/assets/container/manage-tables-and-files/export/export-light-23.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/export-light-23.png
rename to docs/assets/container/manage-tables-and-files/export/export-light-23.png
diff --git a/docs/assets/container/manage-tables-files/export-light-24.png b/docs/assets/container/manage-tables-and-files/export/export-light-24.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/export-light-24.png
rename to docs/assets/container/manage-tables-and-files/export/export-light-24.png
diff --git a/docs/assets/container/manage-tables-files/fav-msg-light-29.png b/docs/assets/container/manage-tables-and-files/mark-tables-and-files-as-favorite/fav-msg-light-29.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/fav-msg-light-29.png
rename to docs/assets/container/manage-tables-and-files/mark-tables-and-files-as-favorite/fav-msg-light-29.png
diff --git a/docs/assets/container/manage-tables-files/mark-fav-light-28.png b/docs/assets/container/manage-tables-and-files/mark-tables-and-files-as-favorite/mark-fav-light-28.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/mark-fav-light-28.png
rename to docs/assets/container/manage-tables-and-files/mark-tables-and-files-as-favorite/mark-fav-light-28.png
diff --git a/docs/assets/container/manage-tables-files/unmark-fav-light-30.png b/docs/assets/container/manage-tables-and-files/mark-tables-and-files-as-favorite/unmark-fav-light-30.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/unmark-fav-light-30.png
rename to docs/assets/container/manage-tables-and-files/mark-tables-and-files-as-favorite/unmark-fav-light-30.png
diff --git a/docs/assets/container/manage-tables-files/materialize-light-24.png b/docs/assets/container/manage-tables-and-files/materialize/materialize-light-24.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/materialize-light-24.png
rename to docs/assets/container/manage-tables-and-files/materialize/materialize-light-24.png
diff --git a/docs/assets/container/manage-tables-files/materialize-light-30.png b/docs/assets/container/manage-tables-and-files/materialize/materialize-light-30.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/materialize-light-30.png
rename to docs/assets/container/manage-tables-and-files/materialize/materialize-light-30.png
diff --git a/docs/assets/container/manage-tables-files/observability-light-18.png b/docs/assets/container/manage-tables-and-files/observability-settings/observability-light-18.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/observability-light-18.png
rename to docs/assets/container/manage-tables-and-files/observability-settings/observability-light-18.png
diff --git a/docs/assets/container/manage-tables-files/observability-light-19.png b/docs/assets/container/manage-tables-and-files/observability-settings/observability-light-19.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/observability-light-19.png
rename to docs/assets/container/manage-tables-and-files/observability-settings/observability-light-19.png
diff --git a/docs/assets/container/manage-tables-files/save-light-21.png b/docs/assets/container/manage-tables-and-files/observability-settings/save-light-21.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/save-light-21.png
rename to docs/assets/container/manage-tables-and-files/observability-settings/save-light-21.png
diff --git a/docs/assets/container/manage-tables-files/success-light-22.png b/docs/assets/container/manage-tables-and-files/observability-settings/success-light-22.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/success-light-22.png
rename to docs/assets/container/manage-tables-and-files/observability-settings/success-light-22.png
diff --git a/docs/assets/container/manage-tables-files/tracking-light-20.png b/docs/assets/container/manage-tables-and-files/observability-settings/tracking-light-20.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/tracking-light-20.png
rename to docs/assets/container/manage-tables-and-files/observability-settings/tracking-light-20.png
diff --git a/docs/assets/container/manage-tables-files/run-light-16.png b/docs/assets/container/manage-tables-and-files/run/run-light-16.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/run-light-16.png
rename to docs/assets/container/manage-tables-and-files/run/run-light-16.png
diff --git a/docs/assets/container/manage-tables-files/run-light-17.png b/docs/assets/container/manage-tables-and-files/run/run-light-17.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/run-light-17.png
rename to docs/assets/container/manage-tables-and-files/run/run-light-17.png
diff --git a/docs/assets/container/manage-tables-files/excluding-light-10.png b/docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/excluding-light-10.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/excluding-light-10.png
rename to docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/excluding-light-10.png
diff --git a/docs/assets/container/manage-tables-files/field-light-7.png b/docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/field-light-7.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/field-light-7.png
rename to docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/field-light-7.png
diff --git a/docs/assets/container/manage-tables-files/general-light-11.png b/docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/general-light-11.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/general-light-11.png
rename to docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/general-light-11.png
diff --git a/docs/assets/container/manage-tables-files/group-light-9.png b/docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/group-light-9.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/group-light-9.png
rename to docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/group-light-9.png
diff --git a/docs/assets/container/manage-tables-files/incremental-light-6.png b/docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/incremental-light-6.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/incremental-light-6.png
rename to docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/incremental-light-6.png
diff --git a/docs/assets/container/manage-tables-files/partition-light-8.png b/docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/partition-light-8.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/partition-light-8.png
rename to docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/partition-light-8.png
diff --git a/docs/assets/container/manage-tables-files/save-light-12.png b/docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/save-light-12.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/save-light-12.png
rename to docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/save-light-12.png
diff --git a/docs/assets/container/manage-tables-files/setting-light-4.png b/docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/setting-light-4.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/setting-light-4.png
rename to docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/setting-light-4.png
diff --git a/docs/assets/container/manage-tables-files/success-light-13.png b/docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/success-light-13.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/success-light-13.png
rename to docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/success-light-13.png
diff --git a/docs/assets/container/manage-tables-files/table-settings-light-5.png b/docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/table-settings-light-5.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/table-settings-light-5.png
rename to docs/assets/container/manage-tables-and-files/setting-for-jdbc-table/table-settings-light-5.png
diff --git a/docs/assets/container/manage-tables-files/excluding-light-100.png b/docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/excluding-light-100.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/excluding-light-100.png
rename to docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/excluding-light-100.png
diff --git a/docs/assets/container/manage-tables-files/general-light-111.png b/docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/general-light-111.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/general-light-111.png
rename to docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/general-light-111.png
diff --git a/docs/assets/container/manage-tables-files/general-light-112.png b/docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/general-light-112.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/general-light-112.png
rename to docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/general-light-112.png
diff --git a/docs/assets/container/manage-tables-files/general-light-113.png b/docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/general-light-113.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/general-light-113.png
rename to docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/general-light-113.png
diff --git a/docs/assets/container/manage-tables-files/general-light-114.png b/docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/general-light-114.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/general-light-114.png
rename to docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/general-light-114.png
diff --git a/docs/assets/container/manage-tables-files/group-light-99.png b/docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/group-light-99.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/group-light-99.png
rename to docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/group-light-99.png
diff --git a/docs/assets/container/manage-tables-files/save-light-122.png b/docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/save-light-122.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/save-light-122.png
rename to docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/save-light-122.png
diff --git a/docs/assets/container/manage-tables-files/setting-light-44.png b/docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/setting-light-44.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/setting-light-44.png
rename to docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/setting-light-44.png
diff --git a/docs/assets/container/manage-tables-files/success-light-133.png b/docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/success-light-133.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/success-light-133.png
rename to docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/success-light-133.png
diff --git a/docs/assets/container/manage-tables-files/table-settings-light-55.png b/docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/table-settings-light-55.png
similarity index 100%
rename from docs/assets/container/manage-tables-files/table-settings-light-55.png
rename to docs/assets/container/manage-tables-and-files/settings-for-dfs-files-pattern/table-settings-light-55.png
diff --git a/docs/assets/container/manage-tables-files/check-dark-14.png b/docs/assets/container/manage-tables-files/check-dark-14.png
deleted file mode 100644
index 1e6ac97257..0000000000
Binary files a/docs/assets/container/manage-tables-files/check-dark-14.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/check-dark-15.png b/docs/assets/container/manage-tables-files/check-dark-15.png
deleted file mode 100644
index 777660dc46..0000000000
Binary files a/docs/assets/container/manage-tables-files/check-dark-15.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/delete-btn-dark-26.png b/docs/assets/container/manage-tables-files/delete-btn-dark-26.png
deleted file mode 100644
index e5e78253d3..0000000000
Binary files a/docs/assets/container/manage-tables-files/delete-btn-dark-26.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/delete-dark-25.png b/docs/assets/container/manage-tables-files/delete-dark-25.png
deleted file mode 100644
index 1074da52bc..0000000000
Binary files a/docs/assets/container/manage-tables-files/delete-dark-25.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/excluding-dark-10.png b/docs/assets/container/manage-tables-files/excluding-dark-10.png
deleted file mode 100644
index d912da0985..0000000000
Binary files a/docs/assets/container/manage-tables-files/excluding-dark-10.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/excluding-dark-100.png b/docs/assets/container/manage-tables-files/excluding-dark-100.png
deleted file mode 100644
index 54cf1757bc..0000000000
Binary files a/docs/assets/container/manage-tables-files/excluding-dark-100.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/export-dark-23.png b/docs/assets/container/manage-tables-files/export-dark-23.png
deleted file mode 100644
index a7cf956276..0000000000
Binary files a/docs/assets/container/manage-tables-files/export-dark-23.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/export-dark-24.png b/docs/assets/container/manage-tables-files/export-dark-24.png
deleted file mode 100644
index 92c82e4395..0000000000
Binary files a/docs/assets/container/manage-tables-files/export-dark-24.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/fav-msg-dark-29.png b/docs/assets/container/manage-tables-files/fav-msg-dark-29.png
deleted file mode 100644
index f81b286308..0000000000
Binary files a/docs/assets/container/manage-tables-files/fav-msg-dark-29.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/field-dark-7.png b/docs/assets/container/manage-tables-files/field-dark-7.png
deleted file mode 100644
index a6a268419d..0000000000
Binary files a/docs/assets/container/manage-tables-files/field-dark-7.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/general-dark-11.png b/docs/assets/container/manage-tables-files/general-dark-11.png
deleted file mode 100644
index ecd206b245..0000000000
Binary files a/docs/assets/container/manage-tables-files/general-dark-11.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/general-dark-111.png b/docs/assets/container/manage-tables-files/general-dark-111.png
deleted file mode 100644
index 4780c7a413..0000000000
Binary files a/docs/assets/container/manage-tables-files/general-dark-111.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/general-dark-112.png b/docs/assets/container/manage-tables-files/general-dark-112.png
deleted file mode 100644
index 716707ce22..0000000000
Binary files a/docs/assets/container/manage-tables-files/general-dark-112.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/general-dark-113.png b/docs/assets/container/manage-tables-files/general-dark-113.png
deleted file mode 100644
index 6239d25ed9..0000000000
Binary files a/docs/assets/container/manage-tables-files/general-dark-113.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/general-dark-114.png b/docs/assets/container/manage-tables-files/general-dark-114.png
deleted file mode 100644
index f170dbef4d..0000000000
Binary files a/docs/assets/container/manage-tables-files/general-dark-114.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/group-dark-9.png b/docs/assets/container/manage-tables-files/group-dark-9.png
deleted file mode 100644
index 507a06af9d..0000000000
Binary files a/docs/assets/container/manage-tables-files/group-dark-9.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/group-dark-99.png b/docs/assets/container/manage-tables-files/group-dark-99.png
deleted file mode 100644
index b1532c8819..0000000000
Binary files a/docs/assets/container/manage-tables-files/group-dark-99.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/incremental-dark-6.png b/docs/assets/container/manage-tables-files/incremental-dark-6.png
deleted file mode 100644
index 7abec95cf1..0000000000
Binary files a/docs/assets/container/manage-tables-files/incremental-dark-6.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/mark-fav-dark-28.png b/docs/assets/container/manage-tables-files/mark-fav-dark-28.png
deleted file mode 100644
index 8984844d7e..0000000000
Binary files a/docs/assets/container/manage-tables-files/mark-fav-dark-28.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/materialize-dark-24.png b/docs/assets/container/manage-tables-files/materialize-dark-24.png
deleted file mode 100644
index bb82a0720b..0000000000
Binary files a/docs/assets/container/manage-tables-files/materialize-dark-24.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/materialize-dark-30.png b/docs/assets/container/manage-tables-files/materialize-dark-30.png
deleted file mode 100644
index b1c5e62a98..0000000000
Binary files a/docs/assets/container/manage-tables-files/materialize-dark-30.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/observability-dark-18.png b/docs/assets/container/manage-tables-files/observability-dark-18.png
deleted file mode 100644
index 47795432fb..0000000000
Binary files a/docs/assets/container/manage-tables-files/observability-dark-18.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/observability-dark-19.png b/docs/assets/container/manage-tables-files/observability-dark-19.png
deleted file mode 100644
index 48833b2c4e..0000000000
Binary files a/docs/assets/container/manage-tables-files/observability-dark-19.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/partition-dark-8.png b/docs/assets/container/manage-tables-files/partition-dark-8.png
deleted file mode 100644
index aaad56fbc4..0000000000
Binary files a/docs/assets/container/manage-tables-files/partition-dark-8.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/run-dark-16.png b/docs/assets/container/manage-tables-files/run-dark-16.png
deleted file mode 100644
index 4034746fa3..0000000000
Binary files a/docs/assets/container/manage-tables-files/run-dark-16.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/run-dark-17.png b/docs/assets/container/manage-tables-files/run-dark-17.png
deleted file mode 100644
index e73aa211d9..0000000000
Binary files a/docs/assets/container/manage-tables-files/run-dark-17.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/save-dark-12.png b/docs/assets/container/manage-tables-files/save-dark-12.png
deleted file mode 100644
index 1dba32736e..0000000000
Binary files a/docs/assets/container/manage-tables-files/save-dark-12.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/save-dark-122.png b/docs/assets/container/manage-tables-files/save-dark-122.png
deleted file mode 100644
index 2a031e51e8..0000000000
Binary files a/docs/assets/container/manage-tables-files/save-dark-122.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/save-dark-21.png b/docs/assets/container/manage-tables-files/save-dark-21.png
deleted file mode 100644
index 1ebb88b88b..0000000000
Binary files a/docs/assets/container/manage-tables-files/save-dark-21.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/select-dark-1.png b/docs/assets/container/manage-tables-files/select-dark-1.png
deleted file mode 100644
index 89d06762bb..0000000000
Binary files a/docs/assets/container/manage-tables-files/select-dark-1.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/select-light-1.png b/docs/assets/container/manage-tables-files/select-light-1.png
deleted file mode 100644
index bc516ae8b8..0000000000
Binary files a/docs/assets/container/manage-tables-files/select-light-1.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/setting-dark-4.png b/docs/assets/container/manage-tables-files/setting-dark-4.png
deleted file mode 100644
index 96f9504da2..0000000000
Binary files a/docs/assets/container/manage-tables-files/setting-dark-4.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/setting-dark-44.png b/docs/assets/container/manage-tables-files/setting-dark-44.png
deleted file mode 100644
index 851e268cb4..0000000000
Binary files a/docs/assets/container/manage-tables-files/setting-dark-44.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/success-dark-13.png b/docs/assets/container/manage-tables-files/success-dark-13.png
deleted file mode 100644
index a90a3e6b47..0000000000
Binary files a/docs/assets/container/manage-tables-files/success-dark-13.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/success-dark-133.png b/docs/assets/container/manage-tables-files/success-dark-133.png
deleted file mode 100644
index 59b0d23e22..0000000000
Binary files a/docs/assets/container/manage-tables-files/success-dark-133.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/success-dark-22.png b/docs/assets/container/manage-tables-files/success-dark-22.png
deleted file mode 100644
index 5990195ddc..0000000000
Binary files a/docs/assets/container/manage-tables-files/success-dark-22.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/success-dark-27.png b/docs/assets/container/manage-tables-files/success-dark-27.png
deleted file mode 100644
index f42d69ddbb..0000000000
Binary files a/docs/assets/container/manage-tables-files/success-dark-27.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/table-dark-2.png b/docs/assets/container/manage-tables-files/table-dark-2.png
deleted file mode 100644
index c871e6535f..0000000000
Binary files a/docs/assets/container/manage-tables-files/table-dark-2.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/table-dark-3.png b/docs/assets/container/manage-tables-files/table-dark-3.png
deleted file mode 100644
index 8ad9422d27..0000000000
Binary files a/docs/assets/container/manage-tables-files/table-dark-3.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/table-light-2.png b/docs/assets/container/manage-tables-files/table-light-2.png
deleted file mode 100644
index 431ae63a4a..0000000000
Binary files a/docs/assets/container/manage-tables-files/table-light-2.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/table-light-3.png b/docs/assets/container/manage-tables-files/table-light-3.png
deleted file mode 100644
index 2274bdcf31..0000000000
Binary files a/docs/assets/container/manage-tables-files/table-light-3.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/table-settings-dark-5.png b/docs/assets/container/manage-tables-files/table-settings-dark-5.png
deleted file mode 100644
index 9109dfd68b..0000000000
Binary files a/docs/assets/container/manage-tables-files/table-settings-dark-5.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/table-settings-dark-55.png b/docs/assets/container/manage-tables-files/table-settings-dark-55.png
deleted file mode 100644
index 2511651381..0000000000
Binary files a/docs/assets/container/manage-tables-files/table-settings-dark-55.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/tracking-dark-20.png b/docs/assets/container/manage-tables-files/tracking-dark-20.png
deleted file mode 100644
index e2d312c797..0000000000
Binary files a/docs/assets/container/manage-tables-files/tracking-dark-20.png and /dev/null differ
diff --git a/docs/assets/container/manage-tables-files/unmark-fav-dark-30.png b/docs/assets/container/manage-tables-files/unmark-fav-dark-30.png
deleted file mode 100644
index 6e60178de7..0000000000
Binary files a/docs/assets/container/manage-tables-files/unmark-fav-dark-30.png and /dev/null differ
diff --git a/docs/assets/identifiers/identifier-menu-dark.png b/docs/assets/identifiers/identifier-menu-dark.png
deleted file mode 100644
index 31205278b5..0000000000
Binary files a/docs/assets/identifiers/identifier-menu-dark.png and /dev/null differ
diff --git a/docs/assets/identifiers/identifier-menu-light.png b/docs/assets/identifiers/identifier-menu-light.png
deleted file mode 100644
index 7395d4482c..0000000000
Binary files a/docs/assets/identifiers/identifier-menu-light.png and /dev/null differ
diff --git a/docs/assets/identifiers/identifier-screen.png b/docs/assets/identifiers/identifier-screen.png
deleted file mode 100644
index 543a6812c6..0000000000
Binary files a/docs/assets/identifiers/identifier-screen.png and /dev/null differ
diff --git a/docs/assets/identifiers/identifiers-dark.png b/docs/assets/identifiers/identifiers-dark.png
deleted file mode 100644
index ed60bec310..0000000000
Binary files a/docs/assets/identifiers/identifiers-dark.png and /dev/null differ
diff --git a/docs/assets/identifiers/identifiers-light.png b/docs/assets/identifiers/identifiers-light.png
deleted file mode 100644
index 45e110765e..0000000000
Binary files a/docs/assets/identifiers/identifiers-light.png and /dev/null differ
diff --git a/docs/assets/identifiers/identifiers-screen-dark.png b/docs/assets/identifiers/identifiers-screen-dark.png
deleted file mode 100644
index 609d304e2d..0000000000
Binary files a/docs/assets/identifiers/identifiers-screen-dark.png and /dev/null differ
diff --git a/docs/assets/identifiers/identifiers-screen-light.png b/docs/assets/identifiers/identifiers-screen-light.png
deleted file mode 100644
index 358aa2e543..0000000000
Binary files a/docs/assets/identifiers/identifiers-screen-light.png and /dev/null differ
diff --git a/docs/assets/identifiers/identifiers.png b/docs/assets/identifiers/identifiers.png
deleted file mode 100644
index f4f714ed87..0000000000
Binary files a/docs/assets/identifiers/identifiers.png and /dev/null differ
diff --git a/docs/container/actions-on-container.md b/docs/container/actions-on-container.md
new file mode 100644
index 0000000000..9bf5c160c7
--- /dev/null
+++ b/docs/container/actions-on-container.md
@@ -0,0 +1,44 @@
+# Actions on Container
+
+Users can perform various operations on containers to manage datasets effectively. The actions are divided into three main sections: **Settings**, **Add**, and **Run**. Each section contains specific options to perform different tasks.
+
+
+
+## Settings
+
+The **Settings** button allows users to configure the container. By clicking on the **Settings** button, users can access the following options:
+
+
+
+| No | Options | Description |
+| :---- | :---- | :---- |
+| **1.** | Settings | Configure incremental strategy, partitioning fields, and exclude specific fields from analysis. |
+| **2.** | Score | Adjust the decay period and factor weights for metrics like completeness, accuracy, and consistency. <br> **Note:** To understand how each score metric works in detail (completeness, accuracy, consistency, decay period, and weights), please refer to the [Quality Score Page](../quality-scores/what-are-quality-scores.md){target="_blank"}. <br> Score settings modified here apply **only to this container** and do not affect any other container in the datastore. |
+| **3.** | Observability | Enables or disables tracking for data volume and freshness. <br> **Volume Tracking:** Monitors daily volume metrics to identify trends and detect anomalies over time. <br> **Freshness Tracking:** Records the last update timestamp to ensure data timeliness and detect pipeline delays. |
+| **4.** | Migrate | Migrate authored quality checks from one container to another (even across datastores) to quickly reuse, standardize, and avoid recreating rules. |
+| **5.** | Export | Export quality checks, field profiles, and anomalies to an enrichment datastore for further action or analysis. |
+| **6.** | Materialize | Captures snapshots of data from a source datastore and exports it to an enrichment datastore for faster access and analysis. |
+| **7.** | Delete | Delete the selected container from the system. |
+
+## Add
+
+The **Add** button allows users to add checks or computed fields. By clicking on the **Add** button, users can access the following options:
+
+
+
+| No. | Options | Description |
+| :---- | :---- | :---- |
+| **1.** | Checks | Checks allow you to add new checks or validation rules for the container. <br> **Note:** To learn how to add checks, refer to the [Check Templates documentation](../checks/checks-template.md){target="_blank"}. |
+| **2.** | Computed Field | Allows you to add a computed field. <br> **Note:** To learn how to create a computed field, refer to the [Computed Field Guide](../container/computed-fields/add-computed-fields.md){target="_blank"}. |
+
+## Run
+
+The **Run** button provides options to execute operations on datasets, such as profiling, scanning, and external scans. By clicking on the **Run** button, users can access the following options:
+
+
+
+| No. | Options | Description |
+| :---- | :---- | :---- |
+| **1.** | Profile | **Profile** allows you to run a profiling operation to analyze the data structure, gather metadata, set thresholds, and define record limits for comprehensive dataset profiling. <br> **Note:** For the profile operation, please refer to the [Profile Operation documentation](../source-datastore/profile.md){target="_blank"}. |
+| **2.** | Scan | **Scan** allows you to perform data quality checks, configure scan strategies, and detect anomalies in the dataset. <br> **Note:** For the scan operation, please refer to the [Scan Operation documentation](../source-datastore/scan.md){target="_blank"}. |
+| **3.** | External Scan | **External Scan** allows you to upload a file and validate its data against predefined checks in the selected table. <br> **Note:** For external scans, please refer to the [External Scan documentation](../source-datastore/external-scan.md){target="_blank"}. |
\ No newline at end of file
diff --git a/docs/container/computed-fields/add-computed-fields.md b/docs/container/computed-fields/add-computed-fields.md
new file mode 100644
index 0000000000..29696ab8bb
--- /dev/null
+++ b/docs/container/computed-fields/add-computed-fields.md
@@ -0,0 +1,52 @@
+# Add Computed Fields
+
+**Step 1:** Log in to your Qualytics account, navigate to the side menu, and select the **source datastore** where you want to create a computed field.
+
+
+
+**Step 2:** Select the **Container** within the chosen datastore where you want to create the computed field. This container holds the data to which the new computed field will be applied, enabling you to enhance your data analysis within that specific datastore.
+
+For demonstration purposes, we have selected the **Bank Dataset-Staging** source datastore and the **bank_transactions_.csv** container within it to create a computed field.
+
+
+
+**Step 3:** After selecting the container, click on the **Add** button and select **Computed Field** from the dropdown menu to create a new computed field.
+
+
+
+A modal window will appear, allowing you to enter the details for your computed field.
+
+
+
+**Step 4:** Enter the **Name** for the computed field, select **Transformation Type** from the dropdown menu, and optionally add **Additional Metadata**.
+
+| REF. | FIELDS | ACTION |
+|------|--------|--------|
+| 1 | Field Name (Required) | Add a unique name for your computed field. |
+| 2 | Transformation Type (Required) | The type of transformation you want to apply from the available options. |
+| 3 | Additional Metadata (Optional) | Enhance the computed field definition by setting custom metadata. Click the plus icon **(+)** to open the metadata input form and add key-value pairs. |
+
+
+
+!!! info
+ Transformations are changes made to data, like converting formats, doing calculations, or cleaning up fields. In Qualytics, you can use transformations to meet specific needs, such as cleaning entity names, converting formatted numbers, or applying custom expressions. With various transformation types available, Qualytics enables you to customize your data directly within the platform, ensuring it’s accurate and ready for analysis.
+
+| Transformation Types | Purpose | Reference |
+|------|--------|---------|
+| Cleaned Entity Name | Removes business signifiers (such as 'Inc.' or 'Corp') from an entity name. | For more information, please refer to the guide [cleaned entity name section.](../computed-fields/transformation-types.md#cleaned-entity-name){target="_blank"} |
+| Convert Formatted Numeric | Removes formatting (such as parentheses for denoting negatives or commas as delimiters) from values that represent numeric data, converting them into a numerically typed field. | For more information, please refer to the guide [convert formatted numeric section.](../computed-fields/transformation-types.md#convert-formatted-numeric){target="_blank"} |
+| Custom Expression | Allows you to create a new field by applying any valid Spark SQL expression to one or more existing fields. | For more information, please refer to the guide [custom expression section.](../computed-fields/transformation-types.md#custom-expression){target="_blank"} |
+
+
+
+**Step 5:** After selecting the appropriate **Transformation Type**, click the **Save** button.
+
+
+
+**Step 6:** After clicking on the **Save** button, your computed field is created and a success flash message will display saying **The computed field has been successfully created**.
+
+
+
+You can find your computed field by clicking on the dropdown arrow next to the container you selected when creating the computed field.
+
+
\ No newline at end of file
diff --git a/docs/container/computed-fields/computed-fields-details.md b/docs/container/computed-fields/computed-fields-details.md
new file mode 100644
index 0000000000..d78c541495
--- /dev/null
+++ b/docs/container/computed-fields/computed-fields-details.md
@@ -0,0 +1,50 @@
+# Computed Fields Details
+
+Computed Field Details provides a quick overview of the metrics generated from a computed field. The **Totals** section summarizes the results produced by the field, displaying a report that reflects only its specific data output.
+
+### Totals
+
+**1 Quality Score**: This provides a comprehensive assessment of the overall health of the data, factoring in multiple checks for accuracy, consistency, and completeness. A higher score, closer to 100, indicates optimal data quality with minimal issues or errors detected. A lower score may highlight areas that require attention and improvement.
+
+**2 Sampling**: This shows the percentage of data that was evaluated during profiling. A sampling rate of 100% indicates that the entire dataset was analyzed, ensuring a complete and accurate representation of the data’s quality across all records, rather than just a partial sample.
+
+**3 Completeness**: This metric measures how fully the data is populated without missing or null values. A higher completeness percentage means that most fields contain the necessary information, while a lower percentage indicates data gaps that could negatively impact downstream processes or analysis.
+
+**4 Active Checks**: This refers to the number of ongoing quality checks being applied to the dataset. These checks monitor aspects such as format consistency, uniqueness, and logical correctness. Active checks help maintain data integrity and provide real-time alerts about potential issues that may arise.
+
+**5 Active Anomalies**: This tracks the number of anomalies or irregularities detected in the data. These could include outliers, duplicates, or inconsistencies that deviate from expected patterns. A count of zero indicates no anomalies, while a higher count suggests that further investigation is needed to resolve potential data quality issues.
+
+
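To make the completeness metric concrete, here is a minimal sketch of how such a percentage can be derived. This is purely illustrative (the `completeness` helper is hypothetical, not Qualytics' internal implementation):

```python
def completeness(values):
    """Percentage of records in a field that are populated (not null)."""
    populated = sum(1 for v in values if v is not None)
    return 100.0 * populated / len(values)

# A field with one null out of four records is 75% complete.
account_ids = ["A-101", None, "A-103", "A-104"]
print(completeness(account_ids))  # 75.0
```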
+
+### Profile
+
+This provides detailed insights into the characteristics of the field, including its type, distinct values, and length. You can use this information to evaluate the data's uniqueness, length consistency, and complexity.
+
+| **No** | **Profile** | **Description** |
+|--------|-----------------------|---------------------------------------------------------------------------------|
+| 1 | Declared Type | Indicates whether the type is declared by the source or inferred. |
+| 2 | Distinct Values | Count of distinct values observed in the dataset. |
+| 3 | Min Length | Shortest length of the observed string values or lowest value for numerics. |
+| 4 | Max Length | Greatest length of the observed string values or highest value for numerics. |
+| 5 | Mean | Mathematical average of the observed numeric values. |
+| 6 | Median | The median of the observed numeric values. |
+| 7 | Standard Deviation | Measure of the amount of variation in observed numeric values. |
+| 8 | Kurtosis | Measure of the ‘tailedness’ of the distribution of observed numeric values. |
+| 9 | Skewness | Measure of the asymmetry of the distribution of observed numeric values. |
+| 10 | Q1 | The first quartile; the central point between the minimum and the median. |
+| 11 | Q3 | The third quartile; the central point between the median and the maximum. |
+| 12 | Sum | Total sum of all observed numeric values. |
+
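Several of the statistics above can be reproduced with Python's standard library alone. The sketch below is an illustration of how these metrics are defined, not Qualytics' actual profiler; note that quartile values can vary slightly depending on the interpolation method used (`statistics.quantiles` defaults to the exclusive method):

```python
import statistics

values = [10.0, 12.0, 12.0, 15.0, 21.0]

profile = {
    "distinct_values": len(set(values)),         # count of unique values
    "min": min(values),
    "max": max(values),
    "mean": statistics.mean(values),             # arithmetic average
    "median": statistics.median(values),
    "stdev": statistics.stdev(values),           # sample standard deviation
    "q1": statistics.quantiles(values, n=4)[0],  # first quartile
    "q3": statistics.quantiles(values, n=4)[2],  # third quartile
    "sum": sum(values),
}
print(profile)
```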
+
+
+You can hover over the **(i)** button to view the native field properties, which provide detailed information such as the field's type (numeric), size, decimal digits, and whether it allows null values.
+
+
+
+#### Last Profile
+
+The **Last Profile** timestamp helps users understand how up to date the field is. When you hover over the time indicator shown on the right side of the Last Profile label (e.g., "8 months ago"), a tooltip displays the complete date and time the field was last profiled.
+
+
+
+This visibility ensures better context for interpreting profile metrics like mean, completeness, and anomalies.
\ No newline at end of file
diff --git a/docs/container/computed-fields/overview.md b/docs/container/computed-fields/overview.md
new file mode 100644
index 0000000000..5deabbe089
--- /dev/null
+++ b/docs/container/computed-fields/overview.md
@@ -0,0 +1,25 @@
+# Computed Fields
+
+Computed Fields allow you to enhance data analysis by applying dynamic transformations directly to your data. These fields let you create new data points, perform calculations, and customize data views based on your specific needs, ensuring your data is both accurate and actionable.
+
+Let's get started 🚀
+
+## Add Computed Fields
+
+**Step 1:** Log in to your Qualytics account, navigate to the side menu, and select the **source datastore** where you want to add a computed field.
+
+!!! note
+ For next steps please refer to the [add computed field documentation](../computed-fields/add-computed-fields.md){target="_blank"}.
+
+## Computed Fields Details
+
+### Totals
+
+!!! note
+ For more information, please refer to the [computed fields details documentation](../computed-fields/computed-fields-details.md){target="_blank"}.
+
+## Types of Transformations
+
+!!! note
+ For more information, please refer to the [types of transformations](../computed-fields/transformation-types.md){target="_blank"}.
+
\ No newline at end of file
diff --git a/docs/container/computed-fields/transformation-types.md b/docs/container/computed-fields/transformation-types.md
new file mode 100644
index 0000000000..4e547d2684
--- /dev/null
+++ b/docs/container/computed-fields/transformation-types.md
@@ -0,0 +1,78 @@
+# Types of Transformations
+
+## Cleaned Entity Name
+
+This transformation removes common business signifiers from entity names, making your data cleaner and more uniform.
+
+### Options for Cleaned Entity Name
+
+| REF. | FIELDS | ACTIONS |
+|------|--------|---------|
+| 1 | **Drop from Suffix** | Removes specified terms from the end of the entity name. |
+| 2 | **Drop from Prefix** | Removes specified terms from the beginning of the entity name. |
+| 3 | **Drop from Interior** | Removes specified terms from the interior of the entity name. |
+| 4 | **Additional Terms to Drop** (Custom) | Allows you to specify additional terms that should be dropped from the entity name. |
+| 5 | **Terms to Ignore** (Custom) | Designate terms that should be ignored during the cleaning process. |
+
+### Example for Cleaned Entity Name
+
+| **Example** | **Input** | **Transformation** | **Output** |
+|-------------|----------------------------|--------------------------------|--------------------------|
+| 1 | "TechCorp, Inc." | **Drop from Suffix**: "Inc." | "TechCorp" |
+| 2 | "Global Services Ltd." | **Drop from Prefix**: "Global" | "Services Ltd." |
+| 3 | "Central LTD & Finance Co." | **Drop from Interior**: "LTD" | "Central & Finance Co." |
+| 4 | "Eat & Drink LLC" | **Additional Terms to Drop**: "LLC", "&" | "Eat Drink" |
+| 5 | "ProNet Solutions Ltd." | **Terms to Ignore**: "Ltd." | "ProNet Solutions Ltd." |
+
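The behaviour shown in the examples above can be sketched in a few lines of Python. This is only an illustration of the idea (the `clean_entity_name` helper is hypothetical, not the Qualytics implementation):

```python
def clean_entity_name(name, drop_terms, ignore_terms=()):
    """Drop business signifiers from an entity name, preserving ignored terms."""
    kept = []
    for token in name.replace(",", " ").split():
        if token in drop_terms and token not in ignore_terms:
            continue  # drop this signifier
        kept.append(token)
    return " ".join(kept)

print(clean_entity_name("TechCorp, Inc.", drop_terms={"Inc."}))            # TechCorp
print(clean_entity_name("Central LTD & Finance Co.", drop_terms={"LTD"}))  # Central & Finance Co.
```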
+## Convert Formatted Numeric
+
+This transformation converts formatted numeric values into a plain numeric format, stripping out any characters like commas or parentheses that are not numerically significant.
+
+### Example for Convert Formatted Numeric
+
+| **Example** | **Input** | **Transformation** | **Output** |
+|-------------|-------------|--------------------|------------|
+| 1 | "$1,234.56" | **Remove non-numeric characters**: ",", "$" | "1234.56" |
+| 2 | "(2020)" | **Remove non-numeric characters**: "(", ")" | "-2020" |
+| 3 | "100%" | **Remove non-numeric characters**: "%" | "100" |
+
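A rough sketch of this stripping logic in Python, covering the three examples above (illustrative only; this is not how Qualytics performs the transformation internally):

```python
import re

def convert_formatted_numeric(raw):
    """Strip formatting characters, honouring accounting-style negatives."""
    # Parentheses conventionally mark a negative value, e.g. "(2020)" -> -2020.
    negative = raw.startswith("(") and raw.endswith(")")
    digits = re.sub(r"[^0-9.]", "", raw)  # keep only digits and the decimal point
    return "-" + digits if negative else digits

print(convert_formatted_numeric("$1,234.56"))  # 1234.56
print(convert_formatted_numeric("(2020)"))     # -2020
print(convert_formatted_numeric("100%"))       # 100
```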
+## Custom Expression
+
+Enables the creation of a field based on a custom computation using Spark SQL. This is useful for applying complex logic or transformations that are not covered by other types.
+
+### Using Custom Expression
+ You can combine multiple fields, apply conditional logic, or use any valid Spark SQL functions to derive your new computed field.
+
+ **Example**: To create a field that sums two existing fields, you could use the expression `field1 + field2`.
+
+ **Advanced Example**: You need to ensure that a log of leases has no overlapping dates for an asset, but your data only captures a single lease's details like:
+
+| **LeaseID** | **AssetID** | **Lease_Start** | **Lease_End** |
+|-------------|-------------|--------------------|------------|
+| 1 | 42 | 1/1/2025 | 2/1/2026 |
+| 2 | 43 | 1/1/2025 | 2/1/2026 |
+| 3 | 42 | 1/1/2026 | 2/1/2026 |
+| 4 | 43 | 2/2/2026 | 2/1/2027 |
+
+You can see in this example that **Lease 1** has overlapping dates with **Lease 3** for the same Asset. This can be difficult to detect without a full transformation of the data. However, we can accomplish our goal easily with a Computed Field.
+We'll simply add a Computed Field to our table named **"Next_Lease_Start"** and define it with the following custom expression so that our table will now hold the new field and render it as shown below.
+
+`LEAD(Lease_Start, 1) OVER (PARTITION BY AssetID ORDER BY Lease_Start)`
+
+| **LeaseID** | **AssetID** | **Lease_Start** | **Lease_End** | **Next_Lease_Start** |
+|-------------|-------------|--------------------|------------|------------|
+| 1 | 42 | 1/1/2025 | 2/1/2026 | 1/1/2026 |
+| 2 | 43 | 1/1/2025 | 2/1/2026 | 2/2/2026 |
+| 3 | 42 | 1/1/2026 | 2/1/2026 | |
+| 4 | 43 | 2/2/2026 | 2/1/2027 | |
+
+Now you can author a Quality Check stating that **Lease_End** should always be less than **"Next_Lease_Start"** to catch any errors of this type. In fact, Qualytics will automatically infer that check for you at [Level 3 Inference](../../source-datastore/profile.md#levels-of-check-inference){target="_blank"}!
+
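Because `LEAD` is a standard window function, the overlap detection above can be reproduced outside Spark as well. The sketch below uses Python's built-in `sqlite3` module (assuming SQLite 3.25+ for window-function support, and ISO-formatted dates so that text comparison sorts correctly); it is a stand-in for the Spark SQL expression, not how Qualytics evaluates it:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE leases (LeaseID INTEGER, AssetID INTEGER,
                         Lease_Start TEXT, Lease_End TEXT);
    INSERT INTO leases VALUES
        (1, 42, '2025-01-01', '2026-02-01'),
        (2, 43, '2025-01-01', '2026-02-01'),
        (3, 42, '2026-01-01', '2026-02-01'),
        (4, 43, '2026-02-02', '2027-02-01');
""")

rows = conn.execute("""
    SELECT LeaseID, Lease_End,
           LEAD(Lease_Start, 1) OVER (
               PARTITION BY AssetID ORDER BY Lease_Start) AS Next_Lease_Start
    FROM leases ORDER BY LeaseID
""").fetchall()

# A lease overlaps its successor when it ends after the next lease starts.
overlapping = [lease_id for lease_id, end, nxt in rows
               if nxt is not None and end > nxt]
print(overlapping)  # [1]
```

Only Lease 1 is flagged: it ends after Lease 3 begins on the same asset, matching the table above.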
+### More Examples for Custom Expression
+
+| **Example** | **Input Fields** | **Custom Expression** | **Output** |
+|-------------|--------------|-----------------|---------------------|
+| 1 | `field1 = 10`, `field2 = 20` | `field1 + field2` | `30` |
+| 2 | `salary = 50000`, `bonus = 5000` | `salary + bonus` | `55000` |
+| 3 | `hours = 8`, `rate = 15.50` | `hours * rate` | `124` |
+| 4 | `status = 'active'`, `score = 85` | `CASE WHEN status = 'active' THEN score ELSE 0 END` | `85` |
\ No newline at end of file
diff --git a/docs/container/computed-tables-and-files.md b/docs/container/computed-tables-and-files.md
deleted file mode 100644
index f9bd5a4fc3..0000000000
--- a/docs/container/computed-tables-and-files.md
+++ /dev/null
@@ -1,105 +0,0 @@
-# Computed Tables & Files
-
-Computed Tables and Computed Files are powerful virtual tables within the Qualytics platform, each serving distinct purposes in data manipulation. Computed Tables are created using SQL queries on JDBC source datastores, enabling advanced operations like joins and where clauses. Computed Files, derived from Spark SQL transformations on DFS source datastores, allow for efficient data manipulation and transformation directly within the DFS environment.
-
-This guide explains how to add Computed Tables and Computed Files and discusses the differences between them.
-
-Let's get started 🚀
-
-## Computed Tables
-
-Use Computed Tables when you want to perform the following operations on your selected source datastores:
-
-- Data Preparation and Transformation: Clean, shape, and restructure raw data from JDBC source datastores.
-- Complex Calculations and Aggregations: Perform calculations not easily supported by standard containers.
-- Data Subsetting: Extract specific data subsets based on filters using SQL's WHERE clause.
-- Joining Data Across Source Datastores: Combine data from multiple JDBC source datastores using SQL joins.
-
-## Add Computed Tables
-
-**Step 1:** Log in to your Qualytics account and select a JDBC-type source datastore from the side menu on which you would like to add a computed table.
-
-
-
-**Step 2:** After selecting your preferred source datastore, you will be redirected to the source datastore operations page. From this page, click on the **Add** button and select the **Computed Table** option from the dropdown menu.
-
-
-
-**Step 3:** A modal window will appear prompting you to enter a name for your computed table, a valid SQL query that supports your selected source datastore, and optionally, additional metadata.
-
-| REF. | FIELDS | ACTIONS |
-|------|--------|---------|
-| 1. | Name (Required) | Enter a name for your computed table. The name should be descriptive and meaningful to help you easily identify the table later (e.g., add a meaningful name like `Customer_Order_Statistics`). |
-| 2. | Query (Required) | Write a valid SQL query that supports your selected source datastore. The query helps to perform joins and aggregations on your selected source datastore. |
-| 3. | Additional Metadata (Optional) | Add custom metadata to enhance the definition of your computed table. Click the plus icon **(+)** next to this section to open the metadata input form, where you can add key-value pairs. |
-
-
-
-**Step 4:** Click on the **Validate** button to instantly check the syntax and semantics of your SQL query. This ensures your query runs successfully and prevents errors before saving.
-
-
-
-**Step 5:** Once validation is successful, click on the **Save** button to add the computed table to your selected source datastore.
-
-
-
-## Computed Files
-
-Use Computed Files when you want to perform the following operations on your selected source datastore:
-
-- Data Preparation and Transformation: Efficiently clean and restructure raw data stored in a DFS.
-- Column-Level Transformations: Utilize Spark SQL functions to manipulate and clean individual columns.
-- Filtering Data: Extract specific data subsets within a DFS container using Spark SQL's WHERE clause.
-
-## Add Computed Files
-
-**Step 1:** Log in to your Qualytics account and select a DFS-type source datastore from the side menu on which you would like to add a computed file.
-
-
-
-**Step 2:** After clicking on your preferred source datastore, you will be redirected to the source datastore operations page. From this page, click on the **Add** button and select the **Computed File** option from the dropdown menu.
-
-
-
-**Step 3:** A modal window will appear prompting you to enter a name for your computed file, select a source file pattern, choose the expression, and optionally define a filter clause and add additional metadata.
-
-| REF. | FIELDS | ACTION |
-|------|----------------------------|---------------------------------------|
-| 1. | Name (Required) | Enter a name for your computed file. The name should be descriptive and meaningful to help you easily identify the file later (e.g., add a meaningful name like Customer_Order_Statistics). |
-| 2. | Source File Pattern (Required) | Select a source file pattern from the dropdown menu to match files that have a similar naming convention. |
-| 3. | Select Expression (Required) | Select the expression to define the data you want to include in the computed file. |
-| 4. | Filter Clause (Optional) | Add a WHERE clause to filter the data that meets certain conditions. |
-| 5. | Additional Metadata (Optional) | Enhance the computed file definition by setting custom metadata. Click the plus icon **(+)** next to this section to open the metadata input form, where you can add key-value pairs. |
-
-
-
-**Step 4:** Click on the **Validate** button to quickly check your query or expression before saving.
-
-
-
-**Step 5:** Once validation is successful, click on the **Save** button to add the computed file to your selected source datastore.
-
-
-
-After clicking the **Save** button, a success notification appears on the screen showing the action was completed successfully.
-
-## Computed Table Vs. Computed File
-
-| Feature | Computed Table (JDBC) | Computed File (DFS) |
-|---------------------|---------------------------------------|--------------------------------------------|
-| Source Data | JDBC source datastores | DFS source datastores |
-| Query Language | SQL (database-specific functions) | Spark SQL |
-| Supported Operations| Joins, where clauses, and database functions | Column transforms, where clauses (no joins), Spark SQL functions |
-
-!!! note
- Computed tables and files function like regular tables. You can profile them, create checks, and detect anomalies.
-
- - Updating a computed table's query will trigger a profiling operation.
- - Updating a computed file's select or where clause will trigger a profiling operation.
- - When you create a computed table or file, a basic profile of up to 1000 records is automatically generated.
-
-## View Assigned Teams
-
-By hovering over the information icon, users can view the assigned teams for enhanced collaboration and data transparency.
-
-
\ No newline at end of file
diff --git a/docs/container/computed-tables-and-files/computed-files.md b/docs/container/computed-tables-and-files/computed-files.md
new file mode 100644
index 0000000000..84ddd07b58
--- /dev/null
+++ b/docs/container/computed-tables-and-files/computed-files.md
@@ -0,0 +1,39 @@
+# Computed Files
+
+Use Computed Files when you want to perform the following operations on your selected source datastore:
+
+- Data Preparation and Transformation: Efficiently clean and restructure raw data stored in a DFS.
+- Column-Level Transformations: Utilize Spark SQL functions to manipulate and clean individual columns.
+- Filtering Data: Extract specific data subsets within a DFS container using Spark SQL's WHERE clause.
+
+## Add Computed Files
+
+**Step 1:** Log in to your Qualytics account and select a DFS-type source datastore from the side menu on which you would like to add a computed file.
+
+
+
+**Step 2:** After clicking on your preferred source datastore, you will be redirected to the source datastore operations page. From this page, click on the **Add** button and select the **Computed File** option from the dropdown menu.
+
+
+
+**Step 3:** A modal window will appear prompting you to enter a name for your computed file, select a source file pattern, choose the expression, and optionally define a filter clause and add additional metadata.
+
+| REF. | FIELDS | ACTION |
+|------|----------------------------|---------------------------------------|
+| 1 | Name (Required) | Enter a name for your computed file. The name should be descriptive and meaningful to help you easily identify the file later (e.g., add a meaningful name like Customer_Order_Statistics). |
+| 2 | Source File Pattern (Required) | Select a source file pattern from the dropdown menu to match files that have a similar naming convention. |
+| 3 | Select Expression (Required) | Select the expression to define the data you want to include in the computed file. |
+| 4 | Filter Clause (Optional) | Add a WHERE clause to filter the data that meets certain conditions. |
+| 5 | Additional Metadata (Optional) | Enhance the computed file definition by setting custom metadata. Click the plus icon **(+)** next to this section to open the metadata input form, where you can add key-value pairs. |
+
+
+
+**Step 4:** Click on the **Validate** button to quickly check your query or expression before saving.
+
+
+
+**Step 5:** Once validation is successful, click on the **Save** button to add the computed file to your selected source datastore.
+
+
+
+After clicking the **Save** button, a success notification appears on the screen showing the action was completed successfully.
\ No newline at end of file
diff --git a/docs/container/computed-tables-and-files/computed-table-vs-file.md b/docs/container/computed-tables-and-files/computed-table-vs-file.md
new file mode 100644
index 0000000000..c6697bd29b
--- /dev/null
+++ b/docs/container/computed-tables-and-files/computed-table-vs-file.md
@@ -0,0 +1,16 @@
+# Computed Table vs Computed File
+
+Computed Tables and Computed Files both allow you to generate transformed datasets within Qualytics, but they differ in how the output is stored, processed, and consumed. **Computed Tables** produce table-like results inside JDBC datastores and support SQL-based relational operations such as joins. **Computed Files**, on the other hand, generate file-based outputs stored in DFS datastores and support Spark SQL operations suited for large-scale file processing. This comparison helps you choose the right option for your transformation needs.
+
+| **Feature** | **Computed Table (JDBC)** | **Computed File (DFS)** |
+|--------------|----------------------------|---------------------------|
+| Source Data | JDBC source datastores | DFS source datastores |
+| Query Language | SQL (database-specific functions) | Spark SQL |
+| Supported Operations | Joins, where clauses, and database functions | Column transforms, where clauses (no joins), Spark SQL functions |
+
+!!! note
+ Computed tables and files function like regular tables. You can profile them, create checks, and detect anomalies.
+
+ - Updating a computed table’s query will trigger a profiling operation.
+ - Updating a computed file’s select or where clause will trigger a profiling operation.
+ - When you create a computed table or file, a basic profile of up to 1000 records is automatically generated.
\ No newline at end of file
diff --git a/docs/container/computed-tables-and-files/computed-tables.md b/docs/container/computed-tables-and-files/computed-tables.md
new file mode 100644
index 0000000000..d3ff682d4c
--- /dev/null
+++ b/docs/container/computed-tables-and-files/computed-tables.md
@@ -0,0 +1,36 @@
+# Computed Tables
+
+Use Computed Tables when you want to perform the following operations on your selected source datastores:
+
+- Data Preparation and Transformation: Clean, shape, and restructure raw data from JDBC source datastores.
+- Complex Calculations and Aggregations: Perform calculations not easily supported by standard containers.
+- Data Subsetting: Extract specific data subsets based on filters using SQL's WHERE clause.
+- Joining Data Across Source Datastores: Combine data from multiple JDBC source datastores using SQL joins.
+
+## Add Computed Tables
+
+**Step 1:** Log in to your Qualytics account and select a JDBC-type source datastore from the side menu to which you would like to add a computed table.
+
+
+
+**Step 2:** After selecting your preferred source datastore, you will be redirected to the source datastore operations page. From this page, click on the **Add** button and select the **Computed Table** option from the dropdown menu.
+
+
+
+**Step 3:** A modal window will appear, prompting you to enter a name for your computed table, a valid SQL query supported by your selected source datastore, and, optionally, additional metadata.
+
+| REF. | FIELDS | ACTIONS |
+|------|--------|---------|
+| 1 | Name (Required) | Enter a name for your computed table. The name should be descriptive and meaningful so you can easily identify the table later (e.g., `Customer_Order_Statistics`). |
+| 2 | Query (Required) | Write a valid SQL query supported by your selected source datastore. The query can perform joins and aggregations across the datastore's tables. |
+| 3 | Additional Metadata (Optional) | Add custom metadata to enhance the definition of your computed table. Click the plus icon **(+)** next to this section to open the metadata input form, where you can add key-value pairs. |
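
As a sketch of the kind of query a computed table might be defined with, the snippet below runs a join-plus-aggregation using Python's built-in `sqlite3`; the `customers` and `orders` tables and their columns are hypothetical stand-ins for containers in a JDBC datastore:

```python
import sqlite3

# Hypothetical containers in a JDBC source datastore.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (customer_id INTEGER, name TEXT);
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0);
""")

# A join + aggregation query like one a computed table named
# `Customer_Order_Statistics` might be defined with.
query = """
    SELECT c.name,
           COUNT(o.order_id) AS order_count,
           SUM(o.total)      AS total_spent
    FROM customers c
    JOIN orders o ON o.customer_id = c.customer_id
    GROUP BY c.name
    ORDER BY c.name
"""
stats = conn.execute(query).fetchall()
print(stats)  # [('Acme', 2, 350.0), ('Globex', 1, 75.0)]
```

The **Validate** button in the next step performs the equivalent of a dry run of such a query against your datastore's SQL dialect.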
+
+
+
+**Step 4:** Click on the **Validate** button to instantly check the syntax and semantics of your SQL query. This ensures your query runs successfully and prevents errors before saving.
+
+
+
+**Step 5:** Once validation is successful, click on the **Save** button to add the computed table to your selected source datastore.
+
+
\ No newline at end of file
diff --git a/docs/container/computed-tables-and-files/overview.md b/docs/container/computed-tables-and-files/overview.md
new file mode 100644
index 0000000000..bd2438eaa0
--- /dev/null
+++ b/docs/container/computed-tables-and-files/overview.md
@@ -0,0 +1,46 @@
+# Computed Tables & Files
+
+Computed Tables and Computed Files are powerful virtual tables within the Qualytics platform, each serving distinct purposes in data manipulation. Computed Tables are created using SQL queries on JDBC source datastores, enabling advanced operations like joins and where clauses. Computed Files, derived from Spark SQL transformations on DFS source datastores, allow for efficient data manipulation and transformation directly within the DFS environment.
+
+This guide explains how to add Computed Tables and Computed Files and discusses the differences between them.
+
+Let's get started 🚀
+
+## Computed Tables
+
+Use Computed Tables when you want to prepare, transform, aggregate, subset, or join data from your JDBC source datastores.
+
+!!! note
+ For more information, please refer to the [computed tables documentation](../computed-tables-and-files/computed-tables.md){target="_blank"}.
+
+## Add Computed Tables
+
+**Step 1:** Log in to your Qualytics account and select a JDBC-type source datastore from the side menu to which you would like to add a computed table.
+
+!!! note
+ For more information, please refer to the [add computed tables section](../computed-tables-and-files/computed-tables.md#add-computed-tables){target="_blank"}.
+
+## Computed Files
+
+Use Computed Files when you want to transform and filter file-based data from your DFS source datastores using Spark SQL.
+
+!!! note
+ For more information, please refer to the [computed files documentation](../computed-tables-and-files/computed-files.md){target="_blank"}.
+
+## Add Computed Files
+
+**Step 1:** Log in to your Qualytics account and select a DFS-type source datastore from the side menu to which you would like to add a computed file.
+
+!!! note
+ For more information, please refer to the [add computed files section](../computed-tables-and-files/computed-files.md#add-computed-files){target="_blank"}.
+
+## Computed Table vs Computed File
+
+!!! note
+ For more information, please refer to the [Computed Table vs Computed File documentation](../computed-tables-and-files/computed-table-vs-file.md){target="_blank"}.
+
+## View Assigned Teams
+
+By hovering over the information icon, users can view the assigned teams for enhanced collaboration and data transparency.
+
+
\ No newline at end of file
diff --git a/docs/container/container-attributes.md b/docs/container/container-attributes.md
new file mode 100644
index 0000000000..c85366178f
--- /dev/null
+++ b/docs/container/container-attributes.md
@@ -0,0 +1,36 @@
+# Container Attributes
+
+### Totals
+
+!!! note
+ Totals are calculated from sampled data, not the full dataset. Values may differ from actual totals across all records.
+
+**1. Quality Score**: This represents the overall health of the data based on various checks. A higher score indicates better data quality and fewer issues detected.
+
+**2. Sampling**: Displays the percentage of data sampled during profiling. A 100% sampling rate means the entire dataset was analyzed for the quality report.
+
+**3. Completeness**: Indicates the percentage of records that are fully populated without missing or incomplete data. Lower percentages may suggest that some fields have missing values.
+
+**4. Records Profiled**: Shows the number or percentage of records that have been analyzed during the profiling process.
+
+**5. Fields Profiled**: This shows the number of fields or attributes within the dataset that have undergone data profiling, which helps identify potential data issues in specific columns.
+
+**6. Active Checks**: Represents the number of ongoing checks applied to the dataset. These checks monitor data quality, consistency, and correctness.
+
+**7. Active Anomalies**: Displays the total number of anomalies found during the data profiling process. Anomalies can indicate inconsistencies, outliers, or potential data quality issues that need resolution.
+
+
+
+### Observability
+
+**1. Volumetric Measurement**
+
+Volumetric measurement allows users to track the size of data stored within the table over time. This helps in monitoring how the data grows or changes, making it easier to detect sudden spikes that may impact system performance. Users can visualize data volume trends and manage the table's efficiency. This helps in optimizing storage, adjusting resource allocation, and improving query performance based on the size and growth of the table.
+
+
+
+**2. Anomalies Measurement**
+
+The **Anomalies** section helps users track any unusual data patterns or issues within a container. It shows a visual representation of when anomalies occurred over a specific time period, making it easy to spot unusual activity. This allows users to quickly identify when something might have gone wrong and take action to fix it, ensuring the data stays accurate and reliable.
+
+
\ No newline at end of file
diff --git a/docs/container/container-types.md b/docs/container/container-types.md
new file mode 100644
index 0000000000..53cfcc6132
--- /dev/null
+++ b/docs/container/container-types.md
@@ -0,0 +1,11 @@
+# Container Types
+
+There are two main types of containers in Qualytics:
+
+## JDBC Container
+
+JDBC containers are virtual representations of database objects, making it easier to work with data stored in relational databases. These containers include tables, which organize data into rows and columns like a spreadsheet, views that provide customized displays of data from one or more tables, and other database objects such as indexes or stored procedures. Acting as a bridge between applications and databases, JDBC enables seamless interaction with these containers, allowing efficient data management and retrieval.
+
+## DFS Container
+
+DFS containers are used to represent files stored in distributed file systems, such as Hadoop or cloud storage. These files can include formats like CSV, JSON, or Parquet, which are commonly used for storing and organizing data. DFS containers make it easier for applications to work with these files by providing a structured way to access and process data in large-scale storage systems.
diff --git a/docs/container/container.md b/docs/container/container.md
deleted file mode 100644
index fbe51e608a..0000000000
--- a/docs/container/container.md
+++ /dev/null
@@ -1,164 +0,0 @@
-# Containers Overview
-
-Containers are fundamental entities representing structured data sets. These containers could manifest as tables in JDBC datastores or as files within DFS datastores. They play a pivotal role in data organization, profiling, and quality checks within the Qualytics application.
-
-Let’s get started 🚀
-
-## Container Types
-
-There are two main types of containers in Qualytics:
-
-### JDBC Container
-
-JDBC containers are virtual representations of database objects, making it easier to work with data stored in relational databases. These containers include tables, which organize data into rows and columns like a spreadsheet, views that provide customized displays of data from one or more tables, and other database objects such as indexes or stored procedures. Acting as a bridge between applications and databases, JDBC enables seamless interaction with these containers, allowing efficient data management and retrieval.
-
-### DFS Container
-
-DFS containers are used to represent files stored in distributed file systems, such as Hadoop or cloud storage. These files can include formats like CSV, JSON, or Parquet, which are commonly used for storing and organizing data. DFS containers make it easier for applications to work with these files by providing a structured way to access and process data in large-scale storage systems.
-
-## Container Attributes
-
-### Totals
-
-!!! note
- Totals are calculated from sampled data, not the full dataset. Values may differ from actual totals across all records.
-
-1. **Quality Score**: This represents the overall health of the data based on various checks. A higher score indicates better data quality and fewer issues detected.
-
-2. **Sampling**: Displays the percentage of data sampled during profiling. A 100% sampling rate means the entire dataset was analyzed for the quality report.
-
-3. **Completeness**: Indicates the percentage of records that are fully populated without missing or incomplete data. Lower percentages may suggest that some fields have missing values.
-
-4. **Records Profiled**: Shows the number or percentage of records that have been analyzed during the profiling process.
-
-5. **Fields Profiled**: This shows the number of fields or attributes within the dataset that have undergone data profiling, which helps identify potential data issues in specific columns.
-
-6. **Active Checks**: Represents the number of ongoing checks applied to the dataset. These checks monitor data quality, consistency, and correctness.
-
-7. **Active Anomalies**: Displays the total number of anomalies found during the data profiling process. Anomalies can indicate inconsistencies, outliers, or potential data quality issues that need resolution.
-
-
-
-### Observability
-
-**1. Volumetric Measurement**
-
-Volumetric measurement allows users to track the size of data stored within the table over time. This helps in monitoring how the data grows or changes, making it easier to detect sudden spikes that may impact system performance. Users can visualize data volume trends and manage the table's efficiency. This helps in optimizing storage, adjusting resource allocation, and improving query performance based on the size and growth of the computed table.
-
-
-
-**2. Anomalies Measurement**
-
-The **Anomalies** section helps users track any unusual data patterns or issues within the computed tables. It shows a visual representation of when anomalies occurred over a specific time period, making it easy to spot unusual activity. This allows users to quickly identify when something might have gone wrong and take action to fix it, ensuring the data stays accurate and reliable.
-
-
-
-## Actions on Container
-
-Users can perform various operations on containers to manage datasets effectively. The actions are divided into three main sections: **Settings**, **Add**, and **Run**. Each section contains specific options to perform different tasks.
-
-
-
-### Settings
-
-The **Settings** button allows users to configure the container. By clicking on the **Settings** button, users can access the following options:
-
-
-
-| No. | Options | Description |
-| :---- | :---- | :---- |
-| **1.** | Settings | Configure incremental strategy, partitioning fields, and exclude specific fields from analysis. |
-| **2.** | Score | Score allowing you to adjust the decay period and factor weights for metrics like completeness, accuracy, and consistency. |
-| **3.** | Observability | Enables or disables tracking for data volume and freshness.
**Volume Tracking:** Monitors daily volume metrics to identify trends and detect anomalies over time.
**Freshness Tracking:** Records the last update timestamp to ensure data timeliness and detect pipeline delays. |
-| **4.** | Migrate | Migrate authored quality checks from one container to another (even across datastores) to quickly reuse, standardize, and avoid recreating rules. |
-| **5.** | Export | Export quality checks, field profiles, and anomalies to an enrichment datastore for further action or analysis. |
-| **6.** | Materialize | Captures snapshots of data from a source datastore and exports it to an enrichment datastore for faster access and analysis. |
-| **7.** | Delete | Delete the selected container from the system. |
-
-### Add
-
-The **Add** button allows users to add checks or computed fields. By clicking on the **Add** button, users can access the following options:
-
-
-
-| No. | Options | Description |
-| :---- | :---- | :---- |
-| **1.** | Checks | Checks allow you to add new checks or validation rules for the container. |
-| **2.** | Computed Field | Allows you to add a computed field. |
-
-### Run
-
-The **Run** button provides options to execute operations on datasets, such as profiling, scanning, and external scans. By clicking on the **Run** button, users can access the following options:
-
-
-
-| No. | Options | Descriptions |
-| :---- | :---- | :---- |
-| **1.** | Profile | **Profile** allows you to run a profiling operation to analyze the data structure, gather metadata, set thresholds, and define record limits for comprehensive dataset profiling. |
-| **2.** | Scan | **Scan** allows you to perform data quality checks, configure scan strategies, and detect anomalies in the dataset. |
-| **3.** | External Scan | **External Scan** allows you to upload a file and validate its data against predefined checks in the selected table. |
-
-## Field Profiles
-
-After profiling a container, individual field profiles offer granular insights:
-
-### Totals
-
-**1. Quality Score**: This provides a comprehensive assessment of the overall health of the data, factoring in multiple checks for accuracy, consistency, and completeness. A higher score, closer to 100, indicates optimal data quality with minimal issues or errors detected. A lower score may highlight areas that require attention and improvement.
-
-**2. Sampling**: This shows the percentage of data that was evaluated during profiling. A sampling rate of 100% indicates that the entire dataset was analyzed, ensuring a complete and accurate representation of the data’s quality across all records, rather than just a partial sample.
-
-**3. Completeness**: This metric measures how fully the data is populated without missing or null values. A higher completeness percentage means that most fields contain the necessary information, while a lower percentage indicates data gaps that could negatively impact downstream processes or analysis.
-
-**4. Active Checks**: This refers to the number of ongoing quality checks being applied to the dataset. These checks monitor aspects such as format consistency, uniqueness, and logical correctness. Active checks help maintain data integrity and provide real-time alerts about potential issues that may arise.
-
-**5. Active Anomalies**: This tracks the number of anomalies or irregularities detected in the data. These could include outliers, duplicates, or inconsistencies that deviate from expected patterns. A count of zero indicates no anomalies, while a higher count suggests that further investigation is needed to resolve potential data quality issues.
-
-
-
-### Profile
-
-This provides detailed insights into the characteristics of the field, including its type, distinct values, and length. You can use this information to evaluate the data's uniqueness, length consistency, and complexity.
-
-| No | Profile | Description |
-| :---- | :---- | :---- |
-| 1 | Declared Type | Indicates whether the type is declared by the source or inferred. |
-| 2 | Distinct Values | Count of distinct values observed in the dataset. |
-| 3 | Min Length | Shortest length of the observed string values or lowest value for numerics. |
-| 4 | Max Length | Greatest length of the observed string values or highest value for numerics. |
-| 5 | Mean | Mathematical average of the observed numeric values. |
-| 6 | Median | The median of the observed numeric values. |
-| 7 | Standard Deviation | Measure of the amount of variation in observed numeric values. |
-| 8 | Kurtosis | Measure of the ‘tailedness’ of the distribution of observed numeric values. |
-| 9 | Skewness | Measure of the asymmetry of the distribution of observed numeric values. |
-| 10 | Q1 | The first quartile; the central point between the minimum and the median. |
-| 11 | Q3 | The third quartile; the central point between the median and the maximum. |
-| 12 | Sum | Total sum of all observed numeric values. |
-
-
-
-#### Last Profile
-
-The **Last Profile** timestamp helps users understand how up-to-date the field is. When you hover over the time indicator shown on the right side of the Last Profile label (e.g., "1 week ago"), a tooltip displays the complete date and time the field was last profiled.
-
-
-
-This visibility ensures better context for interpreting profile metrics like mean, completeness, and anomalies.
-
-#### Compare Profile
-
-You can compare the current field profile with earlier versions to spot changes over time. Visual indicators highlight modified metrics, interactive charts show numeric trends across profile history, and special badges identify data drift or field type changes.
-
-By clicking on the dropdown under **Compare With**, you can select an earlier profile run (for example, 1 day ago or 5 days ago).
-
-
-
-Once selected, the system highlights differences between profiles, marking metrics as **Changed** or **Unchanged**. It compares data quality **(Sampling, Completeness)** and statistical measures **(mean, median, standard deviation, skewness, kurtosis, min, max, distinct values, etc.)**, making it easy to track shifts in data quality and distribution.
-
-
-
-#### View Metric Chart
-
-You can access detailed metric charts by clicking the **View Metric Chart** button. This will display variations across the last 10 profiles. By hovering over points on the chart, you can see additional details such as profile dates, measured values, and sampling percentages for deeper analysis.
-
-
diff --git a/docs/container/export-operation.md b/docs/container/enrichment-operation/export-operation.md
similarity index 59%
rename from docs/container/export-operation.md
rename to docs/container/enrichment-operation/export-operation.md
index 509127939c..1d67573d22 100644
--- a/docs/container/export-operation.md
+++ b/docs/container/enrichment-operation/export-operation.md
@@ -4,9 +4,9 @@ Qualytics metadata export feature lets you capture the changing states of your d
To keep things organized, the exported files use specific naming patterns:
-* **Anomalies:** Saved as `__anomalies_export`.
-* **Quality Checks:** Saved as `__checks_export`.
-* **Field Profiles:** Saved as `__field_profiles_export`.
+* **Anomalies:** Saved as `__anomalies_export`
+* **Quality Checks:** Saved as `__checks_export`
+* **Field Profiles:** Saved as `__field_profiles_export`
!!! note
Ensure that an enrichment datastore is already set up and properly configured to accommodate the exported data. This setup is essential for exporting anomalies, quality checks, and field profiles successfully.
@@ -15,45 +15,37 @@ Let’s get started 🚀
**Step 1:** Select a source datastore from the side menu from which you would like to export the metadata.
-
-
+
For demonstration purposes, we have selected the **“COVID-19 Data”** Snowflake source datastore.
-
-
+
**Step 2:** After selecting a datastore, a bottom-up menu appears on the right side of the interface. Click **Enrichment Operations** next to the Enrichment Datastore and select **Export**.
-
-
+
-**Step 3:** After clicking **Export**, the **Export Operation** modal window appears, allowing metadata extraction from the selected source datastore to the enrichment datastore.
+**Step 3:** After clicking **Export**, the **Export Operation** modal window appears, allowing you to extract metadata from the selected source datastore to the enrichment datastore.
-
-
+
-**Step 4:** Select the tables you wish to export. **All**, **Specific**, or **Tag** and click **Next** to proceed.
+**Step 4:** Select the tables you wish to export: **All**, **Specific**, or **Tag**, and click **Next** to proceed.
-
-
+
**Step 5:** After clicking **Next**, select the assets you want to export to your Enrichment Datastore: Anomalies, Quality Checks, or Field Profiles, and click **Export** to proceed with the export process.
-
-
+
After clicking **Export**, a confirmation message appears stating **"Export in motion."** In a couple of minutes, the metadata will be available in your Enrichment Datastore.
-
-
+
## Schedule Operation
**Step 1:** Click **Schedule** to configure scheduling options for the Export Operation.
-
-
+
**Step 2:** Configure the scheduling preferences for the Export Operation.
@@ -67,40 +59,33 @@ After clicking **Export**, a confirmation message appears stating **"Export in m
* **Advanced:** Use Cron expressions for custom scheduling. (e.g., `0 12 * * 1-5` runs at 12 PM, Monday to Friday).
-
-
+
**Step 3:** Define the Schedule Name to identify the scheduled Export Operation when it runs.
-
-
+
**Step 4:** Click **Schedule** to finalize and schedule the Export Operation.
-
-
+
-After clicking **Schedule**, a confirmation message appears stating **"Operation Scheduled"**. Go to the Activity tab to see the progress of export operation.
+After clicking **Schedule**, a confirmation message appears stating **"Operation Scheduled"**. Go to the Activity tab to see the progress of the export operation.
-
-
+
## Review Exported Data
**Step 1:** Once the metadata has been exported, navigate to the **“Enrichment Datastores”** located on the left menu.
-
-
+
**Step 2:** In the **“Enrichment Datastores”** section, select the datastore where you exported the metadata. The exported metadata will now be visible in the selected datastore.
-
-
+
**Step 3:** Click on the exported files to view the metadata. For demonstration purposes, we have selected the **“export_field_profiles”** file to review the metadata.
The exported metadata is displayed in a table format, showing key details about the field profiles from the datastore. It typically includes columns that indicate the uniqueness of data, the completeness of the fields, and the data structure. You can use this metadata to check data quality, prepare for analysis, ensure compliance, and manage your data.
-
-
+
\ No newline at end of file
diff --git a/docs/container/materialize-operation.md b/docs/container/enrichment-operation/materialize-operation.md
similarity index 72%
rename from docs/container/materialize-operation.md
rename to docs/container/enrichment-operation/materialize-operation.md
index ce6b3f465e..9e8116439f 100644
--- a/docs/container/materialize-operation.md
+++ b/docs/container/enrichment-operation/materialize-operation.md
@@ -10,7 +10,7 @@ To keep materialized data organized and compatible across different enrichment d
Used when the container name is safe to use as-is.
-`_mat_`.
+`_mat_`
This naming format is applied when:
@@ -21,13 +21,13 @@ This naming format is applied when:
If the enrichment prefix is `sales` and the container name is `orders_2024`:
-`sales_mat_orders_2024`.
+`sales_mat_orders_2024`
### Fallback Naming Convention
If the container name contains characters that may cause issues in downstream systems, the system switches to a safer naming structure by using the **container ID** instead.
-`_materialize_`.
+`_materialize_`
This fallback is used when:
@@ -39,7 +39,7 @@ This fallback is used when:
If the enrichment prefix is `sales` and the container ID is `1023456`:
-`sales_materialize_1023456`.
+`sales_materialize_1023456`
!!! note
The fallback naming ensures successful loading into the enrichment datastore by preventing invalid or non-compliant table names.
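
The two naming conventions can be sketched as a small helper. The rule used below to decide whether a container name is safe (letters, digits, and underscores only) is an assumption for illustration, and `materialized_name` is a hypothetical function, not part of the Qualytics API:

```python
import re

def materialized_name(prefix: str, container_name: str, container_id: int) -> str:
    # Assumed "safe name" rule: letters, digits, and underscores only.
    if re.fullmatch(r"[A-Za-z0-9_]+", container_name):
        return f"{prefix}_mat_{container_name}"        # standard convention
    return f"{prefix}_materialize_{container_id}"      # fallback convention

print(materialized_name("sales", "orders_2024", 1023456))   # sales_mat_orders_2024
print(materialized_name("sales", "orders 2024!", 1023456))  # sales_materialize_1023456
```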
@@ -48,43 +48,43 @@ Let’s get started 🚀
**Step 1:** Select a source datastore from the side menu to capture and export containers for the Materialize Operation.
-
+
For demonstration purposes, we have selected the **“COVID-19 Data”** Snowflake source datastore.
-
+
**Step 2:** After selecting a datastore, a bottom-up menu appears on the right side of the interface. Click **Enrichment Operations** next to the Enrichment Datastore and select **Materialize**.
-
+
-**Step 3:** After clicking **Materialize** a modal window appears, allowing you to configure the data export settings for the Materialize Operation.
+**Step 3:** After clicking **Materialize**, a modal window appears, allowing you to configure the data export settings for the Materialize Operation.
-
+
-**Step 4:** Select tables to materialize all tables, specific tables, or tables by tag, then click **Next**.
+**Step 4:** Select the tables to materialize: all tables, specific tables, or tables by tag, then click **Next**.
-
+
-**Step 5:** Configure Record Limit: set the maximum number of records to be materialized per table.
+**Step 5:** Configure Record Limit: Set the maximum number of records to be materialized per table.
-
+
## Run Now
Click **Run Now** to instantly materialize selected containers.
-
+
After clicking **Run Now**, a confirmation message appears stating **"Operation Triggered"**. Go to the Activity tab to see the progress of the materialize operation.
-
+
## Schedule
**Step 1:** Click **Schedule** to configure scheduling options for the Materialize Operation.
-
+
**Step 2:** Configure the scheduling preferences for the Materialize Operation.
@@ -98,32 +98,32 @@ After clicking **Run Now**, a confirmation message appears stating **"Operation
* **Advanced:** Use Cron expressions for custom scheduling. (e.g., `0 12 * * 1-5` runs at 12 PM, Monday to Friday).
-
+
**Step 3:** Define the Schedule Name to identify the scheduled Materialize Operation when it runs.
-
+
**Step 4:** Click **Schedule** to finalize and schedule the Materialize Operation.
-
+
After clicking **Schedule**, a confirmation message appears stating **"Operation Scheduled"**. Go to the Activity tab to see the progress of the materialize operation.
-
+
## Review Materialized Data
**Step 1:** Once the selected containers are materialized, go to **Enrichment Datastores** from the left menu.
-
+
**Step 2:** In the **Enrichment Datastores** section, select the datastore where you materialized the snapshot. The materialized containers will now be visible.
-
+
**Step 3:** Click on the materialized files to review the snapshot. For demonstration, we have selected the **"materialized_field_profiles"** file.
The materialized data is displayed in a table format, showing key details about the selected containers. It typically includes columns indicating data structure, completeness, and uniqueness. You can use this data for analysis, validation, and integration.
-
\ No newline at end of file
+
\ No newline at end of file
diff --git a/docs/container/field-profiles.md b/docs/container/field-profiles.md
new file mode 100644
index 0000000000..0c503c2a04
--- /dev/null
+++ b/docs/container/field-profiles.md
@@ -0,0 +1,86 @@
+# Field Profiles
+
+A Field Profile provides a detailed breakdown of a field’s data after a profiling operation. It helps you understand the structure, quality, and distribution of values for each field inside a container.
+
+## What Field Profiles Are Used For
+
+Field Profiles help you:
+
+- Validate the completeness, consistency, and quality of individual fields
+- Identify unexpected patterns such as outliers, skewed values, or sudden changes
+- Compare current field behavior with previous profiling runs
+- Support quality checks by exposing metrics like distinct values, min/max length, and statistical indicators
+
+These insights make it easier to detect data issues early and understand how a field behaves over time.
+
+## How Field Profiles Are Generated
+
+Field Profiles are automatically created when you run a [**Profile**](../source-datastore/profile.md){target="_blank"} operation on a container:
+
+1. Qualytics scans the dataset and evaluates each field.
+2. It computes metrics such as declared type, distinct counts, distribution statistics, sampling, and completeness.
+3. The platform stores these results as the latest profile.
+4. Each time you re-profile the container, the Field Profile is updated, allowing you to compare current metrics with previous runs.
+
+This ensures your field-level insights remain current and can be tracked across multiple profiling sessions.
+
+## Totals
+
+**1. Quality Score**: This provides a comprehensive assessment of the overall health of the data, factoring in multiple checks for accuracy, consistency, and completeness. A higher score, closer to 100, indicates optimal data quality with minimal issues or errors detected. A lower score may highlight areas that require attention and improvement.
+
+**2. Sampling**: This shows the percentage of data that was evaluated during profiling. A sampling rate of 100% indicates that the entire dataset was analyzed, ensuring a complete and accurate representation of the data’s quality across all records, rather than just a partial sample.
+
+**3. Completeness**: This metric measures how fully the data is populated without missing or null values. A higher completeness percentage means that most fields contain the necessary information, while a lower percentage indicates data gaps that could negatively impact downstream processes or analysis.
+
+**4. Active Checks**: This refers to the number of ongoing quality checks being applied to the dataset. These checks monitor aspects such as format consistency, uniqueness, and logical correctness. Active checks help maintain data integrity and provide real-time alerts about potential issues that may arise.
+
+**5. Active Anomalies**: This tracks the number of anomalies or irregularities detected in the data. These could include outliers, duplicates, or inconsistencies that deviate from expected patterns. A count of zero indicates no anomalies, while a higher count suggests that further investigation is needed to resolve potential data quality issues.
+
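+As a quick illustration, the completeness percentage can be sketched in a few lines of Python (the field values below are hypothetical; Qualytics computes this for you during profiling):
+
+```python
+# Hypothetical values for a single field; None represents a null.
+records = ["a@x.com", None, "b@x.com", "", "c@x.com"]
+
+# Completeness: percentage of records that are populated (non-null).
+# Note: the empty string still counts as populated here unless empty
+# values are configured to be treated as nulls.
+populated = sum(1 for value in records if value is not None)
+completeness = populated / len(records) * 100
+print(f"Completeness: {completeness:.0f}%")
+```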
+
+
+## Profile
+
+This provides detailed insights into the characteristics of the field, including its type, distinct values, and length. You can use this information to evaluate the data's uniqueness, length consistency, and complexity.
+
+| No. | Profile | Description |
+| :---- | :---- | :---- |
+| 1 | Declared Type | Indicates whether the type is declared by the source or inferred. |
+| 2 | Distinct Values | Count of distinct values observed in the dataset. |
+| 3 | Min Length | Shortest length of the observed string values or lowest value for numerics. |
+| 4 | Max Length | Greatest length of the observed string values or highest value for numerics. |
+| 5 | Mean | Mathematical average of the observed numeric values. |
+| 6 | Median | The median of the observed numeric values. |
+| 7 | Standard Deviation | Measure of the amount of variation in observed numeric values. |
+| 8 | Kurtosis | Measure of the ‘tailedness’ of the distribution of observed numeric values. |
+| 9 | Skewness | Measure of the asymmetry of the distribution of observed numeric values. |
+| 10 | Q1 | The first quartile; the central point between the minimum and the median. |
+| 11 | Q3 | The third quartile; the central point between the median and the maximum. |
+| 12 | Sum | Total sum of all observed numeric values. |
+
+
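+As a rough guide, the statistical measures above can be reproduced with the Python standard library. This is an illustrative sketch over made-up sample values, not Qualytics' internal implementation:
+
+```python
+import statistics as st
+
+# Hypothetical numeric field values from a profiled container.
+values = [12.0, 15.5, 9.0, 22.3, 15.5, 30.1, 18.7]
+n = len(values)
+mean = st.mean(values)
+
+# Central moments for skewness / kurtosis (population form).
+m2 = sum((v - mean) ** 2 for v in values) / n
+m3 = sum((v - mean) ** 3 for v in values) / n
+m4 = sum((v - mean) ** 4 for v in values) / n
+
+profile = {
+    "distinct_values": len(set(values)),
+    "min": min(values),
+    "max": max(values),
+    "mean": mean,
+    "median": st.median(values),
+    "std_dev": st.stdev(values),            # sample standard deviation
+    "skewness": m3 / m2 ** 1.5,
+    "kurtosis": m4 / m2 ** 2 - 3,           # excess kurtosis
+    "q1": st.quantiles(values, n=4)[0],     # first quartile
+    "q3": st.quantiles(values, n=4)[2],     # third quartile
+    "sum": sum(values),
+}
+print(profile)
+```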
+
+### Last Profile
+
+The **Last Profile** timestamp helps users understand how up-to-date the field is. When you hover over the time indicator shown on the right side of the Last Profile label (e.g., "1 week ago"), a tooltip displays the complete date and time the field was last profiled.
+
+
+
+This visibility ensures better context for interpreting profile metrics like mean, completeness, and anomalies.
+
+### Compare Profile
+
+You can compare the current field profile with earlier versions to spot changes over time. Visual indicators highlight modified metrics, interactive charts show numeric trends across profile history, and special badges identify data drift or field type changes.
+
+By clicking on the dropdown under **Compare With**, you can select an earlier profile run (for example, 1 day ago or 5 days ago).
+
+
+
+Once selected, the system highlights differences between profiles, marking metrics as **Changed** or **Unchanged**. It compares data quality **(Sampling, Completeness)** and statistical measures **(mean, median, standard deviation, skewness, kurtosis, min, max, distinct values, etc.)**, making it easy to track shifts in data quality and distribution.
+
+
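+The **Changed**/**Unchanged** marking can be pictured with a small sketch comparing two hypothetical profile snapshots (metric names and values are illustrative only):
+
+```python
+# Two made-up profile runs for the same field.
+current  = {"mean": 17.6, "completeness": 98.0, "distinct_values": 6}
+previous = {"mean": 17.6, "completeness": 95.0, "distinct_values": 6}
+
+# Mark each metric as Changed or Unchanged between the two runs.
+comparison = {
+    metric: "Changed" if current[metric] != previous.get(metric) else "Unchanged"
+    for metric in current
+}
+print(comparison)
+```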
+
+### View Metric Chart
+
+You can access detailed metric charts by clicking the **View Metric Chart** button. This will display variations across the last 10 profiles. By hovering over points on the chart, you can see additional details such as profile dates, measured values, and sampling percentages for deeper analysis.
+
+
\ No newline at end of file
diff --git a/docs/container/manage-tables-and-files.md b/docs/container/manage-tables-and-files.md
deleted file mode 100644
index dc57763f09..0000000000
--- a/docs/container/manage-tables-and-files.md
+++ /dev/null
@@ -1,329 +0,0 @@
-# Manage Tables & Files
-
-Managing **JDBC “tables”** and **DFS “files”** in a connected **source datastore** allows you to perform actions such as adding validation checks, running scans, monitoring data changes, exporting, or deleting them. For JDBC tables, you can also handle metadata, configure partitions, and manage incremental data for optimized processing. However, for DFS datastores, the default incremental field is the file’s last modified timestamp, and users cannot configure incremental or partition fields manually.
-
-Let’s get started 🚀
-
-### Navigation
-
-**Step 1:** Log in to your Qualytics account and select the source datastore (JDBC or DFS) from the left menu that you want to manage.
-
-
-
-
-**Step 2:** Select Tables (if JDBC datastore is connected) or File Patterns (if DFS datastore is connected) from the Navigation tab on the top.
-
-
-
-
-**Step 3:** You will view the full list of tables or files belonging to the selected source datastore.
-
-
-
-
-## Settings For JDBC Table
-
-Settings allow you to edit how data is processed and analyzed for a specific table in your connected source datastore. This includes selecting fields for incremental and partitioning strategies, grouping data, excluding certain fields from scans, and adjusting general behaviors.
-
-**Step 1:** Click on the vertical ellipse next to the table of your choice and select **Settings** from the dropdown list.
-
-
-
-
-A modal window will appear for **“Table Settings”**.
-
-
-
-
-**Step 2:** Modify the table setting based on:
-
-- Identifiers
-
-- Group Criteria
-
-- Excluding
-
-- General
-
-### Identifiers
-
-An **Identifier** is a field that can be used to help load the desired data from a Table in support of analysis. For more details about identifiers, you can refer to the documentation on [Identifiers.](https://userguide.qualytics.io/container/overview-of-identifiers/)
-
-#### Incremental Strategy
-
-This is crucial for tracking changes at the row level within tables. This approach is essential for efficient data processing, as it is specifically used to track which records have already been scanned. This allows for scan operations to focus exclusively on new records that have not been previously scanned, thereby optimizing the scanning process and ensuring that only the most recent and relevant data is analyzed.
-
-!!! note
- If you have connected a DFS datastore, no manual setup is needed for the incremental strategy, the system automatically tracks and processes the latest data changes.
-
-
-
-
-For information about incremental strategy, you can refer to the [Incremental Strategy](https://userguide.qualytics.io/container/overview-of-identifiers/#partition-field) section in the Identifiers documentation.
-
-#### Incremental Field
-
-**Incremental Field** lets you select a field that tracks changes in your data. This ensures only new or updated records are scanned, improving efficiency and reducing unnecessary processing.
-
-
-
-
-#### Partition Field
-
-**Partition Field** is used to divide the data in a table into distinct segments, or dataframes. These partitions allow for parallel analysis, improving efficiency and performance. By splitting the data, each partition can be processed independently. This approach helps optimize large-scale data operations.
-
-
-
-
-For information about **Partition Field**, you can refer to the [Partition Field](https://userguide.qualytics.io/container/overview-of-identifiers/#partition-field) section in the Identifiers documentation.
-
-### Group Criteria
-
-**Group Criteria** allow you to organize data into specific groups for more precise analysis. By grouping fields, you can gain better insights and enhance the accuracy of your profiling.
-
-
-
-
-For information about **Group Criteria**, you can refer to the documentation on [Grouping.](https://userguide.qualytics.io/container/overview-of-grouping/)
-
-### Excluding
-
-**Excluding** allows you to choose specific fields from a table that you want to exclude from data checks. This helps focus on the fields that matter most for validation while ignoring others that are not relevant to the current analysis.
-
-
-
-
-For information about **Excluding**, you can refer to the documentation on [Excluding Settings.](https://userguide.qualytics.io/container/overview-of-infer-data-type/#excluded-fields)
-
-### General
-
-You can control the default behavior of the specific table by checking or unchecking the option to infer the data type for each field. When checked, the system will automatically determine and cast the data types as needed for accurate data processing.
-
-
-
-
-For information about **General**, you can refer to the documentation on [General Settings.](https://userguide.qualytics.io/container/overview-of-infer-data-type/)
-
-**Step 3:** Once you have configured the table settings, click on the **Save** button.
-
-
-
-
-After clicking on the **Save** button, your table is successfully updated and a success flash message will appear stating **"Table has been successfully updated"**.
-
-
-
-
-## Settings For DFS Files Pattern
-
-Settings allow you to edit how data is processed and analyzed for a specific file patterns in your connected source datastore. This includes selecting fields for incremental and partitioning strategies, grouping data, excluding certain fields from scans, and adjusting general behaviors.
-
-**Step 1:** Click on the vertical ellipse next to the file pattern of your choice and select **Settings** from the dropdown list.
-
-
-
-
-A modal window will appear for **“File Pattern Settings”**.
-
-
-
-
-**Step 2:** Modify the table setting based on:
-
-- Group Criteria
-
-- Excluding
-
-- General
-
-### Group Criteria
-
-**Group Criteria** allow you to organize data into specific groups for more precise analysis. By grouping fields, you can gain better insights and enhance the accuracy of your profiling.
-
-
-
-
-For information about **Group Criteria**, you can refer to the documentation on [Grouping.](../container/overview-of-grouping.md)
-
-### Excluding
-
-**Excluding** allows you to choose specific fields from a file pattern that you want to exclude from data checks. This helps focus on the fields that matter most for validation while ignoring others that are not relevant to the current analysis.
-
-
-
-
-For information about **Excluding**, you can refer to the documentation on [Excluding Settings.](../container/overview-of-infer-data-type.md#excluding-fields)
-
-### General
-
-You can control how file patterns behave by checking or unchecking options to make data processing easier and more consistent. These settings help the system automatically adjust file structures for better integration and analysis.
-
-
-
-
-* **Inferring Data Types:** When enabled, the system figures out the correct data type for each field and applies it automatically. This keeps data consistent and reduces errors, saving you time on manual fixes.
-
-
-
-
-* **First Row as Field Names:** Turning this on uses the first row of a file as headers, making it simple to map and organize data in the right format.
-
-
-
-
-* **Treating Empty Values as Nulls:** The Treat empty values as null setting controls how empty fields in files like Excel and CSV are handled. If enabled (true), empty fields are treated as NULL (missing data). If disabled (false), they are stored as empty strings (""), meaning the field exists but is blank. This affects reporting, calculations, and data processing, as NULL values are ignored while empty strings may still be counted.
-
-
-
-
-**Step 3:** Once you have configured the file pattern settings, click on the **Save** button.
-
-
-
-
-After clicking on the **Save** button, your table is successfully updated and a success flash message will appear stating **"File Pattern has been successfully updated"**.
-
-
-
-
-## Add Checks
-
-**Add Check** allows you to create rules to validate the data within a particular table. You can choose the type of rule, link it directly to the selected table, and add descriptions or tags. This ensures that the table's data remains accurate and compliant with the required standards.
-
-**Step 1:** Click on the vertical ellipse next to the table name and select **Add Checks**.
-
-
-
-
-A modal window will appear to add checks against the selected table.
-
-
-
-
-To understand how to add checks, you can follow the remaining steps from the documentation [Checks Template.](https://userguide.qualytics.io/checks/checks-template/)
-
-## Run
-
-Execute various operations like profiling or scanning your table or file. It helps validate data quality and ensures that the table meets the defined checks and rules, providing insights into any anomalies or data issues that need attention.
-
-**Step 1:** Click on the vertical ellipse next to the table name and select **Run**.
-
-
-
-
-Under **Run**, choose the type of operation you want to perform:
-
-- **Profile**: To collect metadata and profile the table's contents.
-
-- **Scan**: To validate the data against defined rules and checks.
-
-
-
-
-To understand how a profile operation is performed, you can follow the remaining steps from the documentation [Profile Operation.](https://userguide.qualytics.io/source-datastore/profile/#configuration).
-
-To understand how a scan operation is performed, you can follow the remaining steps from the documentation [Scan Operation.](https://userguide.qualytics.io/source-datastore/scan/#configuration)
-
-## Observability Settings
-
-Observability helps you track and monitor data performance in your connected source datastore’s tables and files. It provides insights into data volume, detects anomalies, and ensures smooth data processing by identifying potential issues early. This makes it easier to manage and maintain data quality over time.
-
-**Step 1:** Select the table in your JDBC datastore that you would like to monitor, then click on **Observability.**
-
-
-
-
-A modal window **“Observability Settings”** will appear. Here you can view the details of the table and datastore where actions have been applied.
-
-
-
-
-**Step 2:** Check the "**Volume Tracking**" to enable trend analysis and anomaly detection in data volumes over time and check the "**Freshness Tracking**" to ensure data timeliness and to identify pipeline delays.
-
-**Volume Tracking** monitors and records daily volume metrics for this data asset. This feature enables trend analysis and anomaly detection in data volumes over time. **Freshness Tracking** measures and records the last time data was added or updated in the data asset. This feature helps ensure data timeliness and identifies pipeline delays.
-
-
-
-
-**Step 3:** Click on the **Save** button.
-
-
-
-
-After clicking on the Save button, a success flash message will appear stating **"Profile has been successfully updated"**.
-
-
-
-
-## Export
-
-**Export feature** lets you capture changes in your tables. You can export metadata for Quality Checks, Field Profiles, and Anomalies from selected tables to an enrichment datastore. This helps you analyze data trends, find issues, and make better decisions based on the table data.
-
-**Step 1:** Select the tables in your JDBC datastore that you would like to export, then click on **Export**.
-
-
-
-
-A modal window will appear with the **Export Operation** setting.
-
-
-
-
-For the next steps, detailed information on the export operation is available in the [Export Operation](https://userguide.qualytics.io/container/export-operation/) section of the documentation.
-
-## Materialize
-
-**Materialize Operation** captures snapshots of selected containers from a source datastore and exports them to an enrichment datastore for seamless data loading. Users can run it instantly or schedule it at set intervals, ensuring structured data is readily available for analysis and integration.
-
-**Step 1:** Select the tables in your JDBC datastore that you would like to capture and export containers for the Materialize Operation, then click on **Materialize**.
-
-
-
-
-A modal window will appear with the **Materialize Operation** setting.
-
-
-
-
-For the next steps, detailed information on the materialize operation is available in the [Materialize Operation](https://userguide.qualytics.io/container/materialize-operation/) section of the documentation.
-
-## Delete
-
-**Delete** allows you to remove a table from the connected source datastore. While the table and its associated data will be deleted, it is not permanent, as the table can be recreated if you run a catalog with the "recreate" option.
-
-!!! note
- Deleting a table is a reversible action if a catalog with the "recreate" option is run later.
-
-**Step 1:** Select the tables in your connected source datastore that you would like to delete, then click on **Delete**.
-
-
-
-
-**Step 2:** A confirmation modal window will appear, click on the Delete button to remove the table from the system.
-
-
-
-
-**Step 3:** After clicking on the delete button, your table is successfully deleted and a success flash message will appear saying **"Profile has been successfully deleted"**
-
-
-
-
-## Mark Tables & Files as Favorite
-
-Marking a tables and files as a favorite allows you to quickly access important items. This feature helps you prioritize and manage the tables and files you use frequently, making data management more efficient.
-
-**Step 1**: Locate the table and file you want to mark as a favorite and click on the bookmark icon to mark the table and file as a favorite.
-
-
-
-
-After Clicking on the bookmark icon your table and file is successfully marked as a favorite and a success flash message will appear stating “The Table has been favorited”.
-
-
-
-
-**Step 2**: To unmark a tables and files, simply click on the bookmark icon of the marked tables and files. This will remove it from your favorites.
-
-
-
\ No newline at end of file
diff --git a/docs/container/manage-tables-and-files/add-checks.md b/docs/container/manage-tables-and-files/add-checks.md
new file mode 100644
index 0000000000..d5dd3382db
--- /dev/null
+++ b/docs/container/manage-tables-and-files/add-checks.md
@@ -0,0 +1,17 @@
+# Add Checks
+
+**Add Check** allows you to create rules to validate the data within a particular table. You can choose the type of rule, link it directly to the selected table, and add descriptions or tags. This ensures that the table's data remains accurate and compliant with the required standards.
+
+!!! info
+ For more information about checks, please refer to the [checks documentation](../../datastore-checks/checks-datastore.md){target="_blank"}.
+
+**Step 1:** Click on the vertical ellipsis next to the table name and select **Add Checks**.
+
+
+
+A modal window will appear to add checks against the selected table.
+
+
+
+!!! note
+ To understand how to add checks, you can follow the remaining steps from the documentation [Checks Template](../../checks/checks-template.md){target="_blank"}.
\ No newline at end of file
diff --git a/docs/container/manage-tables-and-files/delete.md b/docs/container/manage-tables-and-files/delete.md
new file mode 100644
index 0000000000..89d38b8846
--- /dev/null
+++ b/docs/container/manage-tables-and-files/delete.md
@@ -0,0 +1,18 @@
+# Delete
+
+**Delete** allows you to remove a table from the connected source datastore. While the table and its associated data will be deleted, it is not permanent, as the table can be recreated if you run a catalog with the "recreate" option.
+
+!!! note
+ Deleting a table is a reversible action if a catalog with the "recreate" option is run later.
+
+**Step 1:** Select the tables in your connected source datastore that you would like to delete, then click on **Delete**.
+
+
+
+**Step 2:** A confirmation modal window will appear; click on the Delete button to remove the table from the system.
+
+
+
+**Step 3:** After clicking on the **Delete** button, your table is successfully deleted and a success flash message will appear saying **"Profile has been successfully deleted"**.
+
+
\ No newline at end of file
diff --git a/docs/container/manage-tables-and-files/export.md b/docs/container/manage-tables-and-files/export.md
new file mode 100644
index 0000000000..201eee84d2
--- /dev/null
+++ b/docs/container/manage-tables-and-files/export.md
@@ -0,0 +1,13 @@
+# Export
+
+**Export feature** lets you capture changes in your tables. You can export metadata for Quality Checks, Field Profiles, and Anomalies from selected tables to an enrichment datastore. This helps you analyze data trends, find issues, and make better decisions based on the table data.
+
+**Step 1:** Select the tables in your JDBC datastore that you would like to export, then click on **Export**.
+
+
+
+A modal window will appear with the **Export Operation** setting.
+
+
+
+For the next steps, detailed information on the Export Operation is available in the [Export Operation](../enrichment-operation/export-operation.md){target="_blank"} section of the documentation.
\ No newline at end of file
diff --git a/docs/container/manage-tables-and-files/mark-tables-&-files-as-favorite.md b/docs/container/manage-tables-and-files/mark-tables-&-files-as-favorite.md
new file mode 100644
index 0000000000..95482f68a6
--- /dev/null
+++ b/docs/container/manage-tables-and-files/mark-tables-&-files-as-favorite.md
@@ -0,0 +1,18 @@
+# Mark Tables & Files as Favorite
+
+Marking tables and files as favorites allows you to quickly access important items. This feature helps you prioritize and manage the tables and files you use frequently, making data management more efficient.
+
+!!! info
+ Favoriting a table or file is user-specific. Only you can see your favorites; it does not affect other users.
+
+**Step 1**: Locate the tables and files you want to mark as favorites and click on the bookmark icon.
+
+
+
+After clicking on the bookmark icon, your table or file is successfully marked as a favorite and a success flash message will appear stating “The Table has been favorited”.
+
+
+
+**Step 2**: To unmark tables and files, simply click on the bookmark icon of the marked item. This will remove it from your favorites.
+
+
\ No newline at end of file
diff --git a/docs/container/manage-tables-and-files/materialize.md b/docs/container/manage-tables-and-files/materialize.md
new file mode 100644
index 0000000000..9aecb7a484
--- /dev/null
+++ b/docs/container/manage-tables-and-files/materialize.md
@@ -0,0 +1,13 @@
+# Materialize
+
+**Materialize Operation** captures snapshots of selected containers from a source datastore and exports them to an enrichment datastore for seamless data loading. Users can run it instantly or schedule it at set intervals, ensuring structured data is readily available for analysis and integration.
+
+**Step 1:** Select the tables in your JDBC datastore that you would like to capture and export for the Materialize Operation, then click on **Materialize**.
+
+
+
+A modal window will appear with the **Materialize Operation** setting.
+
+
+
+For the next steps, detailed information on the Materialize Operation is available in the [Materialize Operation](../enrichment-operation/materialize-operation.md){target="_blank"} section of the documentation.
\ No newline at end of file
diff --git a/docs/container/manage-tables-and-files/observability-settings.md b/docs/container/manage-tables-and-files/observability-settings.md
new file mode 100644
index 0000000000..461a6a0041
--- /dev/null
+++ b/docs/container/manage-tables-and-files/observability-settings.md
@@ -0,0 +1,25 @@
+# Observability Settings
+
+Observability helps you track and monitor data performance in your connected source datastore’s tables and files. It provides insights into data volume, detects anomalies, and ensures smooth data processing by identifying potential issues early. This makes it easier to manage and maintain data quality over time.
+
+**Step 1:** Select the table in your JDBC datastore that you would like to monitor, then click on **Observability.**
+
+
+
+A modal window for **Observability Settings** will appear. Here you can view the details of the table and datastore where actions have been applied.
+
+
+
+**Step 2:** Check **Volume Tracking** to enable trend analysis and anomaly detection in data volumes over time, and check **Freshness Tracking** to ensure data timeliness and identify pipeline delays.
+
+**Volume Tracking** monitors and records daily volume metrics for this data asset. This feature enables trend analysis and anomaly detection in data volumes over time. **Freshness Tracking** measures and records the last time data was added or updated in the data asset. This feature helps ensure data timeliness and identifies pipeline delays.
+
+
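+Conceptually, the two tracking signals work like the sketch below. The thresholds, dates, and counts are made up for illustration; Qualytics' actual anomaly detection is more sophisticated:
+
+```python
+from datetime import datetime, timezone
+
+# Volume tracking: daily row-count snapshots for a hypothetical data asset.
+daily_volumes = {"2024-01-01": 1000, "2024-01-02": 1010, "2024-01-03": 400}
+
+# Flag days whose volume deviates sharply (>50%) from the prior day.
+counts = list(daily_volumes.values())
+anomalies = [
+    day for (day, n), prev in zip(list(daily_volumes.items())[1:], counts)
+    if abs(n - prev) / prev > 0.5
+]
+
+# Freshness tracking: hours elapsed since the last recorded update.
+last_update = datetime(2024, 1, 3, 6, 0, tzinfo=timezone.utc)
+now = datetime(2024, 1, 3, 18, 0, tzinfo=timezone.utc)
+staleness_hours = (now - last_update).total_seconds() / 3600
+print(anomalies, staleness_hours)
+```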
+
+**Step 3:** Click on the **Save** button.
+
+
+
+After clicking on the Save button, a success flash message will appear stating **"Profile has been successfully updated"**.
+
+
\ No newline at end of file
diff --git a/docs/container/manage-tables-and-files/run.md b/docs/container/manage-tables-and-files/run.md
new file mode 100644
index 0000000000..e491454216
--- /dev/null
+++ b/docs/container/manage-tables-and-files/run.md
@@ -0,0 +1,19 @@
+# Run
+
+The **Run** option lets you execute operations such as profiling or scanning on your table or file. These operations help validate data quality and ensure the table meets the defined checks and rules, providing insights into any anomalies or data issues that need attention.
+
+**Step 1:** Click on the vertical ellipsis next to the table name and select **Run**.
+
+
+
+Under **Run**, choose the type of operation you want to perform:
+
+- **Profile**: To collect metadata and profile the table's contents.
+
+- **Scan**: To validate the data against defined rules and checks.
+
+
+
+To understand how a profile operation is performed, you can follow the remaining steps from the documentation [Profile Operation](../../source-datastore/profile.md#configuration){target="_blank"}.
+
+To understand how a scan operation is performed, you can follow the remaining steps from the documentation [Scan Operation](../../source-datastore/scan.md#configuration){target="_blank"}.
diff --git a/docs/container/manage-tables-and-files/setting-for-dfs-files-pattern.md b/docs/container/manage-tables-and-files/setting-for-dfs-files-pattern.md
new file mode 100644
index 0000000000..3aee75156f
--- /dev/null
+++ b/docs/container/manage-tables-and-files/setting-for-dfs-files-pattern.md
@@ -0,0 +1,61 @@
+# Settings For DFS File Patterns
+
+Settings allow you to edit how data is processed and analyzed for a specific file pattern in your connected source datastore. This includes grouping data, excluding certain fields from scans, and adjusting general behaviors.
+
+**Step 1:** Click on the vertical ellipsis next to the file pattern of your choice and select **Settings** from the dropdown list.
+
+
+
+A modal window will appear for **File Pattern Settings**.
+
+
+
+**Step 2:** Modify the file pattern settings based on the following options:
+
+- Group Criteria
+
+- Excluding
+
+- General
+
+## Group Criteria
+
+**Group Criteria** allow you to organize data into specific groups for more precise analysis. By grouping fields, you can gain better insights and enhance the accuracy of your profiling.
+
+
+
+For information about **Group Criteria**, you can refer to the documentation on [Grouping](../settings/grouping.md){target="_blank"}.
+
+## Excluding
+
+**Excluding** allows you to choose specific fields from a file pattern that you want to exclude from data checks. This helps focus on the fields that matter most for validation while ignoring others that are not relevant to the current analysis.
+
+
+
+For information about **Excluding**, you can refer to the documentation on [Excluding Settings](../settings/general.md#excluding-fields){target="_blank"}.
+
+## General
+
+You can control how file patterns behave by checking or unchecking options to make data processing easier and more consistent. These settings help the system automatically adjust file structures for better integration and analysis.
+
+
+
+* **Inferring Data Types:** When enabled, the system automatically determines the correct data type for each field and applies it. This keeps data consistent and reduces errors, saving you time on manual fixes.
+
+
+
+* **First Row as Field Names:** Turning this on uses the first row of a file as headers, making it simple to map and organize data in the right format.
+
+
+
+* **Treating Empty Values as Nulls:** The "Treat empty values as nulls" setting controls how empty fields in files like Excel and CSV are handled. If enabled (true), empty fields are treated as NULL (missing data). If disabled (false), they are stored as empty strings (""), meaning the field exists but is blank. This affects reporting, calculations, and data processing, as NULL values are ignored while empty strings may still be counted.
+
+
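+The effect of this setting can be pictured with a small Python sketch. The CSV content and the `treat_empty_as_null` flag below are illustrative only, not part of the Qualytics API:
+
+```python
+import csv
+import io
+
+# Illustrative CSV with an empty "amount" cell on the second row.
+raw = "id,amount\n1,10.5\n2,\n3,7.0\n"
+rows = list(csv.DictReader(io.StringIO(raw)))
+
+treat_empty_as_null = True  # mirrors the setting described above
+
+def parse(cell):
+    # Empty cells become NULL (None) when the setting is enabled;
+    # otherwise they stay as empty strings ("").
+    return None if cell == "" and treat_empty_as_null else cell
+
+amounts = [parse(row["amount"]) for row in rows]
+# NULLs are excluded from calculations; empty strings would still be counted.
+total = sum(float(a) for a in amounts if a is not None)
+print(amounts, total)
+```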
+
+**Step 3:** Once you have configured the file pattern settings, click on the **Save** button.
+
+
+
+After clicking on the **Save** button, your file pattern is successfully updated and a success flash message will appear stating **"File Pattern has been successfully updated"**.
+
+
\ No newline at end of file
diff --git a/docs/container/manage-tables-and-files/setting-for-jdbc-table.md b/docs/container/manage-tables-and-files/setting-for-jdbc-table.md
new file mode 100644
index 0000000000..31995d28f9
--- /dev/null
+++ b/docs/container/manage-tables-and-files/setting-for-jdbc-table.md
@@ -0,0 +1,82 @@
+# Settings For JDBC Table
+
+Settings allow you to edit how data is processed and analyzed for a specific table in your connected source datastore. This includes selecting fields for incremental and partitioning strategies, grouping data, excluding certain fields from scans, and adjusting general behaviors.
+
+**Step 1:** Click on the vertical ellipsis next to the table of your choice and select **Settings** from the dropdown list.
+
+
+
+A modal window will appear for **Table Settings**.
+
+
+
+**Step 2:** Modify the table settings based on the following options:
+
+- Identifiers
+
+- Group Criteria
+
+- Excluding
+
+- General
+
+## Identifiers
+
+An **Identifier** is a field that can be used to help load the desired data from a Table in support of analysis. For more details about identifiers, you can refer to the documentation on [Identifiers](../settings/identifiers.md){target="_blank"}.
+
+### Incremental Strategy
+
+The incremental strategy is crucial for tracking changes at the row level within tables. It records which rows have already been scanned, allowing scan operations to focus exclusively on new records that have not been previously scanned, thereby optimizing the scanning process and ensuring that only the most recent and relevant data is analyzed.
+
+!!! note
+    If you have connected a DFS datastore, no manual setup is needed for the incremental strategy; the system automatically tracks and processes the latest data changes.
+
+
+
+For information about incremental strategy, you can refer to the [Incremental Strategy](../settings/identifiers.md#incremental-strategy){target="_blank"} section in the Identifiers documentation.
+
+### Incremental Field
+
+**Incremental Field** lets you select a field that tracks changes in your data. This ensures only new or updated records are scanned, improving efficiency and reducing unnecessary processing.
+
+
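+Conceptually, incremental scanning keeps a high-water mark on the chosen field and only reads rows past it. Below is a minimal sketch assuming a hypothetical `updated_at` incremental field; it illustrates the idea, not Qualytics' internals:
+
+```python
+# Hypothetical table rows with an "updated_at" incremental field.
+rows = [
+    {"id": 1, "updated_at": 100},
+    {"id": 2, "updated_at": 205},
+    {"id": 3, "updated_at": 310},
+]
+
+last_scanned = 200  # high-water mark recorded after the previous scan
+
+# Only rows newer than the high-water mark are scanned.
+new_rows = [r for r in rows if r["updated_at"] > last_scanned]
+
+# The high-water mark then advances to the newest scanned value.
+last_scanned = max(r["updated_at"] for r in new_rows)
+print([r["id"] for r in new_rows], last_scanned)
+```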
+
+### Partition Field
+
+**Partition Field** is used to divide the data in a table into distinct segments, or dataframes. These partitions allow for parallel analysis, improving efficiency and performance. By splitting the data, each partition can be processed independently. This approach helps optimize large-scale data operations.
+
+
+
+For information about **Partition Field**, you can refer to the [Partition Field](../settings/identifiers.md#partition-field){target="_blank"} section in the Identifiers documentation.
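+
+Conceptually, partitioning splits rows into ranges of the partition field so each segment can be processed independently. The sketch below is illustrative only, not Qualytics' actual partitioning logic:
+
+```python
+# Hypothetical table rows keyed by a numeric "id" partition field.
+rows = [{"id": i, "value": i * 10} for i in range(1, 11)]
+
+def partition(rows, field, num_partitions):
+    # Split rows into equal-width ranges of the partition field.
+    lo = min(r[field] for r in rows)
+    hi = max(r[field] for r in rows)
+    width = (hi - lo) / num_partitions or 1
+    parts = [[] for _ in range(num_partitions)]
+    for r in rows:
+        idx = min(int((r[field] - lo) / width), num_partitions - 1)
+        parts[idx].append(r)
+    return parts
+
+# Each of the three partitions could now be analyzed in parallel.
+parts = partition(rows, "id", 3)
+print([len(p) for p in parts])
+```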
+
+## Group Criteria
+
+**Group Criteria** allow you to organize data into specific groups for more precise analysis. By grouping fields, you can gain better insights and enhance the accuracy of your profiling.
+
+
+
+For information about **Group Criteria**, you can refer to the documentation on [Grouping](../settings/grouping.md){target="_blank"}.
+
+## Excluding
+
+**Excluding** allows you to choose specific fields from a table that you want to exclude from data checks. This helps focus on the fields that matter most for validation while ignoring others that are not relevant to the current analysis.
+
+
+
+For information about **Excluding**, you can refer to the documentation on [Excluding Settings](../manage-tables-and-files/setting-for-dfs-files-pattern.md#excluding){target="_blank"}.
+
+## General
+
+You can control the default behavior of a specific table by checking or unchecking the option to infer the data type for each field. When checked, the system automatically determines and casts data types as needed for accurate data processing.
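What type inference means in practice can be sketched as follows. This is a hypothetical illustration — the `infer_type` helper and the order of attempted casts are assumptions for the example, not the platform's actual inference logic:

```python
from datetime import datetime

def infer_type(value: str) -> str:
    """Try progressively more specific casts; fall back to string."""
    for caster, name in (
        (int, "integral"),
        (float, "fractional"),
        (lambda v: datetime.strptime(v, "%Y-%m-%d"), "date"),
    ):
        try:
            caster(value)
            return name
        except ValueError:
            continue
    return "string"

inferred = [infer_type(v) for v in ["42", "3.14", "2025-01-01", "hello"]]
```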
+
+
+
+For information about **General**, you can refer to the documentation on [General Settings](../settings/general.md){target="_blank"}.
+
+**Step 3:** Once you have configured the table settings, click on the **Save** button.
+
+
+
+After clicking on the **Save** button, your table is successfully updated and a success flash message will appear stating **"Table has been successfully updated"**.
+
+
\ No newline at end of file
diff --git a/docs/container/overview-of-a-computed-field.md b/docs/container/overview-of-a-computed-field.md
deleted file mode 100644
index 659d5cbaf7..0000000000
--- a/docs/container/overview-of-a-computed-field.md
+++ /dev/null
@@ -1,290 +0,0 @@
-# Computed Fields
-
-Computed Fields allow you to enhance data analysis by applying dynamic transformations directly to your data. These fields let you create new data points, perform calculations, and customize data views based on your specific needs, ensuring your data is both accurate and actionable.
-
-Let's get started 🚀
-
-## Add Computed Fields
-
-**Step 1:** Log in to Your Qualytics Account, navigate to the side menu, and select the **source datastore** where you want to create a computed field.
-
-
-
-
-**Step 2:** Select the **Container** within the chosen datastore where you want to create the computed field. This container holds the data to which the new computed field will be applied, enabling you to enhance your data analysis within that specific datastore.
-
-For demonstration purposes, we have selected the **Bank Dataset-Staging** source datastore and the **bank_transactions_.csv** container within it to create a computed field.
-
-
-
-
-**Step 3:** After selecting the container, click on the **Add** button and select **Computed Field** from the dropdown menu to create a new computed field.
-
-
-
-
-A modal window will appear, allowing you to enter the details for your computed field.
-
-
-
-
-**Step 4:** Enter the **Name** for the computed field, select **Transformation Type** from the dropdown menu, and optionally add **Additional Metadata**.
-
-| REF. | FIELDS | ACTION |
-|------|--------|--------|
-| 1. | Field Name (Required) | Add a unique name for your computed field. |
-| 2. | Transformation Type (Required) | The type of transformation you want to apply from the available options. |
-| 3. | Additional Metadata (Optional) | Enhance the computed field definition by setting custom metadata. Click the plus icon **(+)** to open the metadata input form and add key-value pairs. |
-
-
-
-
-!!! info
- Transformations are changes made to data, like converting formats, doing calculations, or cleaning up fields. In Qualytics, you can use transformations to meet specific needs, such as cleaning entity names, converting formatted numbers, or applying custom expressions. With various transformation types available, Qualytics enables you to customize your data directly within the platform, ensuring it’s accurate and ready for analysis.
-
-| Transformation Types | Purpose | Reference |
-|------|--------|---------|
-| Cleaned Entity Name | Removes business signifiers (such as 'Inc.' or 'Corp') from an entity name. | [See here](#cleaned-entity-name) |
-| Convert Formatted Numeric | Removes formatting (such as parentheses for denoting negatives or commas as delimiters) from values that represent numeric data, converting them into a numerically typed field. | [See here](#convert-formatted-numeric) |
-| Custom Expression | Allows you to create a new field by applying any valid Spark SQL expression to one or more existing fields. | [See here](#custom-expression) |
-
-
-
-
-**Step 5:** After selecting the appropriate **Transformation Type**, click the **Save** button.
-
-
-
-
-**Step 6:** After clicking on the **Save** button, your computed field is created and a success flash message will display saying **The computed field has been successfully created**.
-
-
-
-
-You can find your computed field by clicking on the dropdown arrow next to the container you selected when creating the computed field.
-
-
-
-
-## Computed Fields Details
-
-### Totals
-
-**1. Quality Score**: This provides a comprehensive assessment of the overall health of the data, factoring in multiple checks for accuracy, consistency, and completeness. A higher score, closer to 100, indicates optimal data quality with minimal issues or errors detected. A lower score may highlight areas that require attention and improvement.
-
-**2. Sampling**: This shows the percentage of data that was evaluated during profiling. A sampling rate of 100% indicates that the entire dataset was analyzed, ensuring a complete and accurate representation of the data’s quality across all records, rather than just a partial sample.
-
-**3. Completeness**: This metric measures how fully the data is populated without missing or null values. A higher completeness percentage means that most fields contain the necessary information, while a lower percentage indicates data gaps that could negatively impact downstream processes or analysis.
-
-**4. Active Checks**: This refers to the number of ongoing quality checks being applied to the dataset. These checks monitor aspects such as format consistency, uniqueness, and logical correctness. Active checks help maintain data integrity and provide real-time alerts about potential issues that may arise.
-
-**5. Active Anomalies**: This tracks the number of anomalies or irregularities detected in the data. These could include outliers, duplicates, or inconsistencies that deviate from expected patterns. A count of zero indicates no anomalies, while a higher count suggests that further investigation is needed to resolve potential data quality issues.
-
-
-
-
-### Profile
-
-This provides detailed insights into the characteristics of the field, including its type, distinct values, and length. You can use this information to evaluate the data's uniqueness, length consistency, and complexity.
-
-| **No** | **Profile** | **Description** |
-|--------|-----------------------|---------------------------------------------------------------------------------|
-| 1 | Declared Type | Indicates whether the type is declared by the source or inferred. |
-| 2 | Distinct Values | Count of distinct values observed in the dataset. |
-| 3 | Min Length | Shortest length of the observed string values or lowest value for numerics. |
-| 4 | Max Length | Greatest length of the observed string values or highest value for numerics. |
-| 5 | Mean | Mathematical average of the observed numeric values. |
-| 6 | Median | The median of the observed numeric values. |
-| 7 | Standard Deviation | Measure of the amount of variation in observed numeric values. |
-| 8 | Kurtosis | Measure of the ‘tailedness’ of the distribution of observed numeric values. |
-| 9 | Skewness | Measure of the asymmetry of the distribution of observed numeric values. |
-| 10 | Q1 | The first quartile; the central point between the minimum and the median. |
-| 11 | Q3 | The third quartile; the central point between the median and the maximum. |
-| 12 | Sum | Total sum of all observed numeric values. |
-
-
-
-
-You can hover over the **(i)** button to view the native field properties, which provide detailed information such as the field's type (numeric), size, decimal digits, and whether it allows null values.
-
-
-
-
-#### Last Profile
-
-The **Last Profile** timestamp helps users understand how up to date the field is. When you hover over the time indicator shown on the right side of the Last Profile label (e.g., "8 months ago"), a tooltip displays the complete date and time the field was last profiled.
-
-
-
-
-This visibility ensures better context for interpreting profile metrics like mean, completeness, and anomalies.
-
-## Manage Tags in field details
-
-Tags can now be directly managed in the field profile within the Explore section. Simply access the Field Details panel to create, add, or remove tags, enabling more efficient and organized data management.
-
-**Step 1**: Log in to your Qualytics account and click the **Explore** button on the left side panel of the interface.
-
-
-
-
-**Step 2**: Click on the **Profiles** tab and select **fields**.
-
-
-
-
-**Step 3**: Click on the specific field that you want to manage tags.
-
-
-
-
-A **Field Details** modal window will appear. Click on the plus button **(+)** to assign tags to the selected field.
-
-
-
-
-**Step 4:** You can also create the new tag by clicking on the ➕ button.
-
-
-
-
-A modal window will appear, providing the options to create the tag. Enter the required values to get started.
-
-
-
-
-For more information on creating tags, refer to the [Add Tag section](../tags/add-tag.md).
-
-## View Team
-
-By hovering over the information icon, users can view the assigned teams for enhanced collaboration and data transparency.
-
-
-
-
-## Filter and Sort Fields
-
-Filter and Sort options allow you to organize your fields by various criteria, such as Name, Checks, Completeness, Created Date, and Tags. You can also apply filters to refine your list of fields based on Type and Tags
-
-### Sort
-
-You can sort your checks by Active Anomalies, Active Checks, Completeness, Created Date, Name, Quality Score, and Type to easily organize and prioritize them according to your needs.
-
-
-
-
-| **No** | **Sort By** | **Description** |
-|--------|-----------------------|---------------------------------------------------------------------------------|
-| 1 | Active Anomalies | Sorts fields based on the number of currently active anomalies detected. |
-| 2 | Active Checks | Sorts fields by the number of active validation checks applied. |
-| 3 | Completeness | Sorts fields based on their data completeness percentage. |
-| 4 | Created Date | Sorts fields by the date they were created, showing the newest or oldest fields first. |
-| 5 | Name | Sorts fields alphabetically by their names. |
-| 6 | Quality Score | Sorts fields based on their quality score, indicating the reliability of the data in the field. |
-| 7 | Type | Sorts fields based on their data type (e.g., string, boolean, etc.). |
-
-
-
-
-Whatever sorting option is selected, you can arrange the data either in ascending or descending order by clicking the caret button next to the selected sorting criteria.
-
-
-
-
-### Filter
-
-You can filter your fields based on values like Type and Tag to easily organize and prioritize them according to your needs.
-
-{% include-markdown "components/general-props/typos.md"
- start=''
- end=''
-%}
-
-
-
-
-
-| **No** | **Filter** | **Description** |
-|--------|-----------------------|---------------------------------------------------------------------------------|
-| 1 | Type | Filters fields based on the data type (e.g., string, boolean, date, etc.). |
-| 2 | Tag | Tag Filter displays only the tags associated with the currently visible items, along with their color icon, name, type, and the number of matching records. Selecting one or more tags refines the list based on your selection. If no matching items are found, a No option found message is displayed.|
-
-## Types of Transformations
-
-### Cleaned Entity Name
-
-This transformation removes common business signifiers from entity names, making your data cleaner and more uniform.
-
-#### Options for Cleaned Entity Name
-
-| REF. | FIELDS | ACTIONS |
-|------|--------|---------|
-| 1. | **Drop from Suffix** | Add a unique name for your computed field. |
-| 2. | **Drop from Prefix** | Removes specified terms from the beginning of the entity name. |
-| 3. | **Drop from Interior** | Removes specified terms from the beginning of the entity name. |
-| 4. | **Additional Terms to Drop** (Custom) | Allows you to specify additional terms that should be dropped from the entity name. |
-| 5. | **Terms to Ignore** (Custom) | Designate terms that should be ignored during the cleaning process. |
-
-#### Example for Cleaned Entity Name
-
-| **Example** | **Input** | **Transformation** | **Output** |
-|-------------|----------------------------|--------------------------------|--------------------------|
-| 1 | "TechCorp, Inc." | **Drop from Suffix**: "Inc." | "TechCorp" |
-| 2 | "Global Services Ltd." | **Drop from Prefix**: "Global" | "Services Ltd." |
-| 3 | "Central LTD & Finance Co." | **Drop from Interior**: "LTD" | "Central & Finance Co." |
-| 4 | "Eat & Drink LLC" | **Additional Terms to Drop**: "LLC", "&" | "Eat Drink" |
-| 5 | "ProNet Solutions Ltd." | **Terms to Ignore**: "Ltd." | "ProNet Solutions" |
-
-### Convert Formatted Numeric
-
-This transformation converts formatted numeric values into a plain numeric format, stripping out any characters like commas or parentheses that are not numerically significant.
-
-
-#### Example for Convert Formatted Numeric
-
-| **Example** | **Input** | **Transformation** | **Output** |
-|-------------|-------------|--------------------|------------|
-| 1 | "$1,234.56" | **Remove non-numeric characters**: ",", "$" | "1234.56" |
-| 2 | "(2020)" | **Remove non-numeric characters**: "(", ")" | "-2020" |
-| 3 | "100%" | **Remove non-numeric characters**: "%" | "100"
-
-### Custom Expression
-
-Enables the creation of a field based on a custom computation using Spark SQL. This is useful for applying complex logic or transformations that are not covered by other types.
-
-#### Using Custom Expression:
- You can combine multiple fields, apply conditional logic, or use any valid Spark SQL functions to derive your new computed field.
-
- **Example**: To create a field that sums two existing fields, you could use the expression `SUM(field1, field2)`.
-
- **Advanced Example**: You need to ensure that a log of leases has no overlapping dates for an asset but your data only captures a single lease's details like
-
-| **LeaseID** | **AssetID** | **Lease_Start** | **Lease_End** |
-|-------------|-------------|--------------------|------------|
-| 1 | 42 | 1/1/2025 | 2/1/2026 |
-| 2 | 43 | 1/1/2025 | 2/1/2026 |
-| 3 | 42 | 1/1/2026 | 2/1/2026 |
-| 4 | 43 | 2/2/2026 | 2/1/2027 |
-
-You can see in this example that Lease 1 has overlapping dates with Lease 3 for the same Asset. This can be difficult to detect without a full transformation of the data. However, we can accomplish our goal easily with a Computed Field.
-We'll simply add a Computed Field to our table named "Next_Lease_Start" and define it with the following custom expression so that our table will now hold the new field and render it as shown below.
-
-`LEAD(Lease_Start, 1) OVER (PARTITION BY AssetID ORDER BY Lease_Start)`
-
- | **LeaseID** | **AssetID** | **Lease_Start** | **Lease_End** | **Next_Lease_Start** |
-|-------------|-------------|--------------------|------------|------------|
-| 1 | 42 | 1/1/2025 | 2/1/2026 | 1/1/2026 |
-| 2 | 43 | 1/1/2025 | 2/1/2026 | 2/2/2026 |
-| 3 | 42 | 1/1/2026 | 2/1/2026 | |
-| 4 | 43 | 2/2/2026 | 2/1/2027 | |
-
-Now you can author a Quality Check stating that Lease_End should always be less than "Next_Lease_Start" to catch any errors of this type. In fact, Qualytics will automatically infer that check for you at [Level 3 Inference](../source-datastore/profile.md#levels-of-check-inference)!
-
-#### More Examples for Custom Expression
-
-| **Example** | **Input Fields** | **Custom Expression** | **Output** |
-|-------------|--------------|-----------------|---------------------|
-| 1 | `field1 = 10`, `field2 = 20` | `SUM(field1, field2)` | `30` |
-| 2 | `salary = 50000`, `bonus = 5000` | `salary + bonus` | `55000` |
-| 3 | `hours = 8`, `rate = 15.50` | `hours * rate` | `124` |
-| 4 | `status = 'active'`, `score = 85` | `CASE WHEN status = 'active' THEN score ELSE 0 END` | `85` |
diff --git a/docs/container/overview.md b/docs/container/overview.md
new file mode 100644
index 0000000000..2f8c509306
--- /dev/null
+++ b/docs/container/overview.md
@@ -0,0 +1,67 @@
+# Containers Overview
+
+Containers are fundamental entities representing structured data sets. These containers could manifest as tables in JDBC datastores or as files within DFS datastores. They play a pivotal role in data organization, profiling, and quality checks within the Qualytics application.
+
+Let’s get started 🚀
+
+## Container Types
+
+There are two main types of containers in Qualytics:
+
+### JDBC Container
+
+JDBC containers are virtual representations of database objects, making it easier to work with data stored in relational databases.
+
+!!! note
+ For more information, please refer to the [JDBC container section](../container/container-types.md#jdbc-container){target="_blank"}.
+
+### DFS Container
+
+DFS containers are used to represent files stored in distributed file systems, such as Hadoop or cloud storage.
+
+!!! note
+ For more information, please refer to the [DFS container section](../container/container-types.md#dfs-container){target="_blank"}.
+
+## Computed Tables
+
+Computed Tables allow you to create derived tables using SQL-like transformations on your source datastore containers.
+
+!!! note
+ For more information, please refer to the [Computed Tables Documentation](../container/computed-tables-and-files/computed-tables.md){target="_blank"}.
+
+## Computed Files
+
+Computed Files allow you to generate new file-based datasets derived from existing containers within your selected datastore.
+
+!!! note
+ For more information, please refer to the [Computed Files Documentation](../container/computed-tables-and-files/computed-files.md){target="_blank"}.
+
+## Computed Joins
+
+A Computed Join Container allows you to combine data from two containers, which can be from the same source datastore or different source datastores (e.g., a database table and a file system container).
+
+!!! note
+ For more information, please refer to the [Computed Join Documentation](../container/computed-join.md){target="_blank"}.
+
+## Container Attributes
+
+### Totals
+
+Totals are calculated from sampled data, not the full dataset. Values may differ from actual totals across all records.
+
+!!! note
+ For more information, please refer to the [container attributes documentation](../container/container-attributes.md){target="_blank"}.
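The difference between sampled and full-dataset totals can be illustrated with a small sketch (the data and the `completeness` helper are invented for the example):

```python
import random

# Hypothetical dataset: values that are either present (1) or null (None).
random.seed(7)
values = [random.choice([None, 1]) for _ in range(10_000)]

def completeness(rows):
    """Share of non-null values, as a percentage."""
    return 100 * sum(v is not None for v in rows) / len(rows)

full = completeness(values)                          # over all records
sample = completeness(random.sample(values, 1_000))  # over a sample
# `sample` approximates `full` but is not necessarily identical.
```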
+
+## Actions on Container
+
+Users can perform various operations on containers to manage datasets effectively. The actions are divided into three main sections: **Settings**, **Add**, and **Run**. Each section contains specific options to perform different tasks.
+
+!!! note
+ For more information, please refer to the [actions on container documentation](../container/actions-on-container.md){target="_blank"}.
+
+## Field Profiles
+
+After profiling a container, individual field profiles offer granular insights into each field.
+
+!!! note
+ For more information, please refer to the [field profiles documentation](../container/field-profiles.md){target="_blank"}.
diff --git a/docs/container/overview-of-infer-data-type.md b/docs/container/settings/general.md
similarity index 82%
rename from docs/container/overview-of-infer-data-type.md
rename to docs/container/settings/general.md
index fe8d98b9d8..96d7445f79 100644
--- a/docs/container/overview-of-infer-data-type.md
+++ b/docs/container/settings/general.md
@@ -8,31 +8,31 @@ Let’s get started 🚀
**Step 1:** Log in to your Qualytics account and select the source datastore (**JDBC** or **DFS**) from the left menu that you want to manage.
-
+
**Step 2:** Select Tables (if JDBC datastore is connected) or File Patterns (if DFS datastore is connected) from the Navigation tab on the top.
-
+
**Step 3:** You will view the full list of tables or files belonging to the selected source datastore.
-
+
**Step 4:** Click on the vertical ellipse next to the table of your choice and select **Settings** from the dropdown list.
-
+
-A modal window will appear for **“Table Settings”**, where you can manage general and excluding for the selected table.
+A modal window will appear for **Table Settings**, where you can manage general and excluding for the selected table.
-
+
## Excluding Fields
-This configuration allows you to selectively exclude specific fields from containers. These excluded fields will be omitted from check creation during profiling operations while also being hidden in data previews, without requiring a profile run.
+This configuration allows you to selectively exclude specific fields from containers. These excluded fields will be omitted from check creation during profiling operations and will also be hidden in data previews immediately, without requiring a profile run.
This can be helpful when dealing with sensitive data, irrelevant information, or large datasets where you want to focus on specific fields.
-
+
### Benefits of Excluding Fields
@@ -52,7 +52,7 @@ Excluding fields will permanently remove them from profile creation and data pre
The "infer data type" option in containers allows the system to automatically determine the appropriate data types (e.g., fractional, integer, date) for columns within your data containers. This setting is configurable for both JDBC and DFS containers.
-
+
### Behavior in JDBC Datastores
@@ -90,7 +90,7 @@ In some cases, you may have multiple files that share the same schema but don't
### Explore Deeper Knowledge
-If you want to go deeper into the knowledge or if you are curious and want to learn more about DFS filename globbing, you can explore our comprehensive guide here: [How DFS Filename Globbing Works](https://userguide.qualytics.io/dfs-globbing/how-dfs-filename-globbing-works/).
+If you want to go deeper into the knowledge or if you are curious and want to learn more about DFS filename globbing, you can explore our comprehensive guide here: [How DFS Filename Globbing Works](../../dfs-globbing/how-dfs-filename-globbing-works.md){target="_blank"}.
### Important Considerations
diff --git a/docs/container/overview-of-grouping.md b/docs/container/settings/grouping.md
similarity index 73%
rename from docs/container/overview-of-grouping.md
rename to docs/container/settings/grouping.md
index 51366eb87d..936da38273 100644
--- a/docs/container/overview-of-grouping.md
+++ b/docs/container/settings/grouping.md
@@ -4,27 +4,27 @@ Grouping is a fundamental aspect of data analysis, allowing users to organize da
Let’s get started 🚀
-## Managing an Grouping
+## Managing a Grouping
**Step 1:** Log in to your Qualytics account and select the source datastore (**JDBC** or **DFS**) from the left menu that you want to manage.
-
+
**Step 2:** Select Tables (if JDBC datastore is connected) or File Patterns (if DFS datastore is connected) from the Navigation tab on the top.
-
+
**Step 3:** You will view the full list of tables or files belonging to the selected source datastore.
-
+
**Step 4:** Click on the vertical ellipse next to the table of your choice and select **Settings** from the dropdown list.
-
+
-A modal window will appear for **“Table Settings”**, where you can manage grouping for the selected table.Use the **Grouping** section to organize fields, with a warning to avoid large row groupings to maintain performance. Add grouping logic via **Group Criteria**.
+A modal window will appear for **Table Settings**, where you can manage grouping for the selected table. Use the **Grouping** section to organize fields, avoiding large row groupings to maintain performance. Add grouping logic via **Group Criteria**.
-
+
## Usage
@@ -34,15 +34,15 @@ The `grouping` parameter accepts a list of lists of field names. Each inner list
Consider the following examples of `grouping` configurations:
-1. `["store_id"]`: Groups data within the container by the `store_id` field.
-2. `["store_id", "month"]`: Groups data first by `store_id`, then by `month`.
-3. `["store_id", "state"]`: Groups data first by `store_id`, then by `state`.
+**1.** `["store_id"]`: Groups data within the container by the `store_id` field.
+**2.** `["store_id", "month"]`: Groups data first by `store_id`, then by `month`.
+**3.** `["store_id", "state"]`: Groups data first by `store_id`, then by `state`.
By specifying different combinations of fields in the `grouping` parameter, users can tailor the grouping behavior to suit their analytical needs.
## Impact on Data Profiles
-The grouping has implications for various aspects of data profiling and analysis within Qualytics.
+The grouping configuration has implications for various aspects of data profiling and analysis within Qualytics.
### Field Profiles
diff --git a/docs/container/overview-of-identifiers.md b/docs/container/settings/identifiers.md
similarity index 86%
rename from docs/container/overview-of-identifiers.md
rename to docs/container/settings/identifiers.md
index a3a73f89f2..20d1ad8b90 100644
--- a/docs/container/overview-of-identifiers.md
+++ b/docs/container/settings/identifiers.md
@@ -1,6 +1,6 @@
# Identifiers
-An **Identifier** is a field that can be used to help load the desired data from a table in support of analysis. There are two types of identifiers that can be declared for a table:
+An **Identifier** is a field that can be used to help load the desired data from a table in support of analysis. There are two types of identifiers that can be declared for a table:
* **Incremental Field:** Track records in the table that have already been scanned in order to support Scan operations that only analyze new (not previously scanned) data.
@@ -10,23 +10,23 @@ An **Identifier** is a field that can be used to help load the desired data from
**Step 1:** Log in to your Qualytics account and select the source datastore (**JDBC** or **DFS**) from the left menu that you want to manage.
-
+
**Step 2:** Select Tables (if JDBC datastore is connected) or File Patterns (if DFS datastore is connected) from the Navigation tab on the top.
-
+
**Step 3:** You will view the full list of tables or files belonging to the selected source datastore.
-
+
**Step 4:** Click on the vertical ellipse next to the table of your choice and select **Settings** from the dropdown list.
-
+
-A modal window will appear for **“Table Settings”**, where you can manage identifiers for the selected table.
+A modal window will appear for **Table Settings**, where you can manage identifiers for the selected table.
-
+
## Incremental Strategy
@@ -36,26 +36,26 @@ This approach is essential for efficient data processing, as it is specifically
This allows for scan operations to focus exclusively on new records that have not been previously scanned, thereby optimizing the scanning process and ensuring that only the most recent and relevant data is analyzed.
-
+
-| No | Strategy Option | Description |
+| No. | Strategy Option | Description |
| :---- | :---- | :---- |
-| 1 | **None** | No incremental strategy, it will run full. |
+| 1 | **None** | No incremental strategy; runs full table scans. |
| 2 | **Last Modified** | - Available types are **Date** or **Timestamp** was last modified.
- Uses a "last modified column" to track changes in the data set.
- This column typically contains a timestamp or date value indicating when a record was last modified.
- The system compares the "last modified column" to a previous timestamp or date, updating only the records modified since that time. |
| 3 | **Batch Value** | - Available types are **Integral** or **Fractional**.
- Uses a "batch value column" to track changes in the data set.
- This column typically contains an incremental value that increases as new data is added.
- The system compares the current "batch value" with the previous one, updating only records with a higher "batch value".
- Useful when data comes from a system without a modification timestamp. |
| 4 | **Postgres Commit Timestamp Tracking** | - Utilizes commit timestamps for change tracking. |
Availability based on technologies:
-| Option | Availability |
+| Option | Availability |
|-----------------------------------------|---------------|
| **Last Modified** | All |
| **Batch Value** | All |
| **Postgres Commit Timestamp Tracking** | PostgreSQL |
!!! info
- - All options are useful for incremental strategy, it depends on the availability of the data and how it is modeled.
- - The 3 options will allow you to track and process only the data that has changed since the last time the system was run, reducing the amount of data that needs to be read and processed, and increasing the efficiency of your system.
+ - All options are useful for incremental strategy; it depends on the availability of the data and how it is modeled.
+ - The three options will allow you to track and process only the data that has changed since the last time the system was run, reducing the amount of data that needs to be read and processed, and increasing the efficiency of your system.
### Incremental Strategy with DFS (Distributed File System)
@@ -69,7 +69,7 @@ This automated process means that DFS users do not need to manually configure th
**Sample Data**
-| O_ORDERKEY | O_PAYMENT_DETAILS |LAST_MODIFIED |
+| O_ORDERKEY | O_PAYMENT_DETAILS |LAST_MODIFIED |
|------------|----------------------------------------------------------------------------------|-------------------------
| 1 | {"date": "2023-09-25", "amount": 250.50, "credit_card": "5105105105105100"} | 2023-09-25 10:00:00
| 2 | {"date": "2023-09-25", "amount": 150.75, "credit_card": "4111-1111-1111-1111"} | 2023-09-25 10:30:00
@@ -115,7 +115,7 @@ This not only increases the efficiency of data processing but also ensures a mor
The ideal Partition Identifier is an Incremental Identifier of type **datetime** such as a last-modified field, however, alternatives are automatically identified and set during a Catalog operation.
-
+
!!! info
* **Partition Field Selection**: When selecting a partition field for a table during catalog operation, we will attempt to select a field with no nulls where possible.
diff --git a/mkdocs.yml b/mkdocs.yml
index dccfbd92f3..c982a8ab93 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -83,19 +83,39 @@ nav:
- Edit Enrichment: enrichment/edit-enrichment.md
- Delete Enrichment: enrichment/delete-enrichment.md
- Containers:
- - Overview: container/container.md
- - Data Preview: container/data-preview.md
- - Computed Tables and Files: container/computed-tables-and-files.md
+ - Overview: container/overview.md
+ - Container Types: container/container-types.md
+ - Container Attributes: container/container-attributes.md
+ - Actions on Container: container/actions-on-container.md
+ - Manage Tables and Files:
+ - Settings For JDBC Table: container/manage-tables-and-files/setting-for-jdbc-table.md
+ - Settings For DFS Files Pattern: container/manage-tables-and-files/setting-for-dfs-files-pattern.md
+ - Identifiers: container/settings/identifiers.md
+ - Grouping: container/settings/grouping.md
+ - General: container/settings/general.md
+ - Add Checks: container/manage-tables-and-files/add-checks.md
+ - Run: container/manage-tables-and-files/run.md
+ - Observability Settings: container/manage-tables-and-files/observability-settings.md
+ - Export: container/manage-tables-and-files/export.md
+ - Materialize: container/manage-tables-and-files/materialize.md
+ - Delete: container/manage-tables-and-files/delete.md
+ - Mark Tables & Files as Favorite: container/manage-tables-and-files/mark-tables-&-files-as-favorite.md
+ - Field Profiles: container/field-profiles.md
+ - Computed Tables and Files:
+ - Overview: container/computed-tables-and-files/overview.md
+ - Computed Tables: container/computed-tables-and-files/computed-tables.md
+ - Computed Files: container/computed-tables-and-files/computed-files.md
+      - Computed Table vs File: container/computed-tables-and-files/computed-table-vs-file.md
- Computed Join: container/computed-join.md
- - Manage Tables and Files: container/manage-tables-and-files.md
- - Computed Fields: container/overview-of-a-computed-field.md
+ - Computed Fields:
+ - Overview: container/computed-fields/overview.md
+ - Transformation Types: container/computed-fields/transformation-types.md
+ - Computed Fields Details: container/computed-fields/computed-fields-details.md
+ - Add Computed Fields: container/computed-fields/add-computed-fields.md
+ - Data Preview: container/data-preview.md
- Enrichment Operation:
- - Export Operation: container/export-operation.md
- - Materialize Operation: container/materialize-operation.md
- - Settings:
- - Identifiers: container/overview-of-identifiers.md
- - Grouping: container/overview-of-grouping.md
- - General: container/overview-of-infer-data-type.md
+ - Export Operation: container/enrichment-operation/export-operation.md
+ - Materialize Operation: container/enrichment-operation/materialize-operation.md
- Weight:
- Weight Mechanism: weight/weighting.md
- Data Quality Checks: