-This section will not appear if you are using Postgres, as all values are inferred from the project's connection. Use [extended attributes](/docs/deploy/deploy-environments#extended-attributes) to override these values.
+This section will not appear if you are using Postgres, as all values are inferred from the project's connection. Use [extended attributes](/docs/dbt-cloud-environments#extended-attributes) to override these values.
-This section will not appear if you are using Redshift, as all values are inferred from the project's connection. Use [extended attributes](/docs/deploy/deploy-environments#extended-attributes) to override these values.
+This section will not appear if you are using Redshift, as all values are inferred from the project's connection. Use [extended attributes](/docs/dbt-cloud-environments#extended-attributes) to override these values.
@@ -141,13 +141,13 @@ This section will not appear if you are using Redshift, as all values are inferr
-This section will not appear if you are using Bigquery, as all values are inferred from the project's connection. Use [extended attributes](/docs/deploy/deploy-environments#extended-attributes) to override these values.
+This section will not appear if you are using BigQuery, as all values are inferred from the project's connection. Use [extended attributes](/docs/dbt-cloud-environments#extended-attributes) to override these values.
-This section will not appear if you are using Spark, as all values are inferred from the project's connection. Use [extended attributes](/docs/deploy/deploy-environments#extended-attributes) to override these values.
+This section will not appear if you are using Spark, as all values are inferred from the project's connection. Use [extended attributes](/docs/dbt-cloud-environments#extended-attributes) to override these values.
@@ -168,6 +168,8 @@ This section will not appear if you are using Spark, as all values are inferred
This section allows you to determine the credentials that should be used when connecting to your warehouse. The authentication methods may differ depending on the warehouse and dbt Cloud tier you are on.
+For all warehouses, use [extended attributes](/docs/dbt-cloud-environments#extended-attributes) to override missing or inactive (grayed-out) settings. For credentials, we recommend wrapping the values in [environment variables](/docs/build/environment-variables) (for example, `password: '{{ env_var(''DBT_ENV_SECRET_PASSWORD'') }}'`) to avoid displaying the secret value in the text box and the logs.
+
@@ -221,6 +223,8 @@ This section allows you to determine the credentials that should be used when co
- **Dataset**: Target dataset
+Use [extended attributes](/docs/dbt-cloud-environments#extended-attributes) to override missing or inactive (grayed-out) settings. For credentials, we recommend wrapping the values in [environment variables](/docs/build/environment-variables) (for example, `password: '{{ env_var(''DBT_ENV_SECRET_PASSWORD'') }}'`) to avoid displaying the secret value in the text box and the logs.
+
diff --git a/website/docs/docs/deploy/deployment-tools.md b/website/docs/docs/deploy/deployment-tools.md
index cca2368f38a..81c798b7d8c 100644
--- a/website/docs/docs/deploy/deployment-tools.md
+++ b/website/docs/docs/deploy/deployment-tools.md
@@ -5,13 +5,13 @@ sidebar_label: "Integrate with other tools"
pagination_next: null
---
-Alongside [dbt Cloud](/docs/deploy/jobs), discover other ways to schedule and run your dbt jobs with the help of tools such as Airflow, Prefect, Dagster, automation server, Cron, and Azure Data Factory (ADF),
+Alongside [dbt Cloud](/docs/deploy/jobs), discover other ways to schedule and run your dbt jobs with the help of tools such as the ones described on this page.
Build and install these tools to automate your data workflows, trigger dbt jobs (including those hosted on dbt Cloud), and enjoy a hassle-free experience, saving time and increasing efficiency.
## Airflow
-If your organization is using [Airflow](https://airflow.apache.org/), there are a number of ways you can run your dbt jobs, including:
+If your organization uses [Airflow](https://airflow.apache.org/), there are a number of ways you can run your dbt jobs, including:
@@ -33,9 +33,13 @@ Invoking dbt Core jobs through the [BashOperator](https://registry.astronomer.io
For more details on both of these methods, including example implementations, check out [this guide](https://docs.astronomer.io/learn/airflow-dbt-cloud).
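+
+As an illustration of the dbt Cloud provider route, here is a minimal DAG sketch. It assumes the `apache-airflow-providers-dbt-cloud` package is installed, an Airflow connection named `dbt_cloud_default` holds your dbt Cloud API token and account ID, and the job ID is a placeholder for one of your own jobs:
+
+```python
+from pendulum import datetime
+
+from airflow.decorators import dag
+from airflow.providers.dbt.cloud.operators.dbt import DbtCloudRunJobOperator
+
+
+@dag(start_date=datetime(2024, 1, 1), schedule="@daily", catchup=False)
+def trigger_dbt_cloud_job():
+    # Triggers the dbt Cloud job and polls until it reaches a terminal state
+    DbtCloudRunJobOperator(
+        task_id="run_dbt_cloud_job",
+        dbt_cloud_conn_id="dbt_cloud_default",
+        job_id=12345,  # placeholder -- replace with your dbt Cloud job ID
+        check_interval=60,
+        timeout=3600,
+    )
+
+
+trigger_dbt_cloud_job()
+```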
+## Automation servers
+
+Automation servers (such as CodeDeploy, GitLab CI/CD ([video](https://youtu.be/-XBIIY2pFpc?t=1301)), Bamboo, and Jenkins) can be used to schedule bash commands for dbt. They also provide a UI for viewing command-line logs and integrate with your git repository.
+
## Azure Data Factory
-Integrate dbt Cloud and [Azure Data Factory](https://learn.microsoft.com/en-us/azure/data-factory/) (ADF) for a smooth data process, from data ingestion to data transformation. You can seamlessly trigger dbt Cloud jobs upon completion of ingestion jobs by using the [dbt API](/docs/dbt-cloud-apis/overview) in ADF. Need help building this out? [Contact us](https://www.getdbt.com/contact/) today!
+Integrate dbt Cloud and [Azure Data Factory](https://learn.microsoft.com/en-us/azure/data-factory/) (ADF) for a smooth data process from data ingestion to data transformation. You can seamlessly trigger dbt Cloud jobs upon completion of ingestion jobs by using the [dbt API](/docs/dbt-cloud-apis/overview) in ADF.
The following video provides you with a detailed overview of how to trigger a dbt Cloud job via the API in Azure Data Factory.
@@ -53,10 +57,42 @@ To use the dbt API to trigger a job in dbt Cloud through ADF:
5. Trigger the pipeline in ADF to start the dbt Cloud job and monitor the status of the dbt Cloud job in ADF.
6. In dbt Cloud, you can check the status of the job and how it was triggered in dbt Cloud.
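+
+For reference, the request ADF sends (typically from a Web activity) to trigger the job can be sketched in Python. The account ID, job ID, and token below are placeholders, and the access URL can differ by dbt Cloud region:
+
+```python
+import requests
+
+ACCOUNT_ID = 1234      # placeholder -- your dbt Cloud account ID
+JOB_ID = 5678          # placeholder -- the job to trigger
+API_TOKEN = "***"      # read from a secret store, never hardcode
+
+# Trigger a run of the dbt Cloud job
+response = requests.post(
+    f"https://cloud.getdbt.com/api/v2/accounts/{ACCOUNT_ID}/jobs/{JOB_ID}/run/",
+    headers={"Authorization": f"Token {API_TOKEN}"},
+    json={"cause": "Triggered by Azure Data Factory"},
+)
+response.raise_for_status()
+run_id = response.json()["data"]["id"]  # keep this to poll the run's status
+```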
+## Cron
+
+Cron is a decent way to schedule bash commands. However, while it may seem like an easy route to schedule a job, writing the code to handle everything a production deployment needs (logging, alerting, retries) often makes this route more complex than the other options listed here.
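+
+For example, a minimal cron wrapper might look like the following sketch (paths and schedule are placeholders), and even this omits the alerting and retry logic a production deployment would need:
+
+```python
+import logging
+import subprocess
+import sys
+
+# Example crontab entry (daily at 06:00):
+# 0 6 * * * /usr/bin/python3 /opt/dbt/run_dbt.py >> /var/log/dbt_cron.log 2>&1
+
+logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
+
+result = subprocess.run(
+    ["dbt", "build", "--project-dir", "/opt/dbt/my_project"],
+    capture_output=True,
+    text=True,
+)
+logging.info(result.stdout)
+if result.returncode != 0:
+    logging.error(result.stderr)
+    sys.exit(result.returncode)  # surface the failure to cron's mail/logs
+```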
+
+## Dagster
+
+If your organization uses [Dagster](https://dagster.io/), you can use the [dagster_dbt](https://docs.dagster.io/_apidocs/libraries/dagster-dbt) library to integrate dbt commands into your pipelines. This library supports the execution of dbt through dbt Cloud or dbt Core. Running dbt from Dagster automatically aggregates metadata about your dbt runs. Refer to the [example pipeline](https://dagster.io/blog/dagster-dbt) for details.
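+
+A minimal sketch of the dbt Core path with the current `dagster_dbt` asset API follows (the project path is a placeholder, and the library's API has changed across releases, so check the version you have installed):
+
+```python
+from pathlib import Path
+
+from dagster import AssetExecutionContext, Definitions
+from dagster_dbt import DbtCliResource, dbt_assets
+
+DBT_PROJECT_DIR = Path("/opt/dbt/my_project")  # placeholder path
+
+
+# Loads every model in the dbt manifest as a Dagster asset
+@dbt_assets(manifest=DBT_PROJECT_DIR / "target" / "manifest.json")
+def my_dbt_assets(context: AssetExecutionContext, dbt: DbtCliResource):
+    # Streams `dbt build` events so run metadata lands in Dagster
+    yield from dbt.cli(["build"], context=context).stream()
+
+
+defs = Definitions(
+    assets=[my_dbt_assets],
+    resources={"dbt": DbtCliResource(project_dir=str(DBT_PROJECT_DIR))},
+)
+```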
+
+## Databricks workflows
+
+Use Databricks workflows to call the dbt Cloud job API, which has several benefits, such as integration with other ETL processes, use of dbt Cloud job features, separation of concerns, and triggering jobs based on custom conditions or logic. These advantages lead to more modularity, efficient debugging, and flexibility in scheduling dbt Cloud jobs.
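+
+Inside a Databricks workflow, a Python task can trigger the job and then poll the dbt Cloud runs endpoint until it completes. This is a sketch only (IDs are placeholders; store the token in a Databricks secret scope), and the guide linked below covers the full setup:
+
+```python
+import time
+
+import requests
+
+ACCOUNT_ID = 1234                          # placeholder account ID
+RUN_ID = 987654                            # returned when you trigger the job
+HEADERS = {"Authorization": "Token ***"}   # fetch from a secret scope
+
+# dbt Cloud terminal run statuses: 10 = Success, 20 = Error, 30 = Cancelled
+TERMINAL = {10: "Success", 20: "Error", 30: "Cancelled"}
+
+while True:
+    run = requests.get(
+        f"https://cloud.getdbt.com/api/v2/accounts/{ACCOUNT_ID}/runs/{RUN_ID}/",
+        headers=HEADERS,
+    ).json()["data"]
+    if run["status"] in TERMINAL:
+        print(f"dbt Cloud run finished: {TERMINAL[run['status']]}")
+        if run["status"] != 10:
+            raise RuntimeError("dbt Cloud run did not succeed")  # fail the task
+        break
+    time.sleep(30)  # poll every 30 seconds until a terminal state
+```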
+
+For more info, refer to the guide on [Databricks workflows and dbt Cloud jobs](/guides/how-to-use-databricks-workflows-to-run-dbt-cloud-jobs).
+
+## Kestra
+
+If your organization uses [Kestra](http://kestra.io/), you can leverage the [dbt plugin](https://kestra.io/plugins/plugin-dbt) to orchestrate dbt Cloud and dbt Core jobs. Kestra's user interface (UI) has built-in [Blueprints](https://kestra.io/docs/user-interface-guide/blueprints), providing ready-to-use workflows. Navigate to the Blueprints page in the left navigation menu and [select the dbt tag](https://demo.kestra.io/ui/blueprints/community?selectedTag=36) to find several examples of scheduling dbt Core commands and dbt Cloud jobs as part of your data pipelines. After each scheduled or ad hoc workflow execution, the Outputs tab in the Kestra UI allows you to download and preview all dbt build artifacts. The Gantt and Topology views additionally render the metadata to visualize dependencies and runtimes of your dbt models and tests. The dbt Cloud task provides convenient links to easily navigate between the Kestra and dbt Cloud UIs.
+
+## Orchestra
+
+If your organization uses [Orchestra](https://getorchestra.io), you can trigger dbt jobs using the dbt Cloud API. Create an API token from your dbt Cloud account and use this to authenticate Orchestra in the [Orchestra Portal](https://app.getorchestra.io). For details, refer to the [Orchestra docs on dbt Cloud](https://orchestra-1.gitbook.io/orchestra-portal/integrations/transformation/dbt-cloud).
+
+Orchestra automatically collects metadata from your runs so you can view your dbt jobs in the context of the rest of your data stack.
+
+The following is an example of the run details in dbt Cloud for a job triggered by Orchestra:
+
+
+
+The following is an example of viewing lineage in Orchestra for dbt jobs:
+
+
+
## Prefect
-If your organization is using [Prefect](https://www.prefect.io/), the way you will run your jobs depends on the dbt version you're on, and whether you're orchestrating dbt Cloud or dbt Core jobs. Refer to the following variety of options:
+If your organization uses [Prefect](https://www.prefect.io/), how you run your jobs depends on the dbt version you're using and whether you're orchestrating dbt Cloud or dbt Core jobs. Refer to the following options:
@@ -106,30 +142,6 @@ If your organization is using [Prefect](https://www.prefect.io/), the way you wi
-## Dagster
-
-If your organization is using [Dagster](https://dagster.io/), you can use the [dagster_dbt](https://docs.dagster.io/_apidocs/libraries/dagster-dbt) library to integrate dbt commands into your pipelines. This library supports the execution of dbt through dbt Cloud, dbt Core, and the dbt RPC server. Running dbt from Dagster automatically aggregates metadata about your dbt runs. Refer to the [example pipeline](https://dagster.io/blog/dagster-dbt) for details.
-
-## Kestra
-
-If your organization uses [Kestra](http://kestra.io/), you can leverage the [dbt plugin](https://kestra.io/plugins/plugin-dbt) to orchestrate dbt Cloud and dbt Core jobs. Kestra's user interface (UI) has built-in [Blueprints](https://kestra.io/docs/user-interface-guide/blueprints), providing ready-to-use workflows. Navigate to the Blueprints page in the left navigation menu and [select the dbt tag](https://demo.kestra.io/ui/blueprints/community?selectedTag=36) to find several examples of scheduling dbt Core commands and dbt Cloud jobs as part of your data pipelines. After each scheduled or ad-hoc workflow execution, the Outputs tab in the Kestra UI allows you to download and preview all dbt build artifacts. The Gantt and Topology view additionally render the metadata to visualize dependencies and runtimes of your dbt models and tests. The dbt Cloud task provides convenient links to easily navigate between Kestra and dbt Cloud UI.
-
-## Automation servers
-
-Automation servers, like CodeDeploy, GitLab CI/CD ([video](https://youtu.be/-XBIIY2pFpc?t=1301)), Bamboo and Jenkins, can be used to schedule bash commands for dbt. They also provide a UI to view logging to the command line, and integrate with your git repository.
-
-## Cron
-
-Cron is a decent way to schedule bash commands. However, while it may seem like an easy route to schedule a job, writing code to take care of all of the additional features associated with a production deployment often makes this route more complex compared to other options listed here.
-
-## Databricks workflows
-
-Use Databricks workflows to call the dbt Cloud job API, which has several benefits such as integration with other ETL processes, utilizing dbt Cloud job features, separation of concerns, and custom job triggering based on custom conditions or logic. These advantages lead to more modularity, efficient debugging, and flexibility in scheduling dbt Cloud jobs.
-
-For more info, refer to the guide on [Databricks workflows and dbt Cloud jobs](/guides/how-to-use-databricks-workflows-to-run-dbt-cloud-jobs).
-
-
-
## Related docs
- [dbt Cloud plans and pricing](https://www.getdbt.com/pricing/)
diff --git a/website/docs/docs/deploy/job-commands.md b/website/docs/docs/deploy/job-commands.md
index 26fe1931db6..8117178b2d6 100644
--- a/website/docs/docs/deploy/job-commands.md
+++ b/website/docs/docs/deploy/job-commands.md
@@ -35,7 +35,7 @@ Every job invocation automatically includes the [`dbt deps`](/reference/commands
For every job, you have the option to select the [Generate docs on run](/docs/collaborate/build-and-view-your-docs) or [Run source freshness](/docs/deploy/source-freshness) checkboxes, enabling you to run the commands automatically.
-**Job outcome Generate docs on run checkbox** — dbt Cloud executes the `dbt docs generate` command, _after_ the listed commands. If that particular run step in your job fails, the job can still succeed if all subsequent run steps are successful. Read [Build and view your docs](/docs/collaborate/build-and-view-your-docs) for more info.
+**Job outcome Generate docs on run checkbox** — dbt Cloud executes the `dbt docs generate` command, _after_ the listed commands. If that particular run step in your job fails, the job can still succeed if all subsequent run steps are successful. Read [Set up documentation job](/docs/collaborate/build-and-view-your-docs) for more info.
**Job outcome Source freshness checkbox** — dbt Cloud executes the `dbt source freshness` command as the first run step in your job. If that particular run step in your job fails, the job can still succeed if all subsequent run steps are successful. Read [Source freshness](/docs/deploy/source-freshness) for more info.
diff --git a/website/docs/docs/deploy/source-freshness.md b/website/docs/docs/deploy/source-freshness.md
index ab267b6d067..a409c01f82c 100644
--- a/website/docs/docs/deploy/source-freshness.md
+++ b/website/docs/docs/deploy/source-freshness.md
@@ -12,7 +12,7 @@ dbt Cloud provides a helpful interface around dbt's [source data freshness](/doc
[`dbt build`](reference/commands/build) does _not_ include source freshness checks when building and testing resources in your DAG. Instead, you can use one of these common patterns for defining jobs:
- Add `dbt build` to the run step to run models, tests, and so on.
-- Select the **Generate docs on run** checkbox to automatically [generate project docs](/docs/collaborate/build-and-view-your-docs#set-up-a-documentation-job).
+- Select the **Generate docs on run** checkbox to automatically [generate project docs](/docs/collaborate/build-and-view-your-docs).
- Select the **Run source freshness** checkbox to enable [source freshness](#checkbox) as the first step of the job.
@@ -42,4 +42,4 @@ It's important that your freshness jobs run frequently enough to snapshot data l
## Further reading
- Refer to [Artifacts](/docs/deploy/artifacts) for more info on how to create dbt Cloud artifacts, share links to the latest documentation, and share source freshness reports with your team.
-- Source freshness for Snowflake is calculated using the `LAST_ALTERED` column. Read about the limitations in [Snowflake configs](/reference/resource-configs/snowflake-configs#source-freshness-known-limitation).
\ No newline at end of file
+- Source freshness for Snowflake is calculated using the `LAST_ALTERED` column. Read about the limitations in [Snowflake configs](/reference/resource-configs/snowflake-configs#source-freshness-known-limitation).
diff --git a/website/docs/docs/introduction.md b/website/docs/docs/introduction.md
index 980915a2c42..5301dae396d 100644
--- a/website/docs/docs/introduction.md
+++ b/website/docs/docs/introduction.md
@@ -61,7 +61,7 @@ As a dbt user, your main focus will be on writing models (select queries) that r
| Handle boilerplate code to materialize queries as relations | For each model you create, you can easily configure a *materialization*. A materialization represents a build strategy for your select query – the code behind a materialization is robust, boilerplate SQL that wraps your select query in a statement to create a new, or update an existing, relation. Read more about [Materializations](/docs/build/materializations).|
| Use a code compiler | SQL files can contain Jinja, a lightweight templating language. Using Jinja in SQL provides a way to use control structures in your queries. For example, `if` statements and `for` loops. It also enables repeated SQL to be shared through `macros`. Read more about [Macros](/docs/build/jinja-macros).|
| Determine the order of model execution | Often, when transforming data, it makes sense to do so in a staged approach. dbt provides a mechanism to implement transformations in stages through the [ref function](/reference/dbt-jinja-functions/ref). Rather than selecting from existing tables and views in your warehouse, you can select from another model.|
-| Document your dbt project | In dbt Cloud, you can auto-generate the documentation when your dbt project runs. dbt provides a mechanism to write, version-control, and share documentation for your dbt models. You can write descriptions (in plain text or markdown) for each model and field. Read more about the [Documentation](/docs/collaborate/documentation).|
+| Document your dbt project | In dbt Cloud, you can auto-generate the documentation when your dbt project runs. dbt provides a mechanism to write, version-control, and share documentation for your dbt models. You can write descriptions (in plain text or markdown) for each model and field. Read more about the [Documentation](/docs/build/documentation).|
| Test your models | Tests provide a way to improve the integrity of the SQL in each model by making assertions about the results generated by a model. Build, test, and run your project with a button click or by using the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) command bar. Read more about writing tests for your models [Testing](/docs/build/data-tests)|
| Manage packages | dbt ships with a package manager, which allows analysts to use and publish both public and private repositories of dbt code which can then be referenced by others. Read more about [Package Management](/docs/build/packages). |
| Load seed files| Often in analytics, raw values need to be mapped to a more readable value (for example, converting a country-code to a country name) or enriched with static or infrequently changing data. These data sources, known as seed files, can be saved as a CSV file in your `project` and loaded into your data warehouse using the `seed` command. Read more about [Seeds](/docs/build/seeds).|
diff --git a/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md b/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md
index f1e631f0d78..9e254de92d8 100644
--- a/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md
+++ b/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md
@@ -8,7 +8,7 @@ You can run your dbt projects with [dbt Cloud](/docs/cloud/about-cloud/dbt-cloud
- **dbt Cloud**: A hosted application where you can develop directly from a web browser using the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud). It also natively supports developing using a command line interface, [dbt Cloud CLI](/docs/cloud/cloud-cli-installation). Among other features, dbt Cloud provides:
- Development environment to help you build, test, run, and [version control](/docs/collaborate/git-version-control) your project faster.
- - Share your [dbt project's documentation](/docs/collaborate/build-and-view-your-docs) with your team.
+ - Share your [dbt project's documentation](/docs/build/documentation) with your team.
- Integrates with the dbt Cloud IDE, allowing you to run development tasks and environment in the dbt Cloud UI for a seamless experience.
- The dbt Cloud CLI to develop and run dbt commands against your dbt Cloud development environment from your local command line.
- For more details, refer to [Develop dbt](/docs/cloud/about-develop-dbt).
diff --git a/website/docs/docs/use-dbt-semantic-layer/sl-cache.md b/website/docs/docs/use-dbt-semantic-layer/sl-cache.md
index e88c753ca82..4faa297f4ee 100644
--- a/website/docs/docs/use-dbt-semantic-layer/sl-cache.md
+++ b/website/docs/docs/use-dbt-semantic-layer/sl-cache.md
@@ -18,7 +18,7 @@ While you can use caching to speed up your queries and reduce compute time, know
## Prerequisites
- dbt Cloud [Team or Enterprise](https://www.getdbt.com/) plan.
-- dbt Cloud environments on dbt version 1.8 or higher. Or select [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version).
+- dbt Cloud environments set to versionless by opting to [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version).
- A successful job run and [production environment](/docs/deploy/deploy-environments#set-as-production-environment).
- For declarative caching, you need to have [exports](/docs/use-dbt-semantic-layer/exports) defined in your [saved queries](/docs/build/saved-queries) YAML configuration file.
diff --git a/website/docs/faqs/Docs/_category_.yaml b/website/docs/faqs/Docs/_category_.yaml
index 8c7925dcc15..0a9aa44fe56 100644
--- a/website/docs/faqs/Docs/_category_.yaml
+++ b/website/docs/faqs/Docs/_category_.yaml
@@ -1,10 +1,10 @@
# position: 2.5 # float position is supported
-label: 'dbt Docs'
+label: 'Documentation'
collapsible: true # make the category collapsible
collapsed: true # keep the category collapsed by default
className: red
link:
type: generated-index
- title: dbt Docs FAQs
+ title: Documentation FAQs
customProps:
- description: Frequently asked questions about dbt Docs
+ description: Frequently asked questions about documentation
diff --git a/website/docs/faqs/Docs/long-descriptions.md b/website/docs/faqs/Docs/long-descriptions.md
index cdf15a94120..ef410df0517 100644
--- a/website/docs/faqs/Docs/long-descriptions.md
+++ b/website/docs/faqs/Docs/long-descriptions.md
@@ -31,4 +31,5 @@ If you need more than a sentence to explain a model, you can:
* tempor incididunt ut labore et dolore magna aliqua.
```
-3. Use a [docs block](/docs/collaborate/documentation#using-docs-blocks) to write the description in a separate Markdown file.
+3. Use a [docs block](/docs/build/documentation#using-docs-blocks) to write the description in a separate Markdown file.
diff --git a/website/docs/faqs/Docs/sharing-documentation.md b/website/docs/faqs/Docs/sharing-documentation.md
index 4c6e0e84f77..cff618586ea 100644
--- a/website/docs/faqs/Docs/sharing-documentation.md
+++ b/website/docs/faqs/Docs/sharing-documentation.md
@@ -1,8 +1,12 @@
---
-title: How do I share my documentation with my team members?
+title: How do I access documentation in dbt Explorer?
description: "Use read-only seats to share documentation"
-sidebar_label: 'Share documentation with teammates'
+sidebar_label: 'Access documentation in dbt Explorer'
id: sharing-documentation
---
-If you're using dbt Cloud to deploy your project, and have the [Team plan](https://www.getdbt.com/pricing/), you can have up to 5 read-only users, who will be able access the documentation for your project.
+If you're using dbt Cloud to deploy your project and have the [Team or Enterprise plan](https://www.getdbt.com/pricing/), you can use dbt Explorer to view your project's [resources](/docs/build/projects) (such as models, tests, and metrics) and their lineage to gain a better understanding of its latest production state.
+
+Access dbt Explorer in dbt Cloud by clicking the **Explore** link in the navigation. You can have up to 5 read-only users access the documentation for your project.
+
+dbt Cloud developer plan and dbt Core users can use [dbt Docs](/docs/collaborate/build-and-view-your-docs#dbt-docs), which generates basic documentation but doesn't offer the same speed, metadata, or visibility as dbt Explorer.
diff --git a/website/docs/faqs/Git/github-permissions.md b/website/docs/faqs/Git/github-permissions.md
new file mode 100644
index 00000000000..e3a1740bbab
--- /dev/null
+++ b/website/docs/faqs/Git/github-permissions.md
@@ -0,0 +1,34 @@
+---
+title: "I'm seeing a 'GitHub and dbt Cloud latest permissions' error"
+description: "GitHub and dbt Cloud permissions error"
+sidebar_label: "GitHub and dbt Cloud permissions error"
+---
+
+If you see the error `This account needs to accept the latest permissions for the dbt Cloud GitHub App` in dbt Cloud, it usually means the permissions for the dbt Cloud GitHub App are out of date.
+
+To solve this issue, you'll need to update the permissions for the dbt Cloud GitHub App in your GitHub account. Here are a couple of ways you can do it:
+
+#### Update permissions
+
+A GitHub organization admin will need to update the permissions in GitHub for the dbt Cloud GitHub App. If you're not the admin, reach out to your organization admin to request this. Alternatively, try [disconnecting your GitHub account](#disconnect-github) in dbt Cloud.
+
+1. Go directly to GitHub to determine if any updated permissions are required.
+2. In GitHub, go to your organization **Settings** (or personal if using a non-organization account).
+3. Then navigate to **Applications** to identify any necessary permission changes.
+For more info on GitHub permissions, refer to [access permissions](https://docs.github.com/en/get-started/learning-about-github/access-permissions-on-github).
+
+#### Disconnect GitHub
+
+Disconnect the GitHub and dbt Cloud integration in dbt Cloud.
+
+1. In dbt Cloud, go to **Account Settings**.
+2. In **Projects**, select the project that's experiencing the issue.
+3. Click the repository link under **Repository**.
+4. In the **Repository details** page, click **Edit**.
+5. Click **Disconnect** to remove the GitHub integration.
+6. Go back to your **Project details** page and reconnect your repository by clicking the **Configure Repository** link.
+7. Configure your repository and click **Save**.
+
+
+
+If you've tried these workarounds and are still experiencing this behavior, reach out to the [dbt Support](mailto:support@getdbt.com) team and we'll be happy to help!
diff --git a/website/docs/faqs/Git/gitlab-authentication.md b/website/docs/faqs/Git/gitlab-authentication.md
index 0debdf87873..1d6de32fb6f 100644
--- a/website/docs/faqs/Git/gitlab-authentication.md
+++ b/website/docs/faqs/Git/gitlab-authentication.md
@@ -9,7 +9,7 @@ If you're seeing a 'GitLab Authentication is out of date' 500 server error page
No worries - this is a current issue the dbt Labs team is working on and we have a few workarounds for you to try:
-### 1st Workaround
+#### First workaround
1. Disconnect repo from project in dbt Cloud.
2. Go to Gitlab and click on Settings > Repository.
@@ -18,7 +18,7 @@ No worries - this is a current issue the dbt Labs team is working on and we have
5. You would then need to check Gitlab to make sure that the new deploy key is added.
6. Once confirmed that it's added, refresh dbt Cloud and try developing once again.
-### 2nd Workaround
+#### Second workaround
1. Keep repo in project as is -- don't disconnect.
2. Copy the deploy key generated in dbt Cloud.
diff --git a/website/docs/faqs/Git/run-on-pull.md b/website/docs/faqs/Git/run-on-pull.md
index 3536259bb79..d1b6bfd7524 100644
--- a/website/docs/faqs/Git/run-on-pull.md
+++ b/website/docs/faqs/Git/run-on-pull.md
@@ -12,4 +12,3 @@ If it was added via a deploy key method, you'll want to use the [GitHub auth me
To go ahead and enable 'Run on Pull requests', you'll want to remove dbt Cloud from the Apps & Integration on GitHub and re-integrate it again via the GitHub app method.
If you've tried the workaround above and are still experiencing this behavior - reach out to the Support team at support@getdbt.com and we'll be happy to help!
-
diff --git a/website/docs/guides/building-packages.md b/website/docs/guides/building-packages.md
index cc1ee2f1d74..69f963049ad 100644
--- a/website/docs/guides/building-packages.md
+++ b/website/docs/guides/building-packages.md
@@ -108,7 +108,7 @@ The major exception to this is when working with data sources that benefit from
### Test and document your package
It's critical that you [test](/docs/build/data-tests) your models and sources. This will give your end users confidence that your package is actually working on top of their dataset as intended.
-Further, adding [documentation](/docs/collaborate/documentation) via descriptions will help communicate your package to end users, and benefit their stakeholders that use the outputs of this package.
+Further, adding [documentation](/docs/build/documentation) via descriptions will help communicate your package to end users, and benefit their stakeholders that use the outputs of this package.
### Include useful GitHub artifacts
Over time, we've developed a set of useful GitHub artifacts that make administering our packages easier for us. In particular, we ensure that we include:
- A useful README, that has:
@@ -172,4 +172,4 @@ The release notes should contain an overview of the changes introduced in the ne
Our package registry, [hub.getdbt.com](https://hub.getdbt.com/), gets updated by the [hubcap script](https://github.com/dbt-labs/hubcap). To add your package to hub.getdbt.com, create a PR on the [hubcap repository](https://github.com/dbt-labs/hubcap) to include it in the `hub.json` file.
-
\ No newline at end of file
+
diff --git a/website/docs/guides/core-cloud-2.md b/website/docs/guides/core-cloud-2.md
index a4683ddb6f8..335b164d988 100644
--- a/website/docs/guides/core-cloud-2.md
+++ b/website/docs/guides/core-cloud-2.md
@@ -141,9 +141,9 @@ After [setting the foundations of dbt Cloud](https://docs.getdbt.com/guides/core
Once you’ve confirmed that dbt Cloud orchestration and CI/CD are working as expected, you should pause your current orchestration tool and stop or update your current CI/CD process. This is not relevant if you’re still using an external orchestrator (such as Airflow), and you’ve swapped out `dbt-core` execution for dbt Cloud execution (through the [API](/docs/dbt-cloud-apis/overview)).
Familiarize your team with dbt Cloud's [features](/docs/cloud/about-cloud/dbt-cloud-features) and optimize development and deployment processes. Some key features to consider include:
-- **Version management:** Manage [dbt versions](/docs/dbt-versions/upgrade-dbt-version-in-cloud) and ensure team collaboration with dbt Cloud's one-click feature, removing the hassle of manual updates and version discrepancies. You can **[Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version)** to always get the latest fixes and early access to new functionality for your dbt project.
+- **Version management:** Manage [dbt versions](/docs/dbt-versions/upgrade-dbt-version-in-cloud) and ensure team collaboration with dbt Cloud's one-click feature, removing the hassle of manual updates and version discrepancies. You can go versionless by opting to **[Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version)** to always get the latest features and early access to new functionality for your dbt project.
- **Development tools**: Use the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) or [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) to build, test, run, and version control your dbt projects.
-- **Documentation and Source freshness:** Automate storage of [documentation](/docs/collaborate/documentation) and track [source freshness](/docs/deploy/source-freshness) in dbt Cloud, which streamlines project maintenance.
+- **Documentation and Source freshness:** Automate storage of [documentation](/docs/build/documentation) and track [source freshness](/docs/deploy/source-freshness) in dbt Cloud, which streamlines project maintenance.
- **Notifications and logs:** Receive immediate [notifications](/docs/deploy/monitor-jobs) for job failures, with direct links to the job details. Access comprehensive logs for all job runs to help with troubleshooting.
- **CI/CD:** Use dbt Cloud's [CI/CD](/docs/deploy/ci-jobs) feature to run your dbt projects in a temporary schema whenever new commits are pushed to open pull requests. This helps with catching bugs before deploying to production.
diff --git a/website/docs/guides/core-to-cloud-1.md b/website/docs/guides/core-to-cloud-1.md
index 0a7dbf4dac8..6e130d3a29f 100644
--- a/website/docs/guides/core-to-cloud-1.md
+++ b/website/docs/guides/core-to-cloud-1.md
@@ -206,7 +206,7 @@ To use the [dbt Cloud's job scheduler](/docs/deploy/job-scheduler), set up one e
### Initial setup steps
1. **dbt Core version** — In your environment settings, configure dbt Cloud with the same dbt Core version.
- - Once your full migration is complete, we recommend upgrading your environments to [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to always get the latest features and more. You only need to do this once.
+ - Once your full migration is complete, we recommend upgrading your environments to a versionless experience by opting to [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to always get the latest features and more. You only need to do this once.
2. **Configure your jobs** — [Create jobs](/docs/deploy/deploy-jobs#create-and-schedule-jobs) for scheduled or event-driven dbt jobs. You can use cron execution, manual, pull requests, or trigger on the completion of another job.
- Note that alongside [jobs in dbt Cloud](/docs/deploy/jobs), discover other ways to schedule and run your dbt jobs with the help of other tools. Refer to [Integrate with other tools](/docs/deploy/deployment-tools) for more information.
diff --git a/website/docs/guides/core-to-cloud-3.md b/website/docs/guides/core-to-cloud-3.md
index 8e77ae8ab15..0b63756a41a 100644
--- a/website/docs/guides/core-to-cloud-3.md
+++ b/website/docs/guides/core-to-cloud-3.md
@@ -36,7 +36,7 @@ You may have already started your move to dbt Cloud and are looking for tips to
In dbt Cloud, you can natively connect to your data platform and test its [connection](/docs/connect-adapters) with a click of a button. This is especially useful for users who are new to dbt Cloud or are looking to streamline their connection setup. Here are some tips and caveats to consider:
### Tips
-- Manage [dbt versions](/docs/dbt-versions/upgrade-dbt-version-in-cloud) and ensure team collaboration with dbt Cloud's one-click feature, eliminating the need for manual updates and version discrepancies. You can **[Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version)** to always get the latest fixes and early access to new functionality for your dbt project.
+- Manage [dbt versions](/docs/dbt-versions/upgrade-dbt-version-in-cloud) and ensure team collaboration with dbt Cloud's one-click feature, eliminating the need for manual updates and version discrepancies. You can go versionless and **[Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version)** to always get the latest features and early access to new functionality for your dbt project.
- dbt Cloud supports a whole host of [cloud providers](/docs/cloud/connect-data-platform/about-connections), including Snowflake, Databricks, BigQuery, Fabric, and Redshift (to name a few).
- Use [Extended Attributes](/docs/deploy/deploy-environments#extended-attributes) to set a flexible [profiles.yml](/docs/core/connect-data-platform/profiles.yml) snippet in your dbt Cloud environment settings. It gives you more control over environments (both deployment and development) and extends how dbt Cloud connects to the data platform within a given environment.
- For example, if you have a field in your `profiles.yml` that you’d like to add to the dbt Cloud adapter user interface, you can use Extended Attributes to set it.
diff --git a/website/docs/guides/dbt-python-snowpark.md b/website/docs/guides/dbt-python-snowpark.md
index f6d54ee738f..8125f98d231 100644
--- a/website/docs/guides/dbt-python-snowpark.md
+++ b/website/docs/guides/dbt-python-snowpark.md
@@ -1858,7 +1858,7 @@ We are going to revisit 2 areas of our project to understand our documentation:
- `intermediate.md` file
- `dbt_project.yml` file
-To start, let’s look back at our `intermediate.md` file. We can see that we provided multi-line descriptions for the models in our intermediate models using [docs blocks](/docs/collaborate/documentation#using-docs-blocks). Then we reference these docs blocks in our `.yml` file. Building descriptions with doc blocks in Markdown files gives you the ability to format your descriptions with Markdown and are particularly helpful when building long descriptions, either at the column or model level. In our `dbt_project.yml`, we added `node_colors` at folder levels.
+To start, let’s look back at our `intermediate.md` file. We can see that we provided multi-line descriptions for the models in our intermediate models using [docs blocks](/docs/build/documentation#using-docs-blocks). Then we reference these docs blocks in our `.yml` file. Building descriptions with doc blocks in Markdown files gives you the ability to format your descriptions with Markdown and are particularly helpful when building long descriptions, either at the column or model level. In our `dbt_project.yml`, we added `node_colors` at folder levels.
1. To see all these pieces come together, execute this in the command bar:
@@ -1926,4 +1926,4 @@ Fantastic! You’ve finished the workshop! We hope you feel empowered in using b
For more help and information join our [dbt community Slack](https://www.getdbt.com/community/) which contains more than 50,000 data practitioners today. We have a dedicated slack channel #db-snowflake to Snowflake related content. Happy dbt'ing!
-
\ No newline at end of file
+
diff --git a/website/docs/guides/mesh-qs.md b/website/docs/guides/mesh-qs.md
index be6f2ca205e..d43e2516d23 100644
--- a/website/docs/guides/mesh-qs.md
+++ b/website/docs/guides/mesh-qs.md
@@ -40,7 +40,7 @@ To leverage dbt Mesh, you need the following:
- You must have a [dbt Cloud Enterprise account](https://www.getdbt.com/get-started/enterprise-contact-pricing)
- You have access to a cloud data platform, permissions to load the sample data tables, and dbt Cloud permissions to create new projects.
-- Set your development and deployment [environments](/docs/dbt-cloud-environments) to use dbt [version](/docs/dbt-versions/core) 1.6 or later. You can also opt [Keep on latest version of](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to always use the latest version of dbt.
+- Set your development and deployment [environments](/docs/dbt-cloud-environments) to use dbt [version](/docs/dbt-versions/core) 1.6 or later. You can also opt to go versionless and select [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to always get the most recent features and functionality.
- This guide uses the Jaffle Shop sample data, including `customers`, `orders`, and `payments` tables. Follow the provided instructions to load this data into your respective data platform:
- [Snowflake](https://docs.getdbt.com/guides/snowflake?step=3)
- [Databricks](https://docs.getdbt.com/guides/databricks?step=3)
diff --git a/website/docs/guides/productionize-your-dbt-databricks-project.md b/website/docs/guides/productionize-your-dbt-databricks-project.md
index 33f25070bdb..bada787e01f 100644
--- a/website/docs/guides/productionize-your-dbt-databricks-project.md
+++ b/website/docs/guides/productionize-your-dbt-databricks-project.md
@@ -197,4 +197,4 @@ To get the most out of both tools, you can use the [persist docs config](/refere
- [Databricks + dbt Cloud Quickstart Guide](/guides/databricks)
- Reach out to your Databricks account team to get access to preview features on Databricks.
-
\ No newline at end of file
+
diff --git a/website/docs/guides/sl-snowflake-qs.md b/website/docs/guides/sl-snowflake-qs.md
index 4310710383c..9fb42fe1828 100644
--- a/website/docs/guides/sl-snowflake-qs.md
+++ b/website/docs/guides/sl-snowflake-qs.md
@@ -114,7 +114,7 @@ Open a new tab and follow these quick steps for account setup and data loading i
-- Production and development environments must be on [dbt version 1.6 or higher](/docs/dbt-versions/upgrade-dbt-version-in-cloud). Alternatively, set your environment to[ Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to always remain on the latest version.
+- Production and development environments must be on [dbt version 1.6 or higher](/docs/dbt-versions/upgrade-dbt-version-in-cloud). Alternatively, set your environment to "versionless" by selecting [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to always get the latest updates.
- Create a [trial Snowflake account](https://signup.snowflake.com/):
- Select the Enterprise Snowflake edition with ACCOUNTADMIN access. Consider organizational questions when choosing a cloud provider, refer to Snowflake's [Introduction to Cloud Platforms](https://docs.snowflake.com/en/user-guide/intro-cloud-platforms).
- Select a cloud provider and region. All cloud providers and regions will work so choose whichever you prefer.
@@ -698,10 +698,10 @@ semantic_models:
type: foreign
# Newly added
dimensions:
- - name: order_date
- type: time
- type_params:
- time_granularity: day
+ - name: order_date
+ type: time
+ type_params:
+ time_granularity: day
```