diff --git a/website/blog/2021-02-05-dbt-project-checklist.md b/website/blog/2021-02-05-dbt-project-checklist.md index 9820c279b0f..efa7ca61b0e 100644 --- a/website/blog/2021-02-05-dbt-project-checklist.md +++ b/website/blog/2021-02-05-dbt-project-checklist.md @@ -173,8 +173,8 @@ This post is the checklist I created to guide our internal work, and I’m shari Useful Links -* [FAQs for documentation](/docs/collaborate/documentation#faqs) -* [Doc blocks](/docs/collaborate/documentation#using-docs-blocks) +* [FAQs for documentation](/docs/build/documentation#faqs) +* [Doc blocks](/docs/build/documentation#using-docs-blocks) ## ✅ dbt Cloud specifics ---------------------------------------------------------------------------------------------------------------------------------------------------------- diff --git a/website/blog/2021-12-05-how-to-build-a-mature-dbt-project-from-scratch.md b/website/blog/2021-12-05-how-to-build-a-mature-dbt-project-from-scratch.md index 52b2746ca14..2375d31d448 100644 --- a/website/blog/2021-12-05-how-to-build-a-mature-dbt-project-from-scratch.md +++ b/website/blog/2021-12-05-how-to-build-a-mature-dbt-project-from-scratch.md @@ -87,7 +87,7 @@ The most important thing we’re introducing when your project is an infant is t * Introduce modularity with [{{ ref() }}](/reference/dbt-jinja-functions/ref) and [{{ source() }}](/reference/dbt-jinja-functions/source) -* [Document](/docs/collaborate/documentation) and [test](/docs/build/data-tests) your first models +* [Document](/docs/build/documentation) and [test](/docs/build/data-tests) your first models ![image alt text](/img/blog/building-a-mature-dbt-project-from-scratch/image_3.png) diff --git a/website/blog/2022-09-28-analyst-to-ae.md b/website/blog/2022-09-28-analyst-to-ae.md index bf19bbae59e..03a466ddf80 100644 --- a/website/blog/2022-09-28-analyst-to-ae.md +++ b/website/blog/2022-09-28-analyst-to-ae.md @@ -133,7 +133,7 @@ It’s much easier to keep to a naming guide when the writer has a deep understa If we want to know how certain logic was built technically, then we can reference the SQL code in dbt docs. If we want to know *why* a certain logic was built into that specific model, then that’s where we’d turn to the documentation. -- Example of not-so-helpful documentation ([dbt docs can](https://docs.getdbt.com/docs/collaborate/documentation) build this dynamically): +- Example of not-so-helpful documentation ([dbt docs can](https://docs.getdbt.com/docs/build/documentation) build this dynamically): - `Case when Zone = 1 and Level like 'A%' then 'True' else 'False' end as GroupB` - Example of better, more descriptive documentation (add to your dbt markdown file or column descriptions): - Group B is defined as Users in Zone 1 with a Level beginning with the letter 'A'. These users are accessing our new add-on product that began in Beta in August 2022. It's recommended to filter them out of the main Active Users metric. diff --git a/website/blog/2023-02-14-passing-the-dbt-certification-exam.md b/website/blog/2023-02-14-passing-the-dbt-certification-exam.md index dbd0b856fe9..2696f3550f7 100644 --- a/website/blog/2023-02-14-passing-the-dbt-certification-exam.md +++ b/website/blog/2023-02-14-passing-the-dbt-certification-exam.md @@ -25,7 +25,7 @@ In this article, two Montreal Analytics consultants, Jade and Callie, discuss th **J:** To prepare for the exam, I built up a practice dbt project. 
All consultants do this as part of Montreal Analytics' onboarding process, and this project allowed me to practice implementing sources and tests, refactoring SQL models, and debugging plenty of error messages. Additionally, I reviewed the [Certification Study Guide](https://www.getdbt.com/assets/uploads/dbt_certificate_study_guide.pdf) and attended group learning sessions.

-**C:** To prepare for the exam I reviewed the official dbt Certification Study Guide and the [official dbt docs](https://docs.getdbt.com/), and attended group study and learning sessions that were hosted by Montreal Analytics for all employees interested in taking the exam. As a group, we prioritized subjects that we felt less familiar with; for the first cohort of test takers this was mainly newer topics that haven’t yet become integral to a typical dbt project, such as [doc blocks](https://docs.getdbt.com/docs/collaborate/documentation#using-docs-blocks) and [configurations versus properties](https://docs.getdbt.com/reference/configs-and-properties). These sessions mainly covered the highlights and common “gotchas” that are experienced using these techniques. The sessions were moderated by a team member who had already successfully completed the dbt Certification, but operated in a very collaborative environment, so everyone could provide additional information, ask questions to the group, and provide feedback to other members of our certification taking group.
+**C:** To prepare for the exam, I reviewed the official dbt Certification Study Guide and the [official dbt docs](https://docs.getdbt.com/), and attended group study and learning sessions that were hosted by Montreal Analytics for all employees interested in taking the exam. As a group, we prioritized subjects that we felt less familiar with; for the first cohort of test takers, this was mainly newer topics that haven’t yet become integral to a typical dbt project, such as [doc blocks](https://docs.getdbt.com/docs/build/documentation#using-docs-blocks) and [configurations versus properties](https://docs.getdbt.com/reference/configs-and-properties). These sessions mainly covered the highlights and common “gotchas” experienced using these techniques. The sessions were moderated by a team member who had already successfully completed the dbt Certification, but they operated in a very collaborative environment, so everyone could provide additional information, ask questions to the group, and provide feedback to other members of our certification-taking group.

I felt comfortable with the breadth of my dbt knowledge and had familiarity with most topics. However, in my day-to-day implementation, I am often reliant on documentation or copying and pasting specific configurations in order to get the correct settings. Therefore, my focus was on memorizing important criteria for *how to use* certain features, particularly on the order/nesting of how the key YAML files are configured (dbt_project.yml, table.yml, source.yml).

@@ -75,4 +75,4 @@ Now, the first thing you must do when you’ve passed a test is to get yourself

Standards and best practices are very important, but a test is a measure at a single point in time of a rapidly evolving industry. It’s also a measure of my test-taking abilities, my stress levels, and other things unrelated to my skill in data modeling; I wouldn’t be a good analyst if I didn’t recognize the faults of a measurement.
I’m glad to have this check mark completed, but I will continue to stay up to date with changes, learn new data skills and techniques, and find ways to continue being a holistically helpful teammate to my colleagues and clients.

-You can learn more about the dbt Certification [here](https://www.getdbt.com/blog/dbt-certification-program/).
\ No newline at end of file
+You can learn more about the dbt Certification [here](https://www.getdbt.com/blog/dbt-certification-program/).
diff --git a/website/blog/2023-05-04-generating-dynamic-docs.md b/website/blog/2023-05-04-generating-dynamic-docs.md
index 1e704178b0a..f41302144dc 100644
--- a/website/blog/2023-05-04-generating-dynamic-docs.md
+++ b/website/blog/2023-05-04-generating-dynamic-docs.md
@@ -215,7 +215,7 @@ Which in turn can be copy-pasted into a new `.yml` file. In our example, we writ

## Create docs blocks for the new columns

-[Docs blocks](https://docs.getdbt.com/docs/collaborate/documentation#using-docs-blocks) can be utilized to write more DRY and robust documentation. To use docs blocks, update your folder structure to contain a `.md` file. Your file structure should now look like this:
+[Docs blocks](https://docs.getdbt.com/docs/build/documentation#using-docs-blocks) can be utilized to write more DRY and robust documentation. To use docs blocks, update your folder structure to contain a `.md` file. Your file structure should now look like this:

```
models/core/activity_based_interest
diff --git a/website/blog/2024-06-12-putting-your-dag-on-the-internet.md b/website/blog/2024-06-12-putting-your-dag-on-the-internet.md
new file mode 100644
index 00000000000..8d0bc79e35d
--- /dev/null
+++ b/website/blog/2024-06-12-putting-your-dag-on-the-internet.md
@@ -0,0 +1,119 @@
+---
+title: Putting Your DAG on the internet
+description: "Use dbt and Snowflake's external access integrations to allow Snowflake Python models to access the internet."
+slug: dag-on-the-internet
+
+authors: [ernesto_ongaro, sebastian_stan, filip_byrén]
+
+tags: [analytics craft, APIs, data ecosystem]
+hide_table_of_contents: false
+
+date: 2024-06-14
+is_featured: true
+---
+
+**New in dbt: allow Snowflake Python models to access the internet**
+
+With dbt 1.8, dbt released support for Snowflake’s [external access integrations](https://docs.snowflake.com/en/developer-guide/external-network-access/external-network-access-overview), further enabling the use of dbt + AI to enrich your data. This allows querying of external APIs within dbt Python models, a functionality that was required by dbt Cloud customer [EQT AB](https://eqtgroup.com/). Learn about why they needed it and how they helped build the feature and get it shipped!
+
+
+## Why did EQT require this functionality?
+by Filip Byrén, VP and Software Architect (EQT) and Sebastian Stan, Data Engineer (EQT)
+
+_EQT AB is a global investment organization and, as a long-term customer of dbt Cloud, has presented at dbt’s Coalesce [2020](https://www.getdbt.com/coalesce-2020/seven-use-cases-for-dbt) and [2023](https://www.youtube.com/watch?v=-9hIUziITtU)._
+
+_Motherbrain Labs is EQT’s bespoke AI team, primarily focused on accelerating our portfolio companies' roadmaps through hands-on data and AI work. Due to the high demand for our time, we are constantly exploring mechanisms for simplifying our processes and increasing our own throughput.
Integration of workflow components directly in dbt has been a major efficiency gain and helped us rapidly deliver across a global portfolio._
+
+Motherbrain Labs is focused on creating measurable AI impact in our portfolio. We work hand-in-hand with leadership from our deal teams and portfolio company leadership, but our starting approach is always the same: identify which data matters.
+
+While we have access to reams of proprietary information, we believe the greatest effect happens when we combine that information with external datasets like geolocation, demographics, or competitor traction.
+
+These valuable datasets often come from third-party vendors who operate on a pay-per-use model: a single charge for every piece of information we want. To avoid overspending, we focus on enriching only the specific subset of data that is relevant to an individual company's strategic question.
+
+In response to this recurring need, we have partnered with Snowflake and dbt to introduce new functionality that facilitates communication with external endpoints and manages secrets within dbt. This new integration enables us to incorporate enrichment processes directly into our DAGs, similar to how current Python models are utilized within dbt environments. We’ve found that this augmented approach allows us to reduce complexity and enable external communications before materialization.
+
+## An example with Carbon Intensity: How does it work?
+
+In this section, we will demonstrate how to integrate an external API to retrieve the current Carbon Intensity of the UK power grid. The goal is to illustrate how the feature works, and perhaps explore how scheduling data transformations at different times can potentially reduce their carbon footprint, making them a greener choice. We will be leveraging the API from the [UK National Grid ESO](https://www.nationalgrideso.com/) to achieve this.
+
+To start, we need to set up a network rule (Snowflake instructions [here](https://docs.snowflake.com/en/user-guide/network-rules)) to allow access to the external API. Specifically, we'll create an egress rule to permit Snowflake to communicate with api.carbonintensity.org.uk.
+
+Next, to access network locations outside of Snowflake, you first need to define an external access integration and reference it within a dbt Python model. You can find an overview of Snowflake's external network access [here](https://docs.snowflake.com/en/developer-guide/external-network-access/external-network-access-overview).
+
+This API is open, so no API key is needed; if an API you use does require a key, handle it similarly to how you manage secrets. More information on API authentication in Snowflake is available [here](https://docs.snowflake.com/en/user-guide/api-authentication).
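+
+As a hedged sketch of what that could look like (the secret, rule, and integration names below are placeholders, not part of the example that follows), Snowflake lets you store the key as a secret and allow an integration to use it:
+
+```
+-- Hypothetical: store the API key as a Snowflake secret
+create or replace secret my_api_key_secret
+  type = generic_string
+  secret_string = '<your-api-key>';
+
+-- Hypothetical: let an external access integration expose that secret
+create or replace external access integration my_external_access_integration
+  allowed_network_rules = (my_network_rule)
+  allowed_authentication_secrets = (my_api_key_secret)
+  enabled = true;
+```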
+
+For simplicity’s sake, we will show how to create them using [pre-hooks](/reference/resource-configs/pre-hook-post-hook) in a model configuration YAML file:
+
+
+```
+models:
+  - name: external_access_sample
+    config:
+      pre_hook:
+        - "create or replace network rule test_network_rule type = host_port mode = egress value_list= ('api.carbonintensity.org.uk:443');"
+        - "create or replace external access integration test_external_access_integration allowed_network_rules = (test_network_rule) enabled = true;"
+```
+
+Then we can simply use the new external_access_integrations configuration parameter to reference our network rule within a Python model (called external_access_sample.py):
+
+
+```
+import snowflake.snowpark as snowpark
+def model(dbt, session: snowpark.Session):
+    dbt.config(
+        materialized="table",
+        external_access_integrations=["test_external_access_integration"],
+        packages=["httpx==0.26.0"]
+    )
+    import httpx
+    return session.create_dataframe(
+        [{"carbon_intensity": httpx.get(url="https://api.carbonintensity.org.uk/intensity").text}]
+    )
+```
+
+
+The result is a model with some JSON we can parse, for example, in a SQL model to extract some information:
+
+
+```
+{{
+    config(
+        materialized='incremental',
+        unique_key='dbt_invocation_id'
+    )
+}}
+
+with raw as (
+    select parse_json(carbon_intensity) as carbon_intensity_json
+    from {{ ref('external_access_sample') }}
+)
+
+select
+    '{{ invocation_id }}' as dbt_invocation_id,
+    value:from::TIMESTAMP_NTZ as start_time,
+    value:to::TIMESTAMP_NTZ as end_time,
+    value:intensity.actual::NUMBER as actual_intensity,
+    value:intensity.forecast::NUMBER as forecast_intensity,
+    value:intensity.index::STRING as intensity_index
+from raw,
+    lateral flatten(input => raw.carbon_intensity_json:data)
+```
+
+
+The result is a model that keeps track of dbt invocations and the current UK carbon intensity levels.
+
+
+
+## dbt best practices
+
+This is a very new area for Snowflake and dbt. Something special about SQL and dbt is that they're very resistant to external entropy; the second we rely on API calls, Python packages, and other external dependencies, we open ourselves up to a lot more of it. APIs will change or break, and your models could fail.
+
+Traditionally, dbt is the T in ELT (dbt overview [here](https://docs.getdbt.com/terms/elt)), and this functionality unlocks brand new EL capabilities for which best practices do not yet exist. What’s clear is that EL workloads should be separated from T workloads, perhaps in a different modeling layer (see the short sketch at the end of this post). Note also that unless you're using incremental models, your historical data can easily be deleted. dbt has seen a lot of use cases for this, including the AI example outlined in this external [engineering blog post](https://klimmy.hashnode.dev/enhancing-your-dbt-project-with-large-language-models).
+
+**A few words about the power of Commercial Open Source Software**
+
+In order to get this functionality shipped quickly, EQT opened a pull request, Snowflake helped with some problems we had with CI, and a member of dbt Labs helped write the tests and merge the code in!
+
+dbt now features this functionality in dbt 1.8+ and in the “Keep on latest version” option of dbt Cloud (docs [here](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version)).
+
+dbt Labs staff and community members would love to chat more about it in the [#db-snowflake](https://getdbt.slack.com/archives/CJN7XRF1B) Slack channel.
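+
+**A short sketch: separating EL from T workloads**
+
+As referenced above, one possible pattern (our own hypothetical illustration, not an established best practice) is to tag models that reach out to external APIs and schedule them separately from your transformation runs, so a flaky endpoint can't fail the rest of the DAG. The tag name below is just a placeholder:
+
+```
+models:
+  - name: external_access_sample
+    config:
+      tags: ["external-api"]
+```
+
+```
+# Refresh EL-style models on their own cadence
+dbt run --select tag:external-api
+
+# Run the rest of the project without touching external APIs
+dbt run --exclude tag:external-api
+```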
diff --git a/website/blog/authors.yml b/website/blog/authors.yml
index 70d665ce944..64727801478 100644
--- a/website/blog/authors.yml
+++ b/website/blog/authors.yml
@@ -614,3 +614,30 @@ anders_swanson:
  links:
    - icon: fa-linkedin
      url: https://www.linkedin.com/in/andersswanson
+
+ernesto_ongaro:
+  image_url: /img/blog/authors/ernesto-ongaro.png
+  job_title: Senior Solutions Architect
+  name: Ernesto Ongaro
+  organization: dbt Labs
+  links:
+    - icon: fa-linkedin
+      url: https://www.linkedin.com/in/eongaro
+
+sebastian_stan:
+  image_url: /img/blog/authors/sebastian-eqt.png
+  job_title: Data Engineer
+  name: Sebastian Stan
+  organization: EQT Group
+  links:
+    - icon: fa-linkedin
+      url: https://www.linkedin.com/in/sebastian-lindblom/
+
+filip_byrén:
+  image_url: /img/blog/authors/filip-eqt.png
+  job_title: VP and Software Architect
+  name: Filip Byrén
+  organization: EQT Group
+  links:
+    - icon: fa-linkedin
+      url: https://www.linkedin.com/in/filip-byr%C3%A9n/
diff --git a/website/docs/best-practices/how-we-structure/5-semantic-layer-marts.md b/website/docs/best-practices/how-we-structure/5-semantic-layer-marts.md
index 62e07a72e36..7694c9a94af 100644
--- a/website/docs/best-practices/how-we-structure/5-semantic-layer-marts.md
+++ b/website/docs/best-practices/how-we-structure/5-semantic-layer-marts.md
@@ -3,7 +3,7 @@ title: "Marts for the Semantic Layer"
id: "5-semantic-layer-marts"
---

-The Semantic Layer alters some fundamental principles of how you organize your project. Using dbt without the Semantic Layer necessitates creating the most useful combinations of your building block components into wide, denormalized marts. On the other hand, the Semantic Layer leverages MetricFlow to denormalize every possible combination of components we've encoded dynamically. As such we're better served to bring more normalized models through from the logical layer into the Semantic Layer to maximize flexibility. This section will assume familiarity with the best practices laid out in the [How we build our metrics](/best-practices/how-we-build-our-metrics/semantic-layer-1-intro) guide, so check that out first for a more hands-on introduction to the Semantic Layer.
+The [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) alters some fundamental principles of how you organize your project. Using dbt without the Semantic Layer necessitates creating the most useful combinations of your building block components into wide, denormalized marts. On the other hand, the Semantic Layer leverages MetricFlow to denormalize every possible combination of components we've encoded dynamically. As such, we're better served to bring more normalized models through from the logical layer into the Semantic Layer to maximize flexibility. This section will assume familiarity with the best practices laid out in the [How we build our metrics](/best-practices/how-we-build-our-metrics/semantic-layer-1-intro) guide, so check that out first for a more hands-on introduction to the Semantic Layer.

## Semantic Layer: Files and folders

@@ -36,6 +36,40 @@ models
└── stg_supplies.yml
```

+## Semantic Layer: Where and why?
+
+- 📂 **Directory structure**: Add your semantic models to `models/semantic_models` with directories corresponding to the models/marts files. This type of organization makes it easier to search and find what you can join. It also supports better maintenance and reduces repeated code.
+
+
+
+  ```yaml
+  semantic_models:
+    - name: orders
+      defaults:
+        agg_time_dimension: order_date
+      description: |
+        Order fact table.
+        This table’s grain is one row per order.
+      model: ref('fct_orders')
+      entities:
+        - name: order_id
+          type: primary
+        - name: customer_id
+          type: foreign
+      dimensions:
+        - name: order_date
+          type: time
+          type_params:
+            time_granularity: day
+  ```
+
+
+## Naming convention
+
+- 🏷️ **Semantic model names**: Use the `sem_` prefix for semantic model names, such as `sem_cloud_user_account_activity`. This follows the same pattern as other naming conventions like `fct_` for fact tables and `dim_` for dimension tables.
+- 🧩 **Entity names**: Don't use prefixes in entity names within the semantic model. This keeps the names clear and focused on their specific purpose without unnecessary prefixes.
+
+This guidance helps you make sure your dbt project is organized, maintainable, and scalable, allowing you to take full advantage of the capabilities offered by the dbt Semantic Layer.
+
## When to make a mart

- ❓ If we can go directly to staging models and it's better to serve normalized models to the Semantic Layer, then when, where, and why would we make a mart?
diff --git a/website/docs/best-practices/how-we-structure/6-the-rest-of-the-project.md b/website/docs/best-practices/how-we-structure/6-the-rest-of-the-project.md
index 4082f92b932..8e38648e43a 100644
--- a/website/docs/best-practices/how-we-structure/6-the-rest-of-the-project.md
+++ b/website/docs/best-practices/how-we-structure/6-the-rest-of-the-project.md
@@ -50,7 +50,7 @@ When structuring your YAML configuration files in a dbt project, you want to bal
  - The leading underscore ensures your YAML files will be sorted to the top of every folder to make them easy to separate from your models.
  - YAML files don’t need unique names in the way that SQL model files do, but including the directory (instead of simply `_sources.yml` in each folder), means you can fuzzy find the right file more quickly.
  - We’ve recommended several different naming conventions over the years, most recently calling these `schema.yml` files. We’ve simplified to recommend that these simply be labelled based on the YAML dictionary that they contain.
-  - If you utilize [doc blocks](https://docs.getdbt.com/docs/collaborate/documentation#using-docs-blocks) in your project, we recommend following the same pattern, and creating a `_[directory]__docs.md` markdown file per directory containing all your doc blocks for that folder of models.
+  - If you utilize [doc blocks](https://docs.getdbt.com/docs/build/documentation#using-docs-blocks) in your project, we recommend following the same pattern, and creating a `_[directory]__docs.md` markdown file per directory containing all your doc blocks for that folder of models.
- ❌ **Config per project.** Some people put _all_ of their source and model YAML into one file. While you can technically do this, and while it certainly simplifies knowing what file the config you’re looking for will be in (as there is only one file), it makes it much harder to find specific configurations within that file. We recommend balancing those two concerns.
- ⚠️ **Config per model.** On the other end of the spectrum, some people prefer to create one YAML file per model. This presents less of an issue than a single monolith file, as you can quickly search for files, know exactly where specific configurations exist, spot models without configs (and thus without tests) by looking at the file tree, and various other advantages.
In our opinion, though, the overhead of creating, opening, copying between, closing, and managing all of those extra files, tabs, and windows makes for a somewhat slower development experience that outweighs those benefits. Defining config per directory is the most balanced approach for most projects, but if you have compelling reasons to use config per model, there are definitely some great projects that follow this paradigm.
- ✅ **Cascade configs.** Leverage your `dbt_project.yml` to set default configurations at the directory level. Use the well-organized folder structure we’ve created thus far to define the baseline schemas and materializations, and use dbt’s cascading scope priority to define variations to this. For example, as below, define your marts to be materialized as tables by default, define separate schemas for our separate subfolders, and any models that need to use incremental materialization can be defined at the model level.
diff --git a/website/docs/docs/collaborate/documentation.md b/website/docs/docs/build/documentation.md
similarity index 68%
rename from website/docs/docs/collaborate/documentation.md
rename to website/docs/docs/build/documentation.md
index 6771f88a8d4..00ae02918b2 100644
--- a/website/docs/docs/collaborate/documentation.md
+++ b/website/docs/docs/build/documentation.md
@@ -1,11 +1,12 @@
---
-title: "About documentation"
+title: "Documentation"
description: "Learn how good documentation for your dbt models helps stakeholders discover and understand your datasets."
id: "documentation"
-pagination_next: "docs/collaborate/build-and-view-your-docs"
-pagination_prev: null
---

+Good documentation for your dbt models will help downstream consumers discover and understand the datasets you curate for them.
+dbt provides a way to generate documentation for your dbt project and render it as a website.
+
## Related documentation

* [Declaring properties](/reference/configs-and-properties)

@@ -19,18 +20,12 @@ pagination_prev: null

## Overview

-Good documentation for your dbt models will help downstream consumers discover and understand the datasets which you curate for them.
-
-dbt provides a way to generate documentation for your dbt project and render it as a website. The documentation for your project includes:
+dbt provides a way to generate documentation for your dbt project. The documentation for your project includes:

* **Information about your project**: including model code, a DAG of your project, any tests you've added to a column, and more.
* **Information about your data warehouse**: including column data types, and sizes. This information is generated by running queries against the information schema.

Importantly, dbt also provides a way to add **descriptions** to models, columns, sources, and more, to further enhance your documentation.

-Here's an example docs site:
-
-
-
## Adding descriptions to your project

To add descriptions to your project, use the `description:` key in the same files where you declare [tests](/docs/build/data-tests), like so:

@@ -60,13 +55,19 @@ models:
-
## Generating project documentation

-You can generate a documentation site for your project (with or without descriptions) using the CLI.

-First, run `dbt docs generate` — this command tells dbt to compile relevant information about your dbt project and warehouse into `manifest.json` and `catalog.json` files respectively. To see the documentation for all columns and not just columns described in your project, ensure that you have created the models with `dbt run` beforehand.
+The default documentation experience in dbt Cloud is [dbt Explorer](/docs/collaborate/explore-projects), available on [Team or Enterprise plans](https://www.getdbt.com/pricing/). Use dbt Explorer to view your project's resources (such as models, tests, and metrics), their [metadata](/docs/collaborate/explore-projects#generate-metadata), and their lineage to gain a better understanding of your project's latest production state.

-Then, run `dbt docs serve` to use these `.json` files to populate a local website.

+dbt Cloud developer and dbt Core users can use [dbt Docs](/docs/collaborate/build-and-view-your-docs#dbt-docs), which generates basic documentation, but it doesn't offer the same speed, metadata, or visibility as dbt Explorer.
+
+Generate documentation for your project by following these steps:
+
+1. Run `dbt docs generate` — this command tells dbt to compile relevant information about your dbt project and warehouse into `manifest.json` and `catalog.json` files, respectively.
+2. Ensure that you have created the models with `dbt run` to view the documentation for all columns, not just those described in your project.
+3. Run `dbt docs serve` if you're developing locally to use these `.json` files to populate a local website.
+
+To view a resource, its metadata, and what commands are needed in dbt Explorer, refer to [generate metadata](/docs/collaborate/explore-projects#generate-metadata) for more details.

## FAQs

@@ -75,8 +76,7 @@ Then, run `dbt docs serve` to use these `.json` files to populate a local websit
-
-## Using Docs Blocks
+## Using docs blocks

### Syntax

To declare a docs block, use the jinja `docs` tag. Docs blocks can contain arbitrary markdown, but they must be uniquely named. Their names may contain uppercase and lowercase letters (A-Z, a-z), digits (0-9), and underscores (_), but can't start with a digit.

@@ -128,9 +128,11 @@ models:

In the resulting documentation, `'{{ doc("table_events") }}'` will be expanded to the markdown defined in the `table_events` docs block.

+
## Setting a custom overview
+*Currently available for dbt Docs only.*

-The "overview" shown in the documentation website can be overridden by supplying your own docs block called `__overview__`. By default, dbt supplies an overview with helpful information about the docs site itself. Depending on your needs, it may be a good idea to override this docs block with specific information about your company style guide, links to reports, or information about who to contact for help. To override the default overview, create a docs block that looks like this:
+The "overview" shown in the dbt Docs website can be overridden by supplying your own docs block called `__overview__`. By default, dbt supplies an overview with helpful information about the docs site itself. Depending on your needs, it may be a good idea to override this docs block with specific information about your company style guide, links to reports, or information about who to contact for help. To override the default overview, create a docs block that looks like this:

@@ -148,6 +150,7 @@ as well as the repo for this project \[here](https://github.com/dbt-labs/mrr-pla

### Custom project-level overviews
+*Currently available for dbt Docs only.*

You can set different overviews for each dbt project/package included in your documentation site by creating a docs block named `__[project_name]__`. For example, in order to define

@@ -174,13 +177,21 @@ up to page views and sessions.
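+
+As a hedged sketch (the `jaffle_shop` name below is only a placeholder for your own project or package name), such a project-level overview block might look like:
+
+```
+{% docs __jaffle_shop__ %}
+## About the jaffle_shop project
+
+This overview appears on the docs site page for the `jaffle_shop` package.
+{% enddocs %}
+```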
## Navigating the documentation site

-Using the docs interface, you can navigate to the documentation for a specific model. That might look something like this:
+
+Use [dbt Explorer](/docs/collaborate/explore-projects) for a richer, more interactive documentation experience for understanding your project's resources and lineage. Available on [Team or Enterprise plans](https://www.getdbt.com/pricing/).
+
+For additional details on how to explore your lineage and navigate your resources, refer to [dbt Explorer](/docs/collaborate/explore-projects).
+
+
+
+If you're using the dbt Docs interface, you can navigate to the documentation for a specific model. That might look something like this:

Here, you can see a representation of the project structure, a markdown description for a model, and a list of all of the columns (with documentation) in the model.

-From a docs page, you can click the green button in the bottom-right corner of the webpage to expand a "mini-map" of your DAG. This pane (shown below) will display the immediate parents and children of the model that you're exploring.
+From the dbt Docs page, you can click the green button in the bottom-right corner of the webpage to expand a "mini-map" of your DAG. This pane (shown below) will display the immediate parents and children of the model that you're exploring.

@@ -188,17 +199,24 @@ In this example, the `fct_subscription_transactions` model only has one direct p

+
+
## Deploying the documentation site

+With dbt Cloud, [dbt Explorer](/docs/collaborate/explore-projects) automatically retrieves metadata updates after each job run in the production or staging deployment environment, so it always has the latest results for your project.
+
:::caution Security

The `dbt docs serve` command is only intended for local/development hosting of the documentation site. Please use one of the methods listed below (or similar) to ensure that your documentation site is hosted securely!

:::

+#### For dbt Docs users
+
dbt's documentation website was built to make it easy to host on the web. The site is "static,” meaning you don't need any "dynamic" servers to serve the docs. You can host your documentation in several ways:

-* Use [dbt Cloud](/docs/collaborate/documentation)
+* Use [dbt Cloud's](/docs/collaborate/build-and-view-your-docs) default documentation experience with [dbt Explorer](/docs/collaborate/explore-projects).
* Host on [Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/dev/WebsiteHosting.html) (optionally [with IP access restrictions](https://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html#example-bucket-policies-use-case-3))
* Publish with [Netlify](https://discourse.getdbt.com/t/publishing-dbt-docs-to-netlify/121)
* Use your own web server like Apache/Nginx
+
diff --git a/website/docs/docs/build/environment-variables.md b/website/docs/docs/build/environment-variables.md
index a8784847f33..d009cb9663a 100644
--- a/website/docs/docs/build/environment-variables.md
+++ b/website/docs/docs/build/environment-variables.md
@@ -156,9 +156,9 @@ Env vars works fine with username/password and keypair, including scheduled jobs

However, there are some limitations when using env vars with Snowflake OAuth Connection settings:

-- You can't use them in the account/host field, but they can be used for database, warehouse, and role.
+- You can't use them in the account/host field, but they can be used for database, warehouse, and role.
For these fields, [use extended attributes](/docs/deploy/deploy-environments#deployment-connection).

-Something to note, if you supply an environment variable in the account/host field, Snowflake OAuth Connection will **fail** to connect. This happens because the field doesn't pass through Jinja rendering, so dbt Cloud simply passes the literal `env_var` code into a URL string like `{{ env_var("DBT_ACCOUNT_HOST_NAME") }}.snowflakecomputing.com`, which is an invalid hostname.
+Note that if you supply an environment variable in the account/host field, the Snowflake OAuth connection will **fail** to connect. This happens because the field doesn't pass through Jinja rendering, so dbt Cloud simply passes the literal `env_var` code into a URL string like `{{ env_var("DBT_ACCOUNT_HOST_NAME") }}.snowflakecomputing.com`, which is an invalid hostname. Use [extended attributes](/docs/deploy/deploy-environments#deployment-credentials) instead.
:::

#### Audit your run metadata
diff --git a/website/docs/docs/build/exposures.md b/website/docs/docs/build/exposures.md
index bcbe819d98c..0daf44b1c4c 100644
--- a/website/docs/docs/build/exposures.md
+++ b/website/docs/docs/build/exposures.md
@@ -6,7 +6,7 @@ id: "exposures"
Exposures make it possible to define and describe a downstream use of your dbt project, such as in a dashboard, application, or data science pipeline. By defining exposures, you can then:
- run, test, and list resources that feed into your exposure
-- populate a dedicated page in the auto-generated [documentation](/docs/collaborate/documentation) site with context relevant to data consumers
+- populate a dedicated page in the auto-generated [documentation](/docs/build/documentation) site with context relevant to data consumers

### Declaring an exposure
diff --git a/website/docs/docs/build/metrics-overview.md b/website/docs/docs/build/metrics-overview.md
index be1b7d51c94..3108f30c374 100644
--- a/website/docs/docs/build/metrics-overview.md
+++ b/website/docs/docs/build/metrics-overview.md
@@ -220,7 +220,9 @@ metrics:

## Filters

-A filter is configured using Jinja templating. Use the following syntax to reference entities, dimensions, time dimensions, or metrics in filters and refer to [Metrics as dimensions](/docs/build/ref-metrics-in-filters) for details on how to use metrics as dimensions with metric filters:
+A filter is configured using Jinja templating. Use the following syntax to reference entities, dimensions, time dimensions, or metrics in filters.
+
+Refer to [Metrics as dimensions](/docs/build/ref-metrics-in-filters) for details on how to use metrics as dimensions with metric filters:

```yaml
filter: |
  {{ Entity('entity_name') }}

filter: |
  {{ Dimension('primary_entity__dimension_name') }}

filter: |
  {{ TimeDimension('time_dimension', 'granularity') }}

filter: |
-  {{ Metric('metric_name', group_by=['entity_name']) }}
+  {{ Metric('metric_name', group_by=['entity_name']) }} # Available in v1.8 or go versionless with [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version)
```

### Further configuration
diff --git a/website/docs/docs/build/projects.md b/website/docs/docs/build/projects.md
index 45b623dc550..a65d4773ac6 100644
--- a/website/docs/docs/build/projects.md
+++ b/website/docs/docs/build/projects.md
@@ -16,7 +16,7 @@ At a minimum, all a project needs is the `dbt_project.yml` project configuration
| [seeds](/docs/build/seeds) | CSV files with static data that you can load into your data platform with dbt. |
| [data tests](/docs/build/data-tests) | SQL queries that you can write to test the models and resources in your project. |
| [macros](/docs/build/jinja-macros) | Blocks of code that you can reuse multiple times. |
-| [docs](/docs/collaborate/documentation) | Docs for your project that you can build. |
+| [docs](/docs/build/documentation) | Docs for your project that you can build. |
| [sources](/docs/build/sources) | A way to name and describe the data loaded into your warehouse by your Extract and Load tools. |
| [exposures](/docs/build/exposures) | A way to define and describe a downstream use of your project. |
| [metrics](/docs/build/build-metrics-intro) | A way for you to define metrics for your project. |
diff --git a/website/docs/docs/build/saved-queries.md b/website/docs/docs/build/saved-queries.md
index 9062734c856..6261a56ed08 100644
--- a/website/docs/docs/build/saved-queries.md
+++ b/website/docs/docs/build/saved-queries.md
@@ -6,7 +6,7 @@ sidebar_label: "Saved queries"
tags: [Metrics, Semantic Layer]
---

-Saved queries are a way to save commonly used queries in MetricFlow. You can group metrics, dimensions, and filters that are logically related into a saved query. Saved queries is a node and visible in the dbt .
+Saved queries are a way to save commonly used queries in MetricFlow. You can group metrics, dimensions, and filters that are logically related into a saved query. Saved queries are nodes and visible in the dbt DAG.

Saved queries serve as the foundational building block, allowing you to [configure exports](#configure-exports) in your saved query configuration. Exports take this functionality a step further by enabling you to [schedule and write saved queries](/docs/use-dbt-semantic-layer/exports) directly within your data platform using [dbt Cloud's job scheduler](/docs/deploy/job-scheduler).
diff --git a/website/docs/docs/build/semantic-models.md b/website/docs/docs/build/semantic-models.md
index 627d95c1636..1cb21a3144e 100644
--- a/website/docs/docs/build/semantic-models.md
+++ b/website/docs/docs/build/semantic-models.md
@@ -15,7 +15,7 @@ Semantic models are the foundation for data definition in MetricFlow, which powe
- MetricFlow uses YAML configuration files to create this graph for querying metrics.
- Each semantic model corresponds to a dbt model in your DAG, requiring a unique YAML configuration for each semantic model.
- You can create multiple semantic models from a single dbt model (SQL or Python), as long as you give each semantic model a unique name.
-- Configure semantic models in a YAML file within your dbt project directory.
+- Configure semantic models in a YAML file within your dbt project directory. Refer to the [best practices guide](/best-practices/how-we-structure/5-semantic-layer-marts) for more info on project structuring.
- Organize them under a `metrics:` folder or within project sources as needed.

@@ -60,6 +60,8 @@ semantic_models:
          if the semantic model has no primary entity, then this property is required. #Optional if a primary entity exists, otherwise Required
```

+You can refer to the [best practices guide](/best-practices/how-we-structure/5-semantic-layer-marts) for more info on project structuring.
+
The following example displays a complete configuration and detailed descriptions of each field:

```yaml
@@ -252,8 +254,6 @@ import MeasuresParameters from '/snippets/_sl-measures-parameters.md';

-
-
import SetUpPages from '/snippets/_metrics-dependencies.md';

@@ -264,3 +264,4 @@
- [Dimensions](/docs/build/dimensions)
- [Entities](/docs/build/entities)
- [Measures](/docs/build/measures)
+- [Project structure best practices guide](/best-practices/how-we-structure/5-semantic-layer-marts)
diff --git a/website/docs/docs/build/sources.md b/website/docs/docs/build/sources.md
index 466bcedc688..93757cdfa71 100644
--- a/website/docs/docs/build/sources.md
+++ b/website/docs/docs/build/sources.md
@@ -91,7 +91,7 @@ You can also:
- Add data tests to sources
- Add descriptions to sources that get rendered as part of your documentation site

-These should be familiar concepts if you've already added tests and descriptions to your models (if not check out the guides on [testing](/docs/build/data-tests) and [documentation](/docs/collaborate/documentation)).
+These should be familiar concepts if you've already added tests and descriptions to your models (if not, check out the guides on [testing](/docs/build/data-tests) and [documentation](/docs/build/documentation)).

diff --git a/website/docs/docs/build/sql-models.md b/website/docs/docs/build/sql-models.md
index 87e063cdcdb..a019508d370 100644
--- a/website/docs/docs/build/sql-models.md
+++ b/website/docs/docs/build/sql-models.md
@@ -260,7 +260,7 @@ Additionally, the `ref` function encourages you to write modular transformations

## Testing and documenting models

-You can also document and test models — skip ahead to the section on [testing](/docs/build/data-tests) and [documentation](/docs/collaborate/documentation) for more information.
+You can also document and test models — skip ahead to the section on [testing](/docs/build/data-tests) and [documentation](/docs/build/documentation) for more information.

## Additional FAQs

diff --git a/website/docs/docs/build/unit-tests.md b/website/docs/docs/build/unit-tests.md
index 709c2b736b4..dcd7e6d282d 100644
--- a/website/docs/docs/build/unit-tests.md
+++ b/website/docs/docs/build/unit-tests.md
@@ -10,13 +10,13 @@ keywords:

:::note

-This functionality is only supported in dbt Core v1.8+ or dbt Cloud accounts that have opted to ["Keep on latest version"](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version).
+This functionality is only supported in dbt Core v1.8+ or versionless dbt Cloud accounts that have opted to ["Keep on latest version"](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version).

:::

Historically, dbt's test coverage was confined to [“data” tests](/docs/build/data-tests), assessing the quality of input data or resulting datasets' structure. However, these tests could only be executed _after_ building a model.

-With dbt Core v1.8 and dbt Cloud environments that opt to "Keep on latest version," we have introduced an additional type of test to dbt - unit tests. In software programming, unit tests validate small portions of your functional code, and they work much the same way here. Unit tests allow you to validate your SQL modeling logic on a small set of static inputs _before_ you materialize your full model in production. Unit tests enable test-driven development, benefiting developer efficiency and code reliability.
+With dbt Core v1.8 and dbt Cloud environments that have gone versionless by opting to "Keep on latest version," we have introduced an additional type of test to dbt - unit tests. In software programming, unit tests validate small portions of your functional code, and they work much the same way here. Unit tests allow you to validate your SQL modeling logic on a small set of static inputs _before_ you materialize your full model in production. Unit tests enable test-driven development, benefiting developer efficiency and code reliability.

## Before you begin

diff --git a/website/docs/docs/cloud-integrations/set-up-snowflake-native-app.md b/website/docs/docs/cloud-integrations/set-up-snowflake-native-app.md
index 7e9d7c8dc16..4e34c7cc836 100644
--- a/website/docs/docs/cloud-integrations/set-up-snowflake-native-app.md
+++ b/website/docs/docs/cloud-integrations/set-up-snowflake-native-app.md
@@ -37,7 +37,8 @@ The following are the prerequisites for dbt Cloud and Snowflake.

- You have **ACCOUNTADMIN** access in Snowflake.
- Your Snowflake account must have access to the Native App/SPCS integration (PrPr until Summit) and NA/SPCS configurations (PuPr at end of June). If you're unsure, please check with your Snowflake account manager.
-- The Snowflake account must be in an AWS Region or Azure region.
+- The Snowflake account must be in an AWS or Azure region.
+- You have access to Snowflake Cortex through your Snowflake permissions and [Snowflake Cortex is available in your region](https://docs.snowflake.com/en/user-guide/snowflake-cortex/llm-functions#availability). Without this, Ask dbt will not work.

## Set up the configuration for Ask dbt

@@ -50,11 +51,7 @@ Configure dbt Cloud and Snowflake Cortex to power the **Ask dbt** chatbot.

-1. Identify the default database the environment is connecting to.
-   1. Select **Deploy > Environments** from the top navigation bar. From the environments list, select the one that was identified in the **Semantic Layer Configuration Details** panel.
-   1. On the environment's page, click **Settings**. Scroll to the section **Deployment connection**. The listed database is the default for your environment and is also where you will create the schema. Save this information in a temporary location to use later on.

-1. In Snowflake, verify that your SL user has been granted permission to use Snowflake Cortex. This user must have the ability to read and write into this schema to create the Retrieval Augmented Generation (RAG). For more information, refer to [Required Privileges](https://docs.snowflake.com/en/user-guide/snowflake-cortex/llm-functions#required-privileges) in the Snowflake docs.
+1. In Snowflake, verify that your SL and deployment users have been granted permission to use Snowflake Cortex. For more information, refer to [Required Privileges](https://docs.snowflake.com/en/user-guide/snowflake-cortex/llm-functions#required-privileges) in the Snowflake docs.

   By default, all users should have access to Snowflake Cortex. If this is disabled for you, open a Snowflake SQL worksheet and run these statements:

   ```sql
   create role cortex_user_role;
   grant database role SNOWFLAKE.CORTEX_USER to role cortex_user_role;
   grant role cortex_user_role to user SL_USER;
+  grant role cortex_user_role to user DEPLOYMENT_USER;
   ```

-   Make sure to replace `SNOWFLAKE.CORTEX_USER` and `SL_USER` with the appropriate strings for your environment.

-1.
Create a schema `dbt_sl_llm` in the deployment database. Open a Snowflake SQL worksheet and run these statements:
-
-   ```sql
-   create schema YOUR_DEPLOYMENT_DATABASE.dbt_sl_llm;
-   grant ownership on schema dbt_sl_llm to role SL_ROLE;
-   ```
-
-   Make sure to replace `YOUR_DEPLOYMENT_DATABASE` and `SL_USER` with the appropriate strings for your environment.
+   Make sure to replace `SNOWFLAKE.CORTEX_USER`, `DEPLOYMENT_USER`, and `SL_USER` with the appropriate strings for your environment.

## Configure dbt Cloud
Collect three pieces of information from dbt Cloud to set up the application.

@@ -124,7 +113,7 @@ To verify the app installed successfully, select any of the following from the s

- **Explore** — Launch dbt Explorer and make sure you can access your dbt project information.
- **Jobs** — Review the run history of the dbt jobs.
-- **Ask dbt** — Click on any of the suggested prompts to ask the chatbot a question. Depending on the number of metrics that's defined for the dbt project, it can take several minutes to load **Ask dbt** the first time because dbt is building the RAG. Subsequent launches will load faster.
+- **Ask dbt** — Click on any of the suggested prompts to ask the chatbot a question. Depending on the number of metrics that are defined for the dbt project, it can take several minutes to load **Ask dbt** the first time because dbt is building the Retrieval Augmented Generation (RAG). Subsequent launches will load faster.

The following is an example of the **Ask dbt** chatbot with the suggested prompts near the top:

@@ -140,6 +129,12 @@ The following is an example of the **Ask dbt** chatbot with the suggested prompt

## FAQs

+
+
+The dbt Cloud Snowflake Native App is not available to Snowflake Free Trial accounts.
+
+
+
Check that the SL user has been granted access to the `dbt_sl_llm` schema and make sure they have all the necessary permissions to read and write from the schema.

diff --git a/website/docs/docs/cloud/cloud-cli-installation.md b/website/docs/docs/cloud/cloud-cli-installation.md
index 3d00ef9f728..267d8973195 100644
--- a/website/docs/docs/cloud/cloud-cli-installation.md
+++ b/website/docs/docs/cloud/cloud-cli-installation.md
@@ -21,7 +21,7 @@ dbt commands are run against dbt Cloud's infrastructure and benefit from:

## Prerequisites

The dbt Cloud CLI is available in all [deployment regions](/docs/cloud/about-cloud/access-regions-ip-addresses) and for both multi-tenant and single-tenant accounts.

-- You are on dbt version 1.5 or higher. Alternatively, set it to [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to always use the latest version.
+- You are on dbt version 1.5 or higher. Alternatively, set it to [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to get a versionless experience and automatically stay up to date.
## Install dbt Cloud CLI

diff --git a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md
index bc1e14c3250..88e1c821390 100644
--- a/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md
+++ b/website/docs/docs/cloud/connect-data-platform/connect-snowflake.md
@@ -17,6 +17,10 @@ The following fields are required when creating a Snowflake connection

**Note:** A crucial part of working with dbt atop Snowflake is ensuring that users (in development environments) and/or service accounts (in deployment to production environments) have the correct permissions to take actions on Snowflake! Here is documentation of some [example permissions to configure Snowflake access](/reference/database-permissions/snowflake-permissions).

+## Authentication methods
+
+This section describes the different authentication methods available for connecting dbt Cloud to Snowflake.
+
### Username / Password

**Available in:** Development environments, Deployment environments

username (specifically, the `login_name`) and the corresponding user's Snowflake `password` to authenticate dbt Cloud to run queries against Snowflake on behalf of a Snowflake user.

**Note**: The schema field in the **Developer Credentials** section is a required field.
-
+

### Key pair

@@ -65,12 +69,22 @@ The `Keypair` auth method uses Snowflake's [Key Pair Authentication](https://doc

The OAuth auth method permits dbt Cloud to run development queries on behalf of a Snowflake user without the configuration of Snowflake password in dbt Cloud. For more information on configuring a Snowflake OAuth connection in dbt Cloud, please see [the docs on setting up Snowflake OAuth](/docs/cloud/manage-access/set-up-snowflake-oauth).
-
+

## Configuration

To learn how to optimize performance with data platform-specific configurations in dbt Cloud, refer to [Snowflake-specific configuration](/reference/resource-configs/snowflake-configs).

+### Custom domain URL support
+
+To connect to Snowflake through a custom domain (vanity URL) instead of the account locator, use [extended attributes](/docs/dbt-cloud-environments#extended-attributes) to configure the `host` parameter with the custom domain:
+
+```yaml
+host: https://custom_domain_to_snowflake.com
+```
+
+This configuration may conflict with Snowflake OAuth when used with PrivateLink. If users can't reach Snowflake authentication servers from a networking standpoint, please [contact dbt Support](mailto:support@getdbt.com) to find a workaround for this architecture.
+
## Troubleshooting

diff --git a/website/docs/docs/cloud/dbt-assist.md b/website/docs/docs/cloud/dbt-assist.md
index cac5457812a..eafe7d05821 100644
--- a/website/docs/docs/cloud/dbt-assist.md
+++ b/website/docs/docs/cloud/dbt-assist.md
@@ -8,7 +8,7 @@ pagination_prev: null

# About dbt Assist

-dbt Assist is a powerful artificial intelligence (AI) co-pilot feature that helps automate development in dbt Cloud, allowing you to focus on delivering data that works. dbt Assist’s AI co-pilot generates documentation and tests for your dbt SQL models directly in the dbt Cloud IDE, with a click of a button, and helps you accomplish more in less time.
+dbt Assist is a powerful artificial intelligence (AI) co-pilot feature that helps automate development in dbt Cloud, allowing you to focus on delivering data that works.
dbt Assist’s AI co-pilot generates [documentation](/docs/build/documentation) and [tests](/docs/build/data-tests) for your dbt SQL models directly in the dbt Cloud IDE, with a click of a button, and helps you accomplish more in less time.

:::tip Beta feature
dbt Assist is an AI tool meant to _help_ developers generate documentation and tests in dbt Cloud. It's available in beta, in the dbt Cloud IDE only.
:::

diff --git a/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md b/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md
index 1e561b379b4..e2fb122cba3 100644
--- a/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md
+++ b/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md
@@ -131,7 +131,7 @@ Nice job, you're ready to start developing and building models 🎉!

- **Generate your YAML configurations with dbt Assist** — [dbt Assist](/docs/cloud/dbt-assist) is a powerful artificial intelligence (AI) co-pilot feature that helps automate development in dbt Cloud. It generates documentation and tests for your dbt SQL models directly in the dbt Cloud IDE, with a click of a button, and helps you accomplish more in less time. Available for dbt Cloud Enterprise plans.

-- **Build and view your project's docs** — The dbt Cloud IDE makes it possible to [build and view](/docs/collaborate/build-and-view-your-docs#generating-documentation) documentation for your dbt project while your code is still in development. With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production.
+- **Build and view your project's docs** — The dbt Cloud IDE makes it possible to [build and view](/docs/collaborate/build-and-view-your-docs) documentation for your dbt project while your code is still in development. With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production.

## Related docs

diff --git a/website/docs/docs/cloud/enable-dbt-assist.md b/website/docs/docs/cloud/enable-dbt-assist.md
index ae011351566..69b0013dee6 100644
--- a/website/docs/docs/cloud/enable-dbt-assist.md
+++ b/website/docs/docs/cloud/enable-dbt-assist.md
@@ -12,7 +12,7 @@ This page explains how to enable dbt Assist in dbt Cloud to leverage AI to speed

- Available in the dbt Cloud IDE only.
- Must have an active [dbt Cloud Enterprise account](https://www.getdbt.com/pricing).
-- Development environment be: [Keep on Latest Version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version).
+- Development environment must be versionless, set to [Keep on Latest Version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version).
- Current dbt Assist deployments use a central OpenAI API key managed by dbt Labs. In the future, you may provide your own key for Azure OpenAI or OpenAI.
- Accept and sign legal agreements. Reach out to your account team to begin this process.

diff --git a/website/docs/docs/cloud/manage-access/auth0-migration.md b/website/docs/docs/cloud/manage-access/auth0-migration.md
index 21552b4e0ad..0646263bfff 100644
--- a/website/docs/docs/cloud/manage-access/auth0-migration.md
+++ b/website/docs/docs/cloud/manage-access/auth0-migration.md
@@ -5,6 +5,14 @@ sidebar: "SSO Auth0 Migration"
description: "Required actions for migrating to Auth0 for SSO services on dbt Cloud."
---

+:::note
+
+This migration is a feature of the dbt Cloud Enterprise plan.
To learn more about an Enterprise plan, contact us at [sales@getdbt.com](mailto:sales@getdbt.com).
+
+For single-tenant Virtual Private Cloud, you should [email dbt Cloud Support](mailto:support@getdbt.com) to set up or update your SSO configuration.
+
+:::
+
 dbt Labs is partnering with Auth0 to bring enhanced features to dbt Cloud's single sign-on (SSO) capabilities. Auth0 is an identity and access management (IAM) platform with advanced security features, and it will be leveraged by dbt Cloud. These changes will require some action from customers with SSO configured in dbt Cloud today, and this guide will outline the necessary changes for each environment.

 If you have not yet configured SSO in dbt Cloud, refer instead to our setup guides for [SAML](/docs/cloud/manage-access/set-up-sso-saml-2.0), [Okta](/docs/cloud/manage-access/set-up-sso-okta), [Google Workspace](/docs/cloud/manage-access/set-up-sso-google-workspace), or [Microsoft Entra ID (formerly Azure AD)](/docs/cloud/manage-access/set-up-sso-microsoft-entra-id) single sign-on services.
diff --git a/website/docs/docs/cloud/secure/environment-permissions-setup.md b/website/docs/docs/cloud/secure/environment-permissions-setup.md
new file mode 100644
index 00000000000..01ad3e6aef0
--- /dev/null
+++ b/website/docs/docs/cloud/secure/environment-permissions-setup.md
@@ -0,0 +1,65 @@
+---
+title: "Set up environment-level permissions"
+id: environment-permissions-setup
+description: "Set up environment-level permissions to protect your information"
+sidebar_label: "Set up environment-level permissions"
+pagination_next: null
+pagination_prev: null
+---
+
+# Set up environment-level permissions
+
+:::note
+
+This is a beta feature available to select dbt Cloud Enterprise customers. If you are interested in beta testing this feature, please contact your account manager.
+
+:::
+
+To set up and configure environment-level permissions, you must have write permissions to the **Groups & Licenses** settings of your dbt Cloud account. For more information about roles and permissions, check out [User permissions and licenses](/docs/cloud/manage-access/seats-and-users).
+
+Environment-level permissions are not the same as account-level [role-based access control (RBAC)](/docs/cloud/manage-access/about-user-access#role-based-access-control) and are configured separately from those workflows.
+
+## Setup instructions
+
+In your dbt Cloud account:
+
+1. Open the **gear menu** and select **Account settings**. From the left-side menu, select **Groups & Licenses**. While you can edit existing groups, we recommend not altering the default `Everyone`, `Member`, and `Owner` groups.
+
+
+
+2. Create a new group or open an existing one. If it's a new group, give it a name, then scroll down to **Access & permissions**. Click **Add**.
+
+
+
+3. Select the **Permission set** for the group. Only the following permission sets can have environment-level permissions configured:
+
+- Database admin
+- Git admin
+- Team admin
+- Analyst
+- Developer
+
+Other permission sets are restricted because they have access to everything (for example, Account admin), or limitations prevent them from having write access to environments (for example, Account viewer).
+
+If you select a permission set that is not supported, the environment permission option will not appear.
+
+
+
+4. Select the **Environment** for group access. The default is **All environments**, but you can select multiple. If none are selected, the group will have read-only access.
Note that `Other` maps to the `General` environment type.
+
+
+
+5. Save the Group settings. You're now set up and ready to assign users!
+
+## User experience
+
+Users with permissions to the environment will see all capabilities assigned to their role. The environment-level permissions are `write` or `read-only` access. This feature does not currently support determining which features in the environment are accessible. For more details on what can and cannot be done with environment-level permissions, refer to [About environment-level permissions](/docs/cloud/secure/environment-permissions).
+
+For example, here is an overview of the **Jobs** section of the environment page if a user has been granted access:
+
+
+
+The same page if the user has not been granted environment-level permissions:
+
+
+
diff --git a/website/docs/docs/cloud/secure/environment-permissions.md b/website/docs/docs/cloud/secure/environment-permissions.md
new file mode 100644
index 00000000000..b54a48d04e5
--- /dev/null
+++ b/website/docs/docs/cloud/secure/environment-permissions.md
@@ -0,0 +1,88 @@
+---
+title: "About environment-level permissions"
+id: environment-permissions
+description: "About environment-level permissions to protect your information"
+sidebar_label: "Environment-level permissions"
+pagination_next: null
+pagination_prev: null
+---
+
+# About environment-level permissions
+
+:::note
+
+This is a beta feature available to select dbt Cloud Enterprise customers. If you are interested in beta testing this feature, please contact your account manager.
+
+:::
+
+Environment-level permissions give dbt Cloud admins the ability to grant write permission to groups and service tokens for specific [environment types](/docs/dbt-cloud-environments) within a project. Granting access to an environment gives users access to all environment-level write actions and resources associated with their assigned roles. For example, users with a Developer role can create and run jobs within the environment(s) they have access to. For all other environments, those same users will have read-only access.
+
+For configuration instructions, check out the [setup page](/docs/cloud/secure/environment-permissions-setup).
+
+## Current limitations
+
+Environment-level permissions give dbt Cloud admins more flexibility to protect their environments, but it's important to understand that there are some limitations to this feature, so those admins can make informed decisions about granting access.
+
+- Environment-level permissions do not allow you to create custom roles and permissions for each resource type in dbt Cloud.
+- You can only select environment types, and can’t specify a particular environment within a project.
+- You can't select specific resources within environments. dbt Cloud jobs, runs, and environment variables are all environment resources.
+  - For example, you can't specify that a user only has access to jobs but not environment variables. Access to a given environment gives the user access to everything within that environment.
+
+## Environments and roles
+
+dbt Cloud has four different environment types per project:
+
+- **Production** — Primary deployment environment. Only one unique Production environment per project.
+- **Development** — Developer testing environment. Only one unique Development environment per project.
+- **Staging** — Pre-production environment that sits between development and production. Only one unique Staging environment per project.
+- **General** — Mixed-use environments. No limit on the number per project.
+
+Environment write permissions can be specified for the following roles:
+
+- Analyst
+- Database admin
+- Developer (Previous default write access for all environments. The new default is read access for environments unless access is specified)
+- Git admin
+- Team admin
+
+Depending on your current group mappings, you may have to update roles to ensure users have the correct access level to environments.
+
+Determine what personas need updated environment access and the roles they should be mapped to. The personas below highlight a few scenarios for environment permissions:
+
+- **Developer** — Write access to create/run jobs in the development environment
+- **Testing/QA** — Write access to staging and development environments to test
+- **Production deployment** — Write access to all environments, including production, for deploying
+- **Analyst** — Doesn't need write access to environments, only read-only access for discovery and troubleshooting
+- **Other admins** — These admins may need write access to create/run jobs or configure integrations for any number of environments
+
+## Projects and environments
+
+Environment-level permissions can be enforced over one or multiple projects with mixed access to the environments themselves.
+
+### Single project environments
+
+If you’re working with a single project, we recommend restricting access to the Production environment and ensuring groups have access to Development, Staging, or General environments where they can safely create and run jobs. The following is an example of how the personas could be mapped to roles:
+
+- **Developer:** Developer role with write access to Development and General environments
+- **Testing/QA:** Developer role with write access to Development, Staging, and General environments
+- **Production Deployment:** Developer role with write access to all environments, or the Job Admin role, which has access to all environments by default.
+- **Analyst:** Analyst role with no write access and read-only access to environments.
+- **Other Admins:** Depends on the admin needs. For example, if they are managing the production deployment, grant access to all environments.
+
+### Multiple projects
+
+Let's say Acme Corp has 12 projects: 3 of them belong to Finance, 3 belong to Marketing, 4 belong to Manufacturing, and 2 belong to Technology.
+
+With mixed access across projects:
+
+- **Developer:** If the user has the Developer role and access to Projects A, B, and C, then they only need access to the Development and General environments.
+- **Testing/QA:** If they have the Developer role and access to Projects A, B, and C, then they only need access to the Development, Staging, and General environments.
+- **Production Deployment:** If the user has the Admin _or_ Developer role _and_ access to Projects A, B, and C, then they need access to all environments.
+- **Analyst:** If the user has the Analyst role, then they need _no_ write access to _any environment_.
+- **Other Admins:** A user (non-Admin) can have access to multiple projects depending on the requirements.
+
+If the user has the same roles across projects, you can apply environment access across all projects.
+
+
+## Related docs
+- [Environment-level permissions setup](/docs/cloud/secure/environment-permissions-setup)
\ No newline at end of file
diff --git a/website/docs/docs/collaborate/access-from-dbt-cloud.md b/website/docs/docs/collaborate/access-from-dbt-cloud.md
new file mode 100644
index 00000000000..47f7e319ba6
--- /dev/null
+++ b/website/docs/docs/collaborate/access-from-dbt-cloud.md
@@ -0,0 +1,38 @@
+---
+title: "Access dbt Explorer from dbt Cloud features"
+sidebar_label: "Access from dbt Cloud"
+description: "Learn where and how to directly access and interact with dbt Explorer from dbt Cloud features and products."
+---
+
+Access dbt Explorer from other features and products inside dbt Cloud, ensuring you have a seamless experience navigating between resources and lineage in your project.
+
+This page explains how to access dbt Explorer from various dbt Cloud features, including the dbt Cloud IDE and jobs. While the primary way to navigate to dbt Explorer is through the **Explore** link in the navigation, you can also access it from other dbt Cloud features.
+
+### dbt Cloud IDE
+You can enhance your project navigation and editing experience by directly accessing resources from the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) to dbt Explorer for model, seed, or snapshot files. This workflow offers a seamless transition between the IDE and Explorer, allowing you to quickly navigate between viewing project metadata and making updates to your models or other resources without switching contexts.
+
+#### Access dbt Explorer from the IDE
+- In your model, seed, or snapshot file, click the **View in Explorer** icon to the right of your file breadcrumb (under the file name tab).
+- This opens the model, seed, or snapshot file in a new tab, allowing you to view resources and lineage directly in Explorer.
+
+
+
+### Lineage tab in jobs
+The **Lineage tab** in dbt Cloud jobs displays the lineage associated with the [job run](/docs/deploy/jobs). Access dbt Explorer directly from this tab, allowing you to understand the dependencies and relationships of resources in your project.
+
+#### Access dbt Explorer from the lineage tab
+- From a job, select the **Lineage tab**.
+- Double-click the node in the lineage to open a new tab and view its metadata directly in dbt Explorer.
+
+
+
+### Model timing tab in jobs
+The [model timing tab](/docs/deploy/run-visibility#model-timing) in dbt Cloud jobs displays the composition, order, and time taken by each model in a job run.
+
+Access dbt Explorer directly from the **model timing tab**, which helps you investigate resources, diagnose performance bottlenecks, understand dependencies and relationships of slow-running models, and potentially make changes to improve their performance.
+
+#### Access dbt Explorer from the model timing tab
+- From a job, select the **model timing tab**.
+- Hover over a resource and click **View on Explorer** to view the resource metadata directly in dbt Explorer.
+
+
diff --git a/website/docs/docs/collaborate/build-and-view-your-docs.md b/website/docs/docs/collaborate/build-and-view-your-docs.md
new file mode 100644
index 00000000000..ad43795a38c
--- /dev/null
+++ b/website/docs/docs/collaborate/build-and-view-your-docs.md
@@ -0,0 +1,85 @@
+---
+title: "Build and view your docs with dbt Cloud"
+id: "build-and-view-your-docs"
+description: "Automatically generate project documentation as you run jobs."
+pagination_next: null
+---
+
+dbt Cloud enables you to generate documentation for your project and data platform.
The documentation is automatically updated with new information after a fully successful job run, ensuring accuracy and relevance.
+
+The default documentation experience in dbt Cloud is [dbt Explorer](/docs/collaborate/explore-projects), available on [Team or Enterprise plans](https://www.getdbt.com/pricing/). Use [dbt Explorer](/docs/collaborate/explore-projects) to view your project's resources (such as models, tests, and metrics) and their lineage to gain a better understanding of its latest production state.
+
+Refer to [documentation](/docs/build/documentation) for more configuration details.
+
+This shift makes [dbt Docs](#dbt-docs) a legacy documentation feature in dbt Cloud. dbt Docs is still accessible and offers basic documentation, but it doesn't offer the same speed, metadata, or visibility as dbt Explorer. dbt Docs is available on dbt Cloud Developer plans and to dbt Core users.
+
+## Set up a documentation job
+
+dbt Explorer uses the [metadata](/docs/collaborate/explore-projects#generate-metadata) generated after each job run in the production or staging environment, ensuring it always has the latest project results. To view richer metadata, you can set up documentation for a job in dbt Cloud when you edit your job settings or create a new job.
+
+Configure the job to [generate metadata](/docs/collaborate/explore-projects#generate-metadata) when it runs. If you want to view columns and statistics for models, sources, and snapshots in dbt Explorer, then this step is necessary.
+
+To set up a job to generate docs:
+
+1. In the top left, click **Deploy** and select **Jobs**.
+2. Create a new job or select an existing job and click **Settings**.
+3. Under **Execution Settings**, select **Generate docs on run** and click **Save**.
+
+
+*Note: for dbt Docs users, you need to configure the job to generate docs when it runs, then manually link that job to your project. Proceed to [configure project documentation](#configure-project-documentation) so your project generates the documentation when this job runs.*
+
+You can also add the [`dbt docs generate` command](/reference/commands/cmd-docs) to the list of commands in the job run steps. However, you can expect different outcomes when adding the command to the run steps compared to configuring a job by selecting the **Generate docs on run** checkbox.
+
+Review the following options and outcomes:
+
+| Options | Outcomes |
+|--------| ------- |
+| **Select checkbox** | Select the **Generate docs on run** checkbox to automatically generate updated project docs each time your job runs. If that particular step in your job fails, the job can still be successful if all subsequent steps are successful. |
+| **Add as a run step** | Add `dbt docs generate` to the list of commands in the job run steps, in whatever order you prefer. If that particular step in your job fails, the job will fail and all subsequent steps will be skipped. |
+
+:::tip Tip — Documentation-only jobs
+
+To create and schedule documentation-only jobs at the end of your production jobs, add the `dbt compile` command in the **Commands** section.
+
+:::
+
+## dbt Docs
+
+dbt Docs, available on Developer plans and to dbt Core users, generates a website from your dbt project using the `dbt docs generate` command. It provides a central location to view your project's resources, such as models, tests, and lineage — and helps you understand the data in your warehouse.
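+
+For dbt Core users, a minimal sketch of generating the site and previewing it locally (the port number here is arbitrary):
+
+```shell
+# Produce the catalog and manifest artifacts that power the docs site
+dbt docs generate
+
+# Serve the generated site on a local web server (dbt Core only)
+dbt docs serve --port 8001
+```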
+
+### Configure project documentation
+
+You configure project documentation to generate documentation when the job you set up in the previous section runs. In the project settings, specify the job that generates documentation artifacts for that project. Once you configure this setting, subsequent runs of the job will automatically include a step to generate documentation.
+
+1. Click the gear icon in the top right.
+2. Select **Account Settings**.
+3. Navigate to **Projects** and select the project that needs documentation.
+4. Click **Edit**.
+5. Under **Artifacts**, select the job that should generate docs when it runs and click **Save**.
+
+
+:::tip Use dbt Explorer for a richer documentation experience
+For a richer and more interactive experience, try out [dbt Explorer](/docs/collaborate/explore-projects), available on [Team or Enterprise plans](https://www.getdbt.com/pricing/). It includes map layers of your DAG, keyword search, IDE integration, model performance insights, project recommendations, and more.
+:::
+
+### Generating documentation
+
+To generate documentation in the dbt Cloud IDE, run the `dbt docs generate` command in the **Command Bar** in the dbt Cloud IDE. This command will generate the documentation for your dbt project as it exists in development in your IDE session.
+
+After generating your documentation, you can click **Explore** in the navigation. This will take you to dbt Explorer, where you can view your project's resources and their lineage.
+
+
+
+After running `dbt docs generate` in the dbt Cloud IDE, click the icon above the file tree to see the latest version of your documentation rendered in a new browser window.
+
+### View documentation
+
+Once you set up a job to generate documentation for your project, you can click **Explore** in the navigation and then click **dbt Docs**. Your project's documentation should open. This link will always help you find the most recent version of your project's documentation in dbt Cloud.
+
+These generated docs always show the last fully successful run, which means that if any tasks fail, including tests, you will not see changes to the docs from that run.
+
+The dbt Cloud IDE makes it possible to view [documentation](/docs/build/documentation) for your dbt project while your code is still in development. With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production.
+
+## Related docs
+- [Documentation](/docs/build/documentation)
+- [dbt Explorer](/docs/collaborate/explore-projects)
diff --git a/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md b/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md
deleted file mode 100644
index 0129b43f305..00000000000
--- a/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md
+++ /dev/null
@@ -1,68 +0,0 @@
----
-title: "Build and view your docs with dbt Cloud"
-id: "build-and-view-your-docs"
-description: "Automatically generate project documentation as you run jobs."
-pagination_next: null
----
-
-dbt Cloud enables you to generate documentation for your project and data platform, rendering it as a website. The documentation is only updated with new information after a fully successful job run, ensuring accuracy and relevance. Refer to [Documentation](/docs/collaborate/documentation) for more details.
- -## Set up a documentation job - -You can set up documentation for a job in dbt Cloud when you edit your job settings or create a new job. You need to configure the job to generate docs when it runs, then link that job to your project. - -To set up a job to generate docs: - -1. In the top left, click **Deploy** and select **Jobs**. -2. Create a new job or select an existing job and click **Settings**. -3. Under "Execution Settings," select **Generate docs on run**. - - -4. Click **Save**. Proceed to [configure project documentation](#configure-project-documentation) so your project generates the documentation when this job runs. - -You can also add `dbt docs generate` to the list of commands in the job run steps. However, you can expect different outcomes when adding the command to the run steps compared to configuring a job selecting the **Generate docs on run** checkbox (shown in previous steps). - -Review the following options and outcomes: - -| Options | Outcomes | -|--------| ------- | -| **Select checkbox** | Select the **Generate docs on run** checkbox to automatically generate updated project docs each time your job runs. If that particular step in your job fails, the job can still be successful if all subsequent steps are successful. | -| **Add as a run step** | Add `dbt docs generate` to the list of commands in the job run steps, in whatever order you prefer. If that particular step in your job fails, the job will fail and all subsequent steps will be skipped. | - -:::tip Tip — Documentation-only jobs - -To create and schedule documentation-only jobs at the end of your production jobs, add the `dbt compile` command in the **Commands** section. - -::: - -## Configure project documentation - -You configure project documentation to generate documentation when the job you set up in the previous section runs. In the project settings, specify the job that generates documentation artifacts for that project. Once you configure this setting, subsequent runs of the job will automatically include a step to generate documentation. - -1. Click the gear icon in the top right. -2. Select **Account Settings**. -3. Navigate to **Projects** and select the project that needs documentation. -4. Click **Edit**. -5. Under **Artifacts**, select the job that should generate docs when it runs. - -6. Click **Save**. - -## Generating documentation - -To generate documentation in the dbt Cloud IDE, run the `dbt docs generate` command in the -Command Bar in the dbt Cloud IDE. This command will generate the Docs for your dbt project as it exists in development in your IDE session. - - - -After generating your documentation, you can click the **Book** icon above the file tree, to see the latest version of your documentation rendered in a new browser window. - -## Viewing documentation - -Once you set up a job to generate documentation for your project, you can click **Documentation** in the top left. Your project's documentation should open. This link will always help you find the most recent version of your project's documentation in dbt Cloud. - -These generated docs always show the last fully successful run, which means that if you have any failed tasks, including tests, then you will not see changes to the docs by this run. If you don't see a fully successful run, then you won't see any changes to the documentation. - -The dbt Cloud IDE makes it possible to view [documentation](/docs/collaborate/documentation) -for your dbt project while your code is still in development. 
With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production. - - diff --git a/website/docs/docs/collaborate/collaborate-with-others.md b/website/docs/docs/collaborate/collaborate-with-others.md index 7875a8044b6..c8c8bd4657f 100644 --- a/website/docs/docs/collaborate/collaborate-with-others.md +++ b/website/docs/docs/collaborate/collaborate-with-others.md @@ -8,7 +8,7 @@ pagination_prev: null
@@ -26,7 +26,7 @@ pagination_prev: null -
\ No newline at end of file + diff --git a/website/docs/docs/collaborate/dbt-explorer-faqs.md b/website/docs/docs/collaborate/dbt-explorer-faqs.md index c4203bdd031..7533aa8ff99 100644 --- a/website/docs/docs/collaborate/dbt-explorer-faqs.md +++ b/website/docs/docs/collaborate/dbt-explorer-faqs.md @@ -171,3 +171,11 @@ Yes, users with read-only access can use the dbt Explorer. Specific feature avai The ability to embed and share views is being evaluated as a potential future capability.
+ + + +Yes, you can [access dbt Explorer from various dbt Cloud features](/docs/collaborate/access-from-dbt-cloud), ensuring you have a seamless experience navigating between resources and lineage in your project. + +While the primary way to access dbt Explorer is through the **Explore** link in the navigation, you can also access it from the [dbt Cloud IDE](/docs/collaborate/access-from-dbt-cloud#dbt-cloud-ide), [the lineage tab in jobs](/docs/collaborate/access-from-dbt-cloud#lineage-tab-in-jobs), and the [model timing tab in jobs](/docs/collaborate/access-from-dbt-cloud#model-timing-tab-in-jobs). + + diff --git a/website/docs/docs/collaborate/explore-projects.md b/website/docs/docs/collaborate/explore-projects.md index dbd27e1835d..aa549520f34 100644 --- a/website/docs/docs/collaborate/explore-projects.md +++ b/website/docs/docs/collaborate/explore-projects.md @@ -1,8 +1,8 @@ --- -title: "Explore your dbt projects" -sidebar_label: "Explore dbt projects" -description: "Learn about dbt Explorer and how to interact with it to understand, improve, and leverage your data pipelines." -pagination_next: "docs/collaborate/model-performance" +title: "Discover data with dbt Explorer" +sidebar_label: "Discover data with dbt Explorer" +description: "Learn about dbt Explorer and how to interact with it to understand, improve, and leverage your dbt projects." +pagination_next: "docs/collaborate/column-level-lineage" pagination_prev: null --- @@ -12,28 +12,30 @@ With dbt Explorer, you can view your project's [resources](/docs/build/projects) - You have a dbt Cloud account on the [Team or Enterprise plan](https://www.getdbt.com/pricing/). - You have set up a [production](/docs/deploy/deploy-environments#set-as-production-environment) or [staging](/docs/deploy/deploy-environments#create-a-staging-environment) deployment environment for each project you want to explore. - - There has been at least one successful job run in the deployment environment. Note that [CI jobs](/docs/deploy/ci-jobs) do not update dbt Explorer. -- You are on the dbt Explorer page. To do this, select **Explore** from the top navigation bar in dbt Cloud. +- You have at least one successful job run in the deployment environment. Note that [CI jobs](/docs/deploy/ci-jobs) do not update dbt Explorer. +- You are on the dbt Explorer page. To do this, select **Explore** from the navigation in dbt Cloud. + -## Generate metadata +## Generate metadata -dbt Explorer uses the metadata provided by the [Discovery API](/docs/dbt-cloud-apis/discovery-api) to display the details about [the state of your project](/docs/dbt-cloud-apis/project-state). The metadata that's available depends on the [deployment environment](/docs/deploy/deploy-environments) you've designated as _production_ or _staging_ in your dbt Cloud project. dbt Explorer automatically retrieves the metadata updates after each job run in the production or staging deployment environment so it always has the latest results for your project. +dbt Explorer uses the metadata provided by the [Discovery API](/docs/dbt-cloud-apis/discovery-api) to display the details about [the state of your project](/docs/dbt-cloud-apis/project-state). The metadata that's available depends on the [deployment environment](/docs/deploy/deploy-environments) you've designated as _production_ or _staging_ in your dbt Cloud project. -Note that CI jobs do not update dbt Explorer. This is because they don't reflect the production state and don't provide the necessary metadata updates. 
-
-To view a resource and its metadata, you must define the resource in your project and run a job in the production or staging environment. The resulting metadata depends on the [commands](/docs/deploy/job-commands) executed by the jobs.
+- dbt Explorer automatically retrieves the metadata updates after each job run in the production or staging deployment environment so it always has the latest results for your project. This includes deploy and merge jobs.
+- Note that CI jobs do not update dbt Explorer. This is because they don't reflect the production state and don't provide the necessary metadata updates.
+- To view a resource and its metadata, you must define the resource in your project and run a job in the production or staging environment.
+- The resulting metadata depends on the [commands](/docs/deploy/job-commands) executed by the jobs.

 | To view in Explorer | You must successfully run |
 |---------------------|---------------------------|
 | Model lineage, details, or results | [dbt run](/reference/commands/run) or [dbt build](/reference/commands/build) on a given model within a job in the environment |
-| Columns and statistics for models, sources, and snapshots| [dbt docs generate](/reference/commands/cmd-docs) within a job in the environment |
+| Columns and statistics for models, sources, and snapshots| [dbt docs generate](/reference/commands/cmd-docs) within [a job](/docs/collaborate/build-and-view-your-docs) in the environment |
 | Test results | [dbt test](/reference/commands/test) or [dbt build](/reference/commands/build) within a job in the environment |
 | Source freshness results | [dbt source freshness](/reference/commands/source#dbt-source-freshness) within a job in the environment |
 | Snapshot details | [dbt snapshot](/reference/commands/snapshot) or [dbt build](/reference/commands/build) within a job in the environment |
 | Seed details | [dbt seed](/reference/commands/seed) or [dbt build](/reference/commands/build) within a job in the environment |

-Richer and more timely metadata will become available as dbt Cloud evolves.
+Richer and more timely metadata will become available as dbt Cloud evolves.

 ## Explore your project's lineage graph {#project-lineage}

@@ -52,7 +54,7 @@ To explore the lineage graphs of tests and macros, view [their resource details
 - Hover over any item in the graph to display the resource’s name and type.
 - Zoom in and out on the graph by mouse-scrolling.
 - Grab and move the graph and the nodes.
-- Right click on a node (context menu) to:
+- Right-click on a node (context menu) to:
   - Refocus on the node, including its upstream and downstream nodes
   - Refocus on the node and its downstream nodes only
   - Refocus on the node and its upstream nodes only
@@ -184,7 +186,7 @@ In the upper right corner of the resource details page, you can:
 - **Status bar** (below the page title) — Information on the last time the model ran, whether the run was successful, how the data is materialized, number of rows, and the size of the model.
 - **General** tab includes:
   - **Lineage** graph — The model’s lineage graph that you can interact with. The graph includes one upstream node and one downstream node from the model. Click the Expand icon in the graph's upper right corner to view the model in full lineage graph mode.
-  - **Description** section — A [description of the model](/docs/collaborate/documentation#adding-descriptions-to-your-project).
+  - **Description** section — A [description of the model](/docs/build/documentation#adding-descriptions-to-your-project).
- **Recent** section — Information on the last time the model ran, how long it ran for, whether the run was successful, the job ID, and the run ID. - **Tests** section — [Tests](/docs/build/data-tests) for the model, including a status indicator for the latest test status. A :white_check_mark: denotes a passing test. - **Details** section — Key properties like the model’s relation name (for example, how it’s represented and how you can query it in the data platform: `database.schema.identifier`); model governance attributes like access, group, and if contracted; and more. @@ -253,6 +255,7 @@ You can explore the metadata from your production or staging environment to info + ## Related content - [Enterprise permissions](/docs/cloud/manage-access/enterprise-permissions) - [About model governance](/docs/collaborate/govern/about-model-governance) diff --git a/website/docs/docs/collaborate/govern/project-dependencies.md b/website/docs/docs/collaborate/govern/project-dependencies.md index f052db29091..83a2b966ee1 100644 --- a/website/docs/docs/collaborate/govern/project-dependencies.md +++ b/website/docs/docs/collaborate/govern/project-dependencies.md @@ -30,7 +30,7 @@ Refer to the [FAQs](#faqs) for more info. ## Prerequisites In order to add project dependencies and resolve cross-project `ref`, you must: -- Use a supported version of dbt (v1.6, v1.7, or "Keep on latest version") for both the upstream ("producer") project and the downstream ("consumer") project. +- Use a supported version of dbt (v1.6, v1.7, or go versionless with "Keep on latest version") for both the upstream ("producer") project and the downstream ("consumer") project. - Define models in an upstream ("producer") project that are configured with [`access: public`](/reference/resource-configs/access). You need at least one successful job run after defining their `access`. - Define a deployment environment in the upstream ("producer") project [that is set to be your Production environment](/docs/deploy/deploy-environments#set-as-production-environment), and ensure it has at least one successful job run in that environment. - Each project `name` must be unique in your dbt Cloud account. For example, if you have a dbt project (codebase) for the `jaffle_marketing` team, you should not create separate projects for `Jaffle Marketing - Dev` and `Jaffle Marketing - Prod`. That isolation should instead be handled at the environment level. To that end, we are working on adding support for environment-level permissions and data warehouse connections; reach out to your dbt Labs account team for beta access in May/June 2024. diff --git a/website/docs/docs/dbt-cloud-apis/discovery-api.md b/website/docs/docs/dbt-cloud-apis/discovery-api.md index 438cf431060..0345c647dd9 100644 --- a/website/docs/docs/dbt-cloud-apis/discovery-api.md +++ b/website/docs/docs/dbt-cloud-apis/discovery-api.md @@ -50,7 +50,8 @@ Use the API to find and understand dbt assets in integrated tools using informat Data producers must manage and organize data for stakeholders, while data consumers need to quickly and confidently analyze data on a large scale to make informed decisions that improve business outcomes and reduce organizational overhead. The API is useful for discovery data experiences in catalogs, analytics, apps, and machine learning (ML) tools. It can help you understand the origin and meaning of datasets for your analysis. 
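+
+For instance, a minimal sketch of a Discovery API (GraphQL) query that lists models in an environment's applied state along with their last run status. The field names here assume the current schema, so verify them against the API's schema explorer before relying on them:
+
+```graphql
+query ($environmentId: BigInt!, $first: Int!) {
+  environment(id: $environmentId) {
+    applied {
+      models(first: $first) {
+        edges {
+          node {
+            name
+            uniqueId
+            executionInfo {
+              lastRunStatus
+            }
+          }
+        }
+      }
+    }
+  }
+}
+```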
- + + diff --git a/website/docs/docs/dbt-cloud-apis/project-state.md b/website/docs/docs/dbt-cloud-apis/project-state.md index b424c6c29c8..71b367cb5ad 100644 --- a/website/docs/docs/dbt-cloud-apis/project-state.md +++ b/website/docs/docs/dbt-cloud-apis/project-state.md @@ -59,18 +59,18 @@ Most Discovery API use cases will favor the _applied state_ since it pertains to ## Affected states by node type -| Node | Executed in DAG | Created by execution | Exists in database | Lineage | States | -|-----------|------------------|----------------------|--------------------|-----------------------|----------------------| -| Model | Yes | Yes | Yes | Upstream & downstream | Applied & definition | -| Source | Yes | No | Yes | Downstream | Applied & definition | -| Seed | Yes | Yes | Yes | Downstream | Applied & definition | -| Snapshot | Yes | Yes | Yes | Upstream & downstream | Applied & definition | -| Test | Yes | Yes | No | Upstream | Applied & definition | -| Exposure | No | No | No | Upstream | Definition | -| Metric | No | No | No | Upstream & downstream | Definition | -| Semantic model | No | No | No | Upstream & downstream | Definition | -| Group | No | No | No | Downstream | Definition | -| Macro | Yes | No | No | N/A | Definition | +| Node | Executed in DAG | Created by execution | Exists in database | Lineage | States | +|-----------------------------------------------|------------------|----------------------|--------------------|-----------------------|----------------------| +| [Model](/docs/build/models) | Yes | Yes | Yes | Upstream & downstream | Applied & definition | +| [Source](/docs/build/sources) | Yes | No | Yes | Downstream | Applied & definition | +| [Seed](/docs/build/seeds) | Yes | Yes | Yes | Downstream | Applied & definition | +| [Snapshot](/docs/build/snapshots) | Yes | Yes | Yes | Upstream & downstream | Applied & definition | +| [Data test](/docs/build/data-tests) | Yes | Yes | No | Upstream | Applied & definition | +| [Exposure](/docs/build/exposures) | No | No | No | Upstream | Definition | +| [Metric](/docs/build/metrics-overview) | No | No | No | Upstream & downstream | Definition | +| [Semantic model](/docs/build/semantic-models) | No | No | No | Upstream & downstream | Definition | +| [Group](/docs/build/groups) | No | No | No | Downstream | Definition | +| [Macro](/docs/build/jinja-macros) | Yes | No | No | N/A | Definition | ## Caveats about state/metadata updates diff --git a/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.8.md b/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.8.md index 5f7ee79f562..a6a39da45ca 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.8.md +++ b/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.8.md @@ -17,9 +17,7 @@ dbt Labs is committed to providing backward compatibility for all versions 1.x, ## Keep on latest version -dbt Cloud is going "versionless." This means you'll automatically get early access to new features and functionality before they're available in final releases of dbt Core. - -Select ["Keep on latest version"](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) in your development, staging, and production [environments](/docs/deploy/deploy-environments) to access to everything in dbt Core v1.8 and more. +With dbt Cloud, you can get early access to many new features and functionality before they're in the Generally Available (GA) release of dbt Core v1.8 without the need to manage version upgrades. 
Refer to the [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) setting for more details.

 ## New and changed features and functionality
diff --git a/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.5.md b/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.5.md
index 7f78386ff7e..6139cdcfc6f 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.5.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.5.md
@@ -24,7 +24,7 @@ dbt Labs is committed to providing backward compatibility for all versions 1.x,

 :::info Why changes to previous behavior?

-This release includes significant new features, and rework to `dbt-core`'s CLI and initialization flow. As part of refactoring its internals, we made a handful of changes to runtime configuration. The net result of these changes is more consistent & practical configuration options, and a more legible codebase.
+This release includes significant new features and a rework of `dbt-core`'s CLI and initialization flow. As part of refactoring its internals from [`argparse`](https://docs.python.org/3/library/argparse.html) to [`click`](https://click.palletsprojects.com), we made a handful of changes to runtime configuration. The net result of these changes is more consistent and practical configuration options, and a more legible codebase.

 **_Wherever possible, we will provide backward compatibility and deprecation warnings for at least one minor version before actually removing the old functionality._** In those cases, we still reserve the right to fully remove backwards compatibility for deprecated functionality in a future v1.x minor version of `dbt-core`.

diff --git a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md
index f14fd03a534..38cc7c69b6a 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md
@@ -69,7 +69,7 @@ can override schema test definitions
 - [`full_refresh` config](/reference/resource-configs/full_refresh)

 **Docs**
-- [project-level overviews](/docs/collaborate/documentation#custom-project-level-overviews)
+- [project-level overviews](/docs/build/documentation#custom-project-level-overviews)

 **Redshift**
 - [`iam_profile`](/docs/core/connect-data-platform/redshift-setup#specifying-an-iam-profile)
diff --git a/website/docs/docs/dbt-versions/core-versions.md b/website/docs/docs/dbt-versions/core-versions.md
index 65d63fa9196..f94a4c0cdf3 100644
--- a/website/docs/docs/dbt-versions/core-versions.md
+++ b/website/docs/docs/dbt-versions/core-versions.md
@@ -8,9 +8,9 @@ pagination_prev: null

 dbt Core releases follow [semantic versioning](https://semver.org/) guidelines. For more on how we use semantic versions, see [How dbt Core uses semantic versioning](#how-dbt-core-uses-semantic-versioning).

-:::tip Keep on latest version, always
+:::tip Go versionless and stay up to date, always

-_Did you know that you can always be working on the latest version?_
+_Did you know that you can always be working with the latest features and functionality?_

 With dbt Cloud, you can get early access to new functionality before it becomes available in dbt Core and without the need of managing your own version upgrades.
Refer to the [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) setting for details. diff --git a/website/docs/docs/dbt-versions/product-lifecycles.md b/website/docs/docs/dbt-versions/product-lifecycles.md index b45d258ba55..313be55228a 100644 --- a/website/docs/docs/dbt-versions/product-lifecycles.md +++ b/website/docs/docs/dbt-versions/product-lifecycles.md @@ -17,7 +17,7 @@ dbt Cloud features all fall into one of the following categories: - **Beta:** Beta features are still in development and are only available to select customers. To join a beta, there might be a signup form or dbt Labs may contact specific customers about testing. Some features can be activated by enabling [experimental features](/docs/dbt-versions/experimental-features) in your account. Beta features are incomplete and might not be entirely stable; they should be used at the customer’s risk, as breaking changes could occur. Beta features might not be fully documented, technical support is limited, and service level objectives (SLOs) might not be provided. Download the [Beta Features Terms and Conditions](/assets/beta-tc.pdf) for more details. - **Preview:** Preview features are stable and considered functionally ready for production deployments. Some planned additions and modifications to feature behaviors could occur before they become generally available. New functionality that is not backward compatible could also be introduced. Preview features include documentation, technical support, and service level objectives (SLOs). Features in preview are provided at no extra cost, although they might become paid features when they become generally available. -- **Generally available (GA):** Generally available features provide stable features introduced to all qualified dbt Cloud accounts. Service level agreements (SLAs) apply to GA features, including documentation and technical support. Certain GA feature availability is determined by the dbt version of the environment. To always receive the latest GA features, ensure your dbt Cloud [environments](/docs/dbt-cloud-environments) are set to ["Keep on latest version"](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version). +- **Generally available (GA):** Generally available features provide stable features introduced to all qualified dbt Cloud accounts. Service level agreements (SLAs) apply to GA features, including documentation and technical support. Certain GA feature availability is determined by the dbt version of the environment. To always receive the latest GA features, go versionless and ensure your dbt Cloud [environments](/docs/dbt-cloud-environments) are set to ["Keep on latest version"](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version). - **Deprecated:** Features in this state are no longer being developed or enhanced by dbt Labs. They will continue functioning as-is, and their documentation will persist until their removal date. However, they are no longer subject to technical support. - **Removed:** Removed features are no longer available on the platform in any capacity. 
diff --git a/website/docs/docs/dbt-versions/release-notes.md b/website/docs/docs/dbt-versions/release-notes.md
index a0e42c6e068..8621a07831a 100644
--- a/website/docs/docs/dbt-versions/release-notes.md
+++ b/website/docs/docs/dbt-versions/release-notes.md
@@ -95,7 +95,7 @@ The following features are new or enhanced as part of our [dbt Cloud Launch Show

 The **Keep on latest version** setting is now Generally Available (previously Public Preview).

- When the new **Keep on latest version** setting is enabled, you always get the latest fixes and early access to new functionality for your dbt project. dbt Labs will handle upgrades behind-the-scenes, as part of testing and redeploying the dbt Cloud application — just like other dbt Cloud capabilities and other SaaS tools that you're using. No more manual upgrades and no more need for _a second sandbox project_ just to try out new features in development.
+ When the new **Keep on latest version** setting is enabled, you get a versionless experience with the latest features and early access to new functionality for your dbt project. dbt Labs will handle upgrades behind-the-scenes, as part of testing and redeploying the dbt Cloud application — just like other dbt Cloud capabilities and other SaaS tools that you're using. No more manual upgrades and no more need for _a second sandbox project_ just to try out new features in development.

 To learn more about the new setting, refer to [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) for details.

diff --git a/website/docs/docs/dbt-versions/upgrade-dbt-version-in-cloud.md b/website/docs/docs/dbt-versions/upgrade-dbt-version-in-cloud.md
index 6c070da1b9c..81d110fae41 100644
--- a/website/docs/docs/dbt-versions/upgrade-dbt-version-in-cloud.md
+++ b/website/docs/docs/dbt-versions/upgrade-dbt-version-in-cloud.md
@@ -7,15 +7,15 @@ In dbt Cloud, both [jobs](/docs/deploy/jobs) and [environments](/docs/dbt-cloud-

 ## Environments

-Navigate to the settings page of an environment, then click **Edit**. Click the **dbt version** dropdown bar and make your selection. You can select a previous release of dbt Core or [Keep on latest version](#keep-on-latest-version). Be sure to save your changes before navigating away.
+Navigate to the settings page of an environment, then click **Edit**. Click the **dbt version** dropdown and make your selection. You can select a previous release of dbt Core or go versionless by selecting [**Keep on latest version**](#keep-on-latest-version) (recommended). Be sure to save your changes before navigating away.

 ### Keep on latest version

-By choosing to **Keep on latest version**, you always get the latest fixes and early access to new functionality for your dbt project. dbt Labs will handle upgrades for you, as part of testing and redeploying the dbt Cloud SaaS application. Keep on latest version always includes the most recent version of dbt Core, and more.
+By choosing to **Keep on latest version**, you opt for a versionless experience that provides the latest features and early access to new functionality for your dbt project. dbt Labs will handle upgrades for you, as part of testing and redeploying the dbt Cloud SaaS application. Keep on latest version always includes the most recent version of dbt Core, and more.
As a best practice, dbt Labs recommends that you test the upgrade in development first; use the [Override dbt version](#override-dbt-version) setting to test _your_ project on the latest dbt version before upgrading your deployment environments and the default development environment for all your colleagues.
+You can upgrade to **Keep on latest version** and the versionless experience no matter which version of dbt you currently have selected. As a best practice, dbt Labs recommends that you test the upgrade in development first; use the [Override dbt version](#override-dbt-version) setting to test _your_ project on the latest dbt version before upgrading your deployment environments and the default development environment for all your colleagues.

 ### Override dbt version

diff --git a/website/docs/docs/deploy/artifacts.md b/website/docs/docs/deploy/artifacts.md
index 9b3ae71e79c..cff36bfafba 100644
--- a/website/docs/docs/deploy/artifacts.md
+++ b/website/docs/docs/deploy/artifacts.md
@@ -4,13 +4,23 @@ id: "artifacts"
 description: "Use artifacts to power your automated docs site and source freshness data."
 ---

-When running dbt jobs, dbt Cloud generates and saves *artifacts*. You can use these artifacts, like `manifest.json`, `catalog.json`, and `sources.json` to power different aspects of dbt Cloud, namely: [dbt Docs](/docs/collaborate/documentation) and [source freshness reporting](/docs/build/sources#snapshotting-source-data-freshness).
+When running dbt jobs, dbt Cloud generates and saves *artifacts*. You can use these artifacts, like `manifest.json`, `catalog.json`, and `sources.json`, to power different aspects of dbt Cloud, namely: [dbt Explorer](/docs/collaborate/explore-projects), [dbt Docs](/docs/collaborate/build-and-view-your-docs#dbt-docs), and [source freshness reporting](/docs/build/sources#snapshotting-source-data-freshness).

 ## Create dbt Cloud Artifacts

-While running any job can produce artifacts, you should only associate one production job with a given project to produce the project's artifacts. You can designate this connection in the **Project details** page. To access this page, click the gear icon in the upper right, select **Account Settings**, select your project, and click **Edit** in the lower right. Under **Artifacts**, select the jobs you want to produce documentation and source freshness artifacts for.
+[dbt Explorer](/docs/collaborate/explore-projects#generate-metadata) uses the metadata provided by the [Discovery API](/docs/dbt-cloud-apis/discovery-api) to display the details about [the state of your project](/docs/dbt-cloud-apis/project-state). It uses metadata from your staging and production [deployment environments](/docs/deploy/deploy-environments) (development environment metadata is coming soon).

-
+dbt Explorer automatically retrieves the metadata updates after each job run in the production or staging deployment environment, so it always has the latest results for your project.
+
+To view a resource, its metadata, and what commands are needed, refer to [generate metadata](/docs/collaborate/explore-projects#generate-metadata) for more details.
+
+
+
+The following steps are for legacy dbt Docs only. For the current documentation experience, see [dbt Explorer](/docs/collaborate/explore-projects).
+
+While running any job can produce artifacts, you should only associate one production job with a given project to produce the project's artifacts.
You can designate this connection on the **Project details** page. To access this page, click the gear icon in the upper right, select **Account Settings**, select your project, and click **Edit** in the lower right. Under **Artifacts**, select the jobs you want to produce documentation and source freshness artifacts for.
+
+

 If you don't see your job listed, you might need to edit the job and select **Run source freshness** and **Generate docs on run**.

@@ -18,17 +28,30 @@ If you don't see your job listed, you might need to edit the job and select **Ru

 When you add a production job to a project, dbt Cloud updates the content and provides links to the production documentation and source freshness artifacts it generated for that project. You can see these links by clicking **Deploy** in the upper left, selecting **Jobs**, and then selecting the production job. From the job page, you can select a specific run to see how artifacts were updated for that run only.

+
+
 ### Documentation

-When set up, dbt Cloud updates the **Documentation** link in the header tab so it links to documentation for this job. This link always directs you to the latest version of the documentation for your project.
+Navigate to [dbt Explorer](/docs/collaborate/explore-projects) through the **Explore** link to view your project's resources and lineage, and gain a better understanding of its latest production state.

-Note that both the job's commands and the docs generate step (triggered by the **Generate docs on run** checkbox) must succeed during the job invocation for the project-level documentation to be populated or updated.
+To view a resource, its metadata, and what commands are needed, refer to [generate metadata](/docs/collaborate/explore-projects#generate-metadata) for more details.

+Both the job's commands and the docs generate step (triggered by the **Generate docs on run** checkbox) must succeed during the job invocation to update the documentation.
-
+
+
+When set up, dbt Cloud updates the Documentation link in the header tab so it links to documentation for this job. This link always directs you to the latest version of the documentation for your project.
+
+

 ### Source Freshness

-As with Documentation, configuring a job for the Source Freshness artifact setting also updates the Data Sources link under **Deploy**. The new link points to the latest Source Freshness report for the selected job.
+To view the latest source freshness result, refer to [generate metadata](/docs/collaborate/explore-projects#generate-metadata) for more details. Then navigate to dbt Explorer through the **Explore** link.
+
+
+
+Configuring a job for the Source Freshness artifact setting also updates the data source link under **Deploy**. The new link points to the latest Source Freshness report for the selected job.
+
+

diff --git a/website/docs/docs/deploy/deploy-environments.md b/website/docs/docs/deploy/deploy-environments.md
index 8403fd93d91..8e25803cced 100644
--- a/website/docs/docs/deploy/deploy-environments.md
+++ b/website/docs/docs/deploy/deploy-environments.md
@@ -110,20 +110,20 @@ We recommend that the data warehouse credentials be for a dedicated user or serv

 This section determines the exact location in your warehouse dbt should target when building warehouse objects! This section will look a bit different depending on your warehouse provider.

-For all warehouses, use [extended attributes](/docs/deploy/deploy-environments#extended-attributes) to override missing or inactive (grayed-out) settings.
+For all warehouses, use [extended attributes](/docs/dbt-cloud-environments#extended-attributes) to override missing or inactive (grayed-out) settings.
-This section will not appear if you are using Postgres, as all values are inferred from the project's connection. Use [extended attributes](/docs/deploy/deploy-environments#extended-attributes) to override these values. +This section will not appear if you are using Postgres, as all values are inferred from the project's connection. Use [extended attributes](/docs/dbt-cloud-environments#extended-attributes) to override these values.
-This section will not appear if you are using Redshift, as all values are inferred from the project's connection. Use [extended attributes](/docs/deploy/deploy-environments#extended-attributes) to override these values. +This section will not appear if you are using Redshift, as all values are inferred from the project's connection. Use [extended attributes](/docs/dbt-cloud-environments#extended-attributes) to override these values.
@@ -141,13 +141,13 @@ This section will not appear if you are using Redshift, as all values are inferr
-This section will not appear if you are using Bigquery, as all values are inferred from the project's connection. Use [extended attributes](/docs/deploy/deploy-environments#extended-attributes) to override these values.
+This section will not appear if you are using BigQuery, as all values are inferred from the project's connection. Use [extended attributes](/docs/dbt-cloud-environments#extended-attributes) to override these values.
-This section will not appear if you are using Spark, as all values are inferred from the project's connection. Use [extended attributes](/docs/deploy/deploy-environments#extended-attributes) to override these values. +This section will not appear if you are using Spark, as all values are inferred from the project's connection. Use [extended attributes](/docs/dbt-cloud-environments#extended-attributes) to override these values.
@@ -168,6 +168,8 @@ This section will not appear if you are using Spark, as all values are inferred This section allows you to determine the credentials that should be used when connecting to your warehouse. The authentication methods may differ depending on the warehouse and dbt Cloud tier you are on. +For all warehouses, use [extended attributes](/docs/dbt-cloud-environments#extended-attributes) to override missing or inactive (grayed-out) settings. For credentials, we recommend wrapping extended attributes in [environment variables](/docs/build/environment-variables) (`password: '{{ env_var(''DBT_ENV_SECRET_PASSWORD'') }}'`) to avoid displaying the secret value in the text box and the logs. +
@@ -221,6 +223,8 @@ This section allows you to determine the credentials that should be used when co - **Dataset**: Target dataset +Use [extended attributes](/docs/dbt-cloud-environments#extended-attributes) to override missing or inactive (grayed-out) settings. For credentials, we recommend wrapping extended attributes in [environment variables](/docs/build/environment-variables) (`password: '{{ env_var(''DBT_ENV_SECRET_PASSWORD'') }}'`) to avoid displaying the secret value in the text box and the logs. +
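+As a sketch of how this looks in practice, the following extended attributes snippet wraps the secret in the `DBT_ENV_SECRET_PASSWORD` variable mentioned above (the `user` value is a placeholder, and valid keys depend on your adapter's `profiles.yml` schema):
+
+```yml
+# Any YAML attributes the adapter accepts in profiles.yml can be set here.
+# The doubled single quotes are YAML escaping for a quote inside a
+# single-quoted string; the value resolves from the environment at runtime.
+user: dbt_cloud_service_account
+password: '{{ env_var(''DBT_ENV_SECRET_PASSWORD'') }}'
+```
+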
diff --git a/website/docs/docs/deploy/deployment-tools.md b/website/docs/docs/deploy/deployment-tools.md
index cca2368f38a..81c798b7d8c 100644
--- a/website/docs/docs/deploy/deployment-tools.md
+++ b/website/docs/docs/deploy/deployment-tools.md
@@ -5,13 +5,13 @@ sidebar_label: "Integrate with other tools"
pagination_next: null
---
-Alongside [dbt Cloud](/docs/deploy/jobs), discover other ways to schedule and run your dbt jobs with the help of tools such as Airflow, Prefect, Dagster, automation server, Cron, and Azure Data Factory (ADF),
+Alongside [dbt Cloud](/docs/deploy/jobs), discover other ways to schedule and run your dbt jobs with the help of tools such as the ones described on this page.
Build and install these tools to automate your data workflows, trigger dbt jobs (including those hosted on dbt Cloud), and enjoy a hassle-free experience, saving time and increasing efficiency.
## Airflow
-If your organization is using [Airflow](https://airflow.apache.org/), there are a number of ways you can run your dbt jobs, including:
+If your organization uses [Airflow](https://airflow.apache.org/), there are a number of ways you can run your dbt jobs, including:
@@ -33,9 +33,13 @@ Invoking dbt Core jobs through the [BashOperator](https://registry.astronomer.io
For more details on both of these methods, including example implementations, check out [this guide](https://docs.astronomer.io/learn/airflow-dbt-cloud).
+## Automation servers
+
+Automation servers (such as CodeDeploy, GitLab CI/CD ([video](https://youtu.be/-XBIIY2pFpc?t=1301)), Bamboo, and Jenkins) can be used to schedule bash commands for dbt. They also provide a UI for viewing command-line logs, and integrate with your git repository.
+
## Azure Data Factory
-Integrate dbt Cloud and [Azure Data Factory](https://learn.microsoft.com/en-us/azure/data-factory/) (ADF) for a smooth data process, from data ingestion to data transformation. You can seamlessly trigger dbt Cloud jobs upon completion of ingestion jobs by using the [dbt API](/docs/dbt-cloud-apis/overview) in ADF. Need help building this out? [Contact us](https://www.getdbt.com/contact/) today!
+Integrate dbt Cloud and [Azure Data Factory](https://learn.microsoft.com/en-us/azure/data-factory/) (ADF) for a smooth data process, from data ingestion to data transformation. You can seamlessly trigger dbt Cloud jobs upon completion of ingestion jobs by using the [dbt API](/docs/dbt-cloud-apis/overview) in ADF.
The following video provides you with a detailed overview of how to trigger a dbt Cloud job via the API in Azure Data Factory.
@@ -53,10 +57,42 @@ To use the dbt API to trigger a job in dbt Cloud through ADF:
5. Trigger the pipeline in ADF to start the dbt Cloud job and monitor the status of the dbt Cloud job in ADF.
6. In dbt Cloud, you can check the status of the job and how it was triggered in dbt Cloud.
+## Cron
+
+Cron is a decent way to schedule bash commands. However, while it may seem like an easy route to schedule a job, writing code to take care of all of the additional features associated with a production deployment often makes this route more complex compared to other options listed here.
+
+## Dagster
+
+If your organization uses [Dagster](https://dagster.io/), you can use the [dagster_dbt](https://docs.dagster.io/_apidocs/libraries/dagster-dbt) library to integrate dbt commands into your pipelines. This library supports the execution of dbt through dbt Cloud or dbt Core. Running dbt from Dagster automatically aggregates metadata about your dbt runs.
Refer to the [example pipeline](https://dagster.io/blog/dagster-dbt) for details.
+
+## Databricks workflows
+
+Use Databricks workflows to call the dbt Cloud job API, which has several benefits such as integration with other ETL processes, utilizing dbt Cloud job features, separation of concerns, and custom job triggering based on custom conditions or logic. These advantages lead to more modularity, efficient debugging, and flexibility in scheduling dbt Cloud jobs.
+
+For more info, refer to the guide on [Databricks workflows and dbt Cloud jobs](/guides/how-to-use-databricks-workflows-to-run-dbt-cloud-jobs).
+
+## Kestra
+
+If your organization uses [Kestra](http://kestra.io/), you can leverage the [dbt plugin](https://kestra.io/plugins/plugin-dbt) to orchestrate dbt Cloud and dbt Core jobs. Kestra's user interface (UI) has built-in [Blueprints](https://kestra.io/docs/user-interface-guide/blueprints), providing ready-to-use workflows. Navigate to the Blueprints page in the left navigation menu and [select the dbt tag](https://demo.kestra.io/ui/blueprints/community?selectedTag=36) to find several examples of scheduling dbt Core commands and dbt Cloud jobs as part of your data pipelines. After each scheduled or ad-hoc workflow execution, the Outputs tab in the Kestra UI allows you to download and preview all dbt build artifacts. The Gantt and Topology views additionally render the metadata to visualize dependencies and runtimes of your dbt models and tests. The dbt Cloud task provides convenient links to easily navigate between Kestra and dbt Cloud UI.
+
+## Orchestra
+
+If your organization uses [Orchestra](https://getorchestra.io), you can trigger dbt jobs using the dbt Cloud API. Create an API token from your dbt Cloud account and use this to authenticate Orchestra in the [Orchestra Portal](https://app.getorchestra.io). For details, refer to the [Orchestra docs on dbt Cloud](https://orchestra-1.gitbook.io/orchestra-portal/integrations/transformation/dbt-cloud).
+
+Orchestra automatically collects metadata from your runs so you can view your dbt jobs in the context of the rest of your data stack.
+
+The following is an example of the run details in dbt Cloud for a job triggered by Orchestra:
+
+
+The following is an example of viewing lineage in Orchestra for dbt jobs:
+
+
+
## Prefect
-If your organization is using [Prefect](https://www.prefect.io/), the way you will run your jobs depends on the dbt version you're on, and whether you're orchestrating dbt Cloud or dbt Core jobs. Refer to the following variety of options:
+If your organization uses [Prefect](https://www.prefect.io/), the way you will run your jobs depends on the dbt version you're on, and whether you're orchestrating dbt Cloud or dbt Core jobs. Refer to the following options:
@@ -106,30 +142,6 @@ If your organization is using [Prefect](https://www.prefect.io/), the way you wi
-## Dagster
-
-If your organization is using [Dagster](https://dagster.io/), you can use the [dagster_dbt](https://docs.dagster.io/_apidocs/libraries/dagster-dbt) library to integrate dbt commands into your pipelines. This library supports the execution of dbt through dbt Cloud, dbt Core, and the dbt RPC server. Running dbt from Dagster automatically aggregates metadata about your dbt runs. Refer to the [example pipeline](https://dagster.io/blog/dagster-dbt) for details.
- -## Kestra - -If your organization uses [Kestra](http://kestra.io/), you can leverage the [dbt plugin](https://kestra.io/plugins/plugin-dbt) to orchestrate dbt Cloud and dbt Core jobs. Kestra's user interface (UI) has built-in [Blueprints](https://kestra.io/docs/user-interface-guide/blueprints), providing ready-to-use workflows. Navigate to the Blueprints page in the left navigation menu and [select the dbt tag](https://demo.kestra.io/ui/blueprints/community?selectedTag=36) to find several examples of scheduling dbt Core commands and dbt Cloud jobs as part of your data pipelines. After each scheduled or ad-hoc workflow execution, the Outputs tab in the Kestra UI allows you to download and preview all dbt build artifacts. The Gantt and Topology view additionally render the metadata to visualize dependencies and runtimes of your dbt models and tests. The dbt Cloud task provides convenient links to easily navigate between Kestra and dbt Cloud UI. - -## Automation servers - -Automation servers, like CodeDeploy, GitLab CI/CD ([video](https://youtu.be/-XBIIY2pFpc?t=1301)), Bamboo and Jenkins, can be used to schedule bash commands for dbt. They also provide a UI to view logging to the command line, and integrate with your git repository. - -## Cron - -Cron is a decent way to schedule bash commands. However, while it may seem like an easy route to schedule a job, writing code to take care of all of the additional features associated with a production deployment often makes this route more complex compared to other options listed here. - -## Databricks workflows - -Use Databricks workflows to call the dbt Cloud job API, which has several benefits such as integration with other ETL processes, utilizing dbt Cloud job features, separation of concerns, and custom job triggering based on custom conditions or logic. These advantages lead to more modularity, efficient debugging, and flexibility in scheduling dbt Cloud jobs. - -For more info, refer to the guide on [Databricks workflows and dbt Cloud jobs](/guides/how-to-use-databricks-workflows-to-run-dbt-cloud-jobs). - - - ## Related docs - [dbt Cloud plans and pricing](https://www.getdbt.com/pricing/) diff --git a/website/docs/docs/deploy/job-commands.md b/website/docs/docs/deploy/job-commands.md index 26fe1931db6..8117178b2d6 100644 --- a/website/docs/docs/deploy/job-commands.md +++ b/website/docs/docs/deploy/job-commands.md @@ -35,7 +35,7 @@ Every job invocation automatically includes the [`dbt deps`](/reference/commands For every job, you have the option to select the [Generate docs on run](/docs/collaborate/build-and-view-your-docs) or [Run source freshness](/docs/deploy/source-freshness) checkboxes, enabling you to run the commands automatically. -**Job outcome Generate docs on run checkbox** — dbt Cloud executes the `dbt docs generate` command, _after_ the listed commands. If that particular run step in your job fails, the job can still succeed if all subsequent run steps are successful. Read [Build and view your docs](/docs/collaborate/build-and-view-your-docs) for more info. +**Job outcome Generate docs on run checkbox** — dbt Cloud executes the `dbt docs generate` command, _after_ the listed commands. If that particular run step in your job fails, the job can still succeed if all subsequent run steps are successful. Read [Set up documentation job](/docs/collaborate/build-and-view-your-docs) for more info. **Job outcome Source freshness checkbox** — dbt Cloud executes the `dbt source freshness` command as the first run step in your job. 
If that particular run step in your job fails, the job can still succeed if all subsequent run steps are successful. Read [Source freshness](/docs/deploy/source-freshness) for more info. diff --git a/website/docs/docs/deploy/source-freshness.md b/website/docs/docs/deploy/source-freshness.md index ab267b6d067..a409c01f82c 100644 --- a/website/docs/docs/deploy/source-freshness.md +++ b/website/docs/docs/deploy/source-freshness.md @@ -12,7 +12,7 @@ dbt Cloud provides a helpful interface around dbt's [source data freshness](/doc [`dbt build`](reference/commands/build) does _not_ include source freshness checks when building and testing resources in your DAG. Instead, you can use one of these common patterns for defining jobs: - Add `dbt build` to the run step to run models, tests, and so on. -- Select the **Generate docs on run** checkbox to automatically [generate project docs](/docs/collaborate/build-and-view-your-docs#set-up-a-documentation-job). +- Select the **Generate docs on run** checkbox to automatically [generate project docs](/docs/collaborate/build-and-view-your-docs). - Select the **Run source freshness** checkbox to enable [source freshness](#checkbox) as the first step of the job. @@ -42,4 +42,4 @@ It's important that your freshness jobs run frequently enough to snapshot data l ## Further reading - Refer to [Artifacts](/docs/deploy/artifacts) for more info on how to create dbt Cloud artifacts, share links to the latest documentation, and share source freshness reports with your team. -- Source freshness for Snowflake is calculated using the `LAST_ALTERED` column. Read about the limitations in [Snowflake configs](/reference/resource-configs/snowflake-configs#source-freshness-known-limitation). \ No newline at end of file +- Source freshness for Snowflake is calculated using the `LAST_ALTERED` column. Read about the limitations in [Snowflake configs](/reference/resource-configs/snowflake-configs#source-freshness-known-limitation). diff --git a/website/docs/docs/introduction.md b/website/docs/docs/introduction.md index 980915a2c42..5301dae396d 100644 --- a/website/docs/docs/introduction.md +++ b/website/docs/docs/introduction.md @@ -61,7 +61,7 @@ As a dbt user, your main focus will be on writing models (select queries) that r | Handle boilerplate code to materialize queries as relations | For each model you create, you can easily configure a *materialization*. A materialization represents a build strategy for your select query – the code behind a materialization is robust, boilerplate SQL that wraps your select query in a statement to create a new, or update an existing, relation. Read more about [Materializations](/docs/build/materializations).| | Use a code compiler | SQL files can contain Jinja, a lightweight templating language. Using Jinja in SQL provides a way to use control structures in your queries. For example, `if` statements and `for` loops. It also enables repeated SQL to be shared through `macros`. Read more about [Macros](/docs/build/jinja-macros).| | Determine the order of model execution | Often, when transforming data, it makes sense to do so in a staged approach. dbt provides a mechanism to implement transformations in stages through the [ref function](/reference/dbt-jinja-functions/ref). Rather than selecting from existing tables and views in your warehouse, you can select from another model.| -| Document your dbt project | In dbt Cloud, you can auto-generate the documentation when your dbt project runs. 
dbt provides a mechanism to write, version-control, and share documentation for your dbt models. You can write descriptions (in plain text or markdown) for each model and field. Read more about the [Documentation](/docs/collaborate/documentation).|
+| Document your dbt project | In dbt Cloud, you can auto-generate the documentation when your dbt project runs. dbt provides a mechanism to write, version-control, and share documentation for your dbt models. You can write descriptions (in plain text or markdown) for each model and field. Read more about [Documentation](/docs/build/documentation).|
| Test your models | Tests provide a way to improve the integrity of the SQL in each model by making assertions about the results generated by a model. Build, test, and run your project with a button click or by using the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) command bar. Read more about writing tests for your models [Testing](/docs/build/data-tests)|
| Manage packages | dbt ships with a package manager, which allows analysts to use and publish both public and private repositories of dbt code which can then be referenced by others. Read more about [Package Management](/docs/build/packages). |
| Load seed files| Often in analytics, raw values need to be mapped to a more readable value (for example, converting a country-code to a country name) or enriched with static or infrequently changing data. These data sources, known as seed files, can be saved as a CSV file in your `project` and loaded into your data warehouse using the `seed` command. Read more about [Seeds](/docs/build/seeds).|
diff --git a/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md b/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md
index f1e631f0d78..9e254de92d8 100644
--- a/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md
+++ b/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md
@@ -8,7 +8,7 @@ You can run your dbt projects with [dbt Cloud](/docs/cloud/about-cloud/dbt-cloud
- **dbt Cloud**: A hosted application where you can develop directly from a web browser using the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud). It also natively supports developing using a command line interface, [dbt Cloud CLI](/docs/cloud/cloud-cli-installation). Among other features, dbt Cloud provides:
  - Development environment to help you build, test, run, and [version control](/docs/collaborate/git-version-control) your project faster.
-  - Share your [dbt project's documentation](/docs/collaborate/build-and-view-your-docs) with your team.
+  - Share your [dbt project's documentation](/docs/build/documentation) with your team.
  - Integrates with the dbt Cloud IDE, allowing you to run development tasks and environment in the dbt Cloud UI for a seamless experience.
  - The dbt Cloud CLI to develop and run dbt commands against your dbt Cloud development environment from your local command line.
  - For more details, refer to [Develop dbt](/docs/cloud/about-develop-dbt).
diff --git a/website/docs/docs/use-dbt-semantic-layer/sl-cache.md b/website/docs/docs/use-dbt-semantic-layer/sl-cache.md
index e88c753ca82..4faa297f4ee 100644
--- a/website/docs/docs/use-dbt-semantic-layer/sl-cache.md
+++ b/website/docs/docs/use-dbt-semantic-layer/sl-cache.md
@@ -18,7 +18,7 @@ While you can use caching to speed up your queries and reduce compute time, know
## Prerequisites
- dbt Cloud [Team or Enterprise](https://www.getdbt.com/) plan.
-- dbt Cloud environments on dbt version 1.8 or higher. Or select [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version).
+- dbt Cloud environments that have gone versionless by opting to [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version).
- A successful job run and [production environment](/docs/deploy/deploy-environments#set-as-production-environment).
- For declarative caching, you need to have [exports](/docs/use-dbt-semantic-layer/exports) defined in your [saved queries](/docs/build/saved-queries) YAML configuration file.
diff --git a/website/docs/faqs/Docs/_category_.yaml b/website/docs/faqs/Docs/_category_.yaml
index 8c7925dcc15..0a9aa44fe56 100644
--- a/website/docs/faqs/Docs/_category_.yaml
+++ b/website/docs/faqs/Docs/_category_.yaml
@@ -1,10 +1,10 @@
# position: 2.5 # float position is supported
-label: 'dbt Docs'
+label: 'Documentation'
collapsible: true # make the category collapsible
collapsed: true # keep the category collapsed by default
className: red
link:
type: generated-index
- title: dbt Docs FAQs
+ title: Documentation FAQs
customProps:
- description: Frequently asked questions about dbt Docs
+ description: Frequently asked questions about documentation
diff --git a/website/docs/faqs/Docs/long-descriptions.md b/website/docs/faqs/Docs/long-descriptions.md
index cdf15a94120..ef410df0517 100644
--- a/website/docs/faqs/Docs/long-descriptions.md
+++ b/website/docs/faqs/Docs/long-descriptions.md
@@ -31,4 +31,4 @@ If you need more than a sentence to explain a model, you can:
* tempor incididunt ut labore et dolore magna aliqua.
```
-3. Use a [docs block](/docs/collaborate/documentation#using-docs-blocks) to write the description in a separate Markdown file.
+3. Use a [docs block](/docs/build/documentation#using-docs-blocks) to write the description in a separate Markdown file.
diff --git a/website/docs/faqs/Docs/sharing-documentation.md b/website/docs/faqs/Docs/sharing-documentation.md
index 4c6e0e84f77..cff618586ea 100644
--- a/website/docs/faqs/Docs/sharing-documentation.md
+++ b/website/docs/faqs/Docs/sharing-documentation.md
@@ -1,8 +1,12 @@
---
-title: How do I share my documentation with my team members?
+title: How do I access documentation in dbt Explorer?
description: "Use read-only seats to share documentation"
-sidebar_label: 'Share documentation with teammates'
+sidebar_label: 'Access documentation in dbt Explorer'
id: sharing-documentation
---
-If you're using dbt Cloud to deploy your project, and have the [Team plan](https://www.getdbt.com/pricing/), you can have up to 5 read-only users, who will be able access the documentation for your project.
+If you're using dbt Cloud to deploy your project and have the [Team or Enterprise plan](https://www.getdbt.com/pricing/), you can use dbt Explorer to view your project's [resources](/docs/build/projects) (such as models, tests, and metrics) and their lineage to gain a better understanding of its latest production state.
+
+Access dbt Explorer in dbt Cloud by clicking the **Explore** link in the navigation. You can have up to 5 read-only users access the documentation for your project.
+
+dbt Cloud developer plan and dbt Core users can use [dbt Docs](/docs/collaborate/build-and-view-your-docs#dbt-docs), which generates basic documentation but doesn't offer the same speed, metadata, or visibility as dbt Explorer.
diff --git a/website/docs/faqs/Git/github-permissions.md b/website/docs/faqs/Git/github-permissions.md
new file mode 100644
index 00000000000..e3a1740bbab
--- /dev/null
+++ b/website/docs/faqs/Git/github-permissions.md
@@ -0,0 +1,34 @@
+---
+title: "I'm seeing a 'GitHub and dbt Cloud latest permissions' error"
+description: "GitHub and dbt Cloud permissions error"
+sidebar_label: "GitHub and dbt Cloud permissions error"
+---
+
+If you see the error `This account needs to accept the latest permissions for the dbt Cloud GitHub App` in dbt Cloud, it usually means the permissions for the dbt Cloud GitHub App are out of date.
+
+To solve this issue, you'll need to update the permissions for the dbt Cloud GitHub App in your GitHub account. Here are a couple of ways you can do it:
+
+#### Update permissions
+
+A GitHub organization admin will need to update the permissions in GitHub for the dbt Cloud GitHub App. If you're not the admin, reach out to your organization admin to request this. Alternatively, try [disconnecting your GitHub account](#disconnect-github) in dbt Cloud.
+
+1. Go directly to GitHub to determine if any updated permissions are required.
+2. In GitHub, go to your organization **Settings** (or personal if using a non-organization account).
+3. Then navigate to **Applications** to identify any necessary permission changes.
+For more info on GitHub permissions, refer to [access permissions](https://docs.github.com/en/get-started/learning-about-github/access-permissions-on-github).
+
+#### Disconnect GitHub
+
+Disconnect the GitHub and dbt Cloud integration in dbt Cloud.
+
+1. In dbt Cloud, go to **Account Settings**.
+2. In **Projects**, select the project that's experiencing the issue.
+3. Click the repository link under **Repository**.
+4. In the **Repository details** page, click **Edit**.
+5. Click **Disconnect** to remove the GitHub integration.
+6. Go back to your **Project details** page and reconnect your repository by clicking the **Configure Repository** link.
+7. Configure your repository and click **Save**.
+
+
+If you've tried these workarounds and are still experiencing this behavior, reach out to the [dbt Support](mailto:support@getdbt.com) team and we'll be happy to help!
diff --git a/website/docs/faqs/Git/gitlab-authentication.md b/website/docs/faqs/Git/gitlab-authentication.md
index 0debdf87873..1d6de32fb6f 100644
--- a/website/docs/faqs/Git/gitlab-authentication.md
+++ b/website/docs/faqs/Git/gitlab-authentication.md
@@ -9,7 +9,7 @@ If you're seeing a 'GitLab Authentication is out of date' 500 server error page
No worries - this is a current issue the dbt Labs team is working on and we have a few workarounds for you to try:
-### 1st Workaround
+#### First workaround
1. Disconnect repo from project in dbt Cloud.
2. Go to Gitlab and click on Settings > Repository.
@@ -18,7 +18,7 @@ No worries - this is a current issue the dbt Labs team is working on and we have
5. You would then need to check Gitlab to make sure that the new deploy key is added.
6. Once confirmed that it's added, refresh dbt Cloud and try developing once again.
-### 2nd Workaround
+#### Second workaround
1. Keep repo in project as is -- don't disconnect.
2. Copy the deploy key generated in dbt Cloud.
diff --git a/website/docs/faqs/Git/run-on-pull.md b/website/docs/faqs/Git/run-on-pull.md
index 3536259bb79..d1b6bfd7524 100644
--- a/website/docs/faqs/Git/run-on-pull.md
+++ b/website/docs/faqs/Git/run-on-pull.md
@@ -12,4 +12,3 @@ If it was added via a deploy key method, you'll want to use the [GitHub auth me
To go ahead and enable 'Run on Pull requests', you'll want to remove dbt Cloud from the Apps & Integration on GitHub and re-integrate it again via the GitHub app method.
If you've tried the workaround above and are still experiencing this behavior - reach out to the Support team at support@getdbt.com and we'll be happy to help!
-
diff --git a/website/docs/guides/building-packages.md b/website/docs/guides/building-packages.md
index cc1ee2f1d74..69f963049ad 100644
--- a/website/docs/guides/building-packages.md
+++ b/website/docs/guides/building-packages.md
@@ -108,7 +108,7 @@ The major exception to this is when working with data sources that benefit from
### Test and document your package
It's critical that you [test](/docs/build/data-tests) your models and sources. This will give your end users confidence that your package is actually working on top of their dataset as intended.
-Further, adding [documentation](/docs/collaborate/documentation) via descriptions will help communicate your package to end users, and benefit their stakeholders that use the outputs of this package.
+Further, adding [documentation](/docs/build/documentation) via descriptions will help communicate your package to end users, and benefit their stakeholders who use the outputs of this package.
### Include useful GitHub artifacts
Over time, we've developed a set of useful GitHub artifacts that make administering our packages easier for us. In particular, we ensure that we include:
- A useful README, that has:
@@ -172,4 +172,4 @@ The release notes should contain an overview of the changes introduced in the ne
Our package registry, [hub.getdbt.com](https://hub.getdbt.com/), gets updated by the [hubcap script](https://github.com/dbt-labs/hubcap). To add your package to hub.getdbt.com, create a PR on the [hubcap repository](https://github.com/dbt-labs/hubcap) to include it in the `hub.json` file.
-
\ No newline at end of file + diff --git a/website/docs/guides/core-cloud-2.md b/website/docs/guides/core-cloud-2.md index a4683ddb6f8..335b164d988 100644 --- a/website/docs/guides/core-cloud-2.md +++ b/website/docs/guides/core-cloud-2.md @@ -141,9 +141,9 @@ After [setting the foundations of dbt Cloud](https://docs.getdbt.com/guides/core Once you’ve confirmed that dbt Cloud orchestration and CI/CD are working as expected, you should pause your current orchestration tool and stop or update your current CI/CD process. This is not relevant if you’re still using an external orchestrator (such as Airflow), and you’ve swapped out `dbt-core` execution for dbt Cloud execution (through the [API](/docs/dbt-cloud-apis/overview)). Familiarize your team with dbt Cloud's [features](/docs/cloud/about-cloud/dbt-cloud-features) and optimize development and deployment processes. Some key features to consider include: -- **Version management:** Manage [dbt versions](/docs/dbt-versions/upgrade-dbt-version-in-cloud) and ensure team collaboration with dbt Cloud's one-click feature, removing the hassle of manual updates and version discrepancies. You can **[Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version)** to always get the latest fixes and early access to new functionality for your dbt project. +- **Version management:** Manage [dbt versions](/docs/dbt-versions/upgrade-dbt-version-in-cloud) and ensure team collaboration with dbt Cloud's one-click feature, removing the hassle of manual updates and version discrepancies. You can go versionless by opting to **[Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version)** to always get the latest features and early access to new functionality for your dbt project. - **Development tools**: Use the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) or [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) to build, test, run, and version control your dbt projects. -- **Documentation and Source freshness:** Automate storage of [documentation](/docs/collaborate/documentation) and track [source freshness](/docs/deploy/source-freshness) in dbt Cloud, which streamlines project maintenance. +- **Documentation and Source freshness:** Automate storage of [documentation](/docs/build/documentation) and track [source freshness](/docs/deploy/source-freshness) in dbt Cloud, which streamlines project maintenance. - **Notifications and logs:** Receive immediate [notifications](/docs/deploy/monitor-jobs) for job failures, with direct links to the job details. Access comprehensive logs for all job runs to help with troubleshooting. - **CI/CD:** Use dbt Cloud's [CI/CD](/docs/deploy/ci-jobs) feature to run your dbt projects in a temporary schema whenever new commits are pushed to open pull requests. This helps with catching bugs before deploying to production. diff --git a/website/docs/guides/core-to-cloud-1.md b/website/docs/guides/core-to-cloud-1.md index 0a7dbf4dac8..6e130d3a29f 100644 --- a/website/docs/guides/core-to-cloud-1.md +++ b/website/docs/guides/core-to-cloud-1.md @@ -206,7 +206,7 @@ To use the [dbt Cloud's job scheduler](/docs/deploy/job-scheduler), set up one e ### Initial setup steps 1. **dbt Core version** — In your environment settings, configure dbt Cloud with the same dbt Core version. 
- - Once your full migration is complete, we recommend upgrading your environments to [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to always get the latest features and more. You only need to do this once. + - Once your full migration is complete, we recommend upgrading your environments to a versionless experience by opting to [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to always get the latest features and more. You only need to do this once. 2. **Configure your jobs** — [Create jobs](/docs/deploy/deploy-jobs#create-and-schedule-jobs) for scheduled or event-driven dbt jobs. You can use cron execution, manual, pull requests, or trigger on the completion of another job. - Note that alongside [jobs in dbt Cloud](/docs/deploy/jobs), discover other ways to schedule and run your dbt jobs with the help of other tools. Refer to [Integrate with other tools](/docs/deploy/deployment-tools) for more information. diff --git a/website/docs/guides/core-to-cloud-3.md b/website/docs/guides/core-to-cloud-3.md index 8e77ae8ab15..0b63756a41a 100644 --- a/website/docs/guides/core-to-cloud-3.md +++ b/website/docs/guides/core-to-cloud-3.md @@ -36,7 +36,7 @@ You may have already started your move to dbt Cloud and are looking for tips to In dbt Cloud, you can natively connect to your data platform and test its [connection](/docs/connect-adapters) with a click of a button. This is especially useful for users who are new to dbt Cloud or are looking to streamline their connection setup. Here are some tips and caveats to consider: ### Tips -- Manage [dbt versions](/docs/dbt-versions/upgrade-dbt-version-in-cloud) and ensure team collaboration with dbt Cloud's one-click feature, eliminating the need for manual updates and version discrepancies. You can **[Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version)** to always get the latest fixes and early access to new functionality for your dbt project. +- Manage [dbt versions](/docs/dbt-versions/upgrade-dbt-version-in-cloud) and ensure team collaboration with dbt Cloud's one-click feature, eliminating the need for manual updates and version discrepancies. You can go versionless and **[Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version)** to always get the latest features and early access to new functionality for your dbt project. - dbt Cloud supports a whole host of [cloud providers](/docs/cloud/connect-data-platform/about-connections), including Snowflake, Databricks, BigQuery, Fabric, and Redshift (to name a few). - Use [Extended Attributes](/docs/deploy/deploy-environments#extended-attributes) to set a flexible [profiles.yml](/docs/core/connect-data-platform/profiles.yml) snippet in your dbt Cloud environment settings. It gives you more control over environments (both deployment and development) and extends how dbt Cloud connects to the data platform within a given environment. - For example, if you have a field in your `profiles.yml` that you’d like to add to the dbt Cloud adapter user interface, you can use Extended Attributes to set it. 
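+
+  A minimal sketch of such a snippet, assuming a Snowflake connection (`client_session_keep_alive` and `query_tag` are standard `dbt-snowflake` profile fields; the values shown are placeholders):
+
+  ```yml
+  # Set profiles.yml fields that the dbt Cloud connection UI doesn't expose.
+  client_session_keep_alive: true
+  query_tag: dbt_cloud_prod
+  ```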
diff --git a/website/docs/guides/dbt-python-snowpark.md b/website/docs/guides/dbt-python-snowpark.md
index f6d54ee738f..8125f98d231 100644
--- a/website/docs/guides/dbt-python-snowpark.md
+++ b/website/docs/guides/dbt-python-snowpark.md
@@ -1858,7 +1858,7 @@ We are going to revisit 2 areas of our project to understand our documentation:
- `intermediate.md` file
- `dbt_project.yml` file
-To start, let’s look back at our `intermediate.md` file. We can see that we provided multi-line descriptions for the models in our intermediate models using [docs blocks](/docs/collaborate/documentation#using-docs-blocks). Then we reference these docs blocks in our `.yml` file. Building descriptions with doc blocks in Markdown files gives you the ability to format your descriptions with Markdown and are particularly helpful when building long descriptions, either at the column or model level. In our `dbt_project.yml`, we added `node_colors` at folder levels.
+To start, let’s look back at our `intermediate.md` file. We can see that we provided multi-line descriptions for the models in our intermediate models using [docs blocks](/docs/build/documentation#using-docs-blocks). Then we reference these docs blocks in our `.yml` file. Building descriptions with doc blocks in Markdown files gives you the ability to format your descriptions with Markdown and is particularly helpful when building long descriptions, either at the column or model level. In our `dbt_project.yml`, we added `node_colors` at folder levels.
1. To see all these pieces come together, execute this in the command bar:
@@ -1926,4 +1926,4 @@ Fantastic! You’ve finished the workshop! We hope you feel empowered in using b
For more help and information join our [dbt community Slack](https://www.getdbt.com/community/) which contains more than 50,000 data practitioners today. We have a dedicated slack channel #db-snowflake to Snowflake related content. Happy dbt'ing!
- \ No newline at end of file
+
diff --git a/website/docs/guides/mesh-qs.md b/website/docs/guides/mesh-qs.md
index be6f2ca205e..d43e2516d23 100644
--- a/website/docs/guides/mesh-qs.md
+++ b/website/docs/guides/mesh-qs.md
@@ -40,7 +40,7 @@ To leverage dbt Mesh, you need the following:
- You must have a [dbt Cloud Enterprise account](https://www.getdbt.com/get-started/enterprise-contact-pricing)
- You have access to a cloud data platform, permissions to load the sample data tables, and dbt Cloud permissions to create new projects.
-- Set your development and deployment [environments](/docs/dbt-cloud-environments) to use dbt [version](/docs/dbt-versions/core) 1.6 or later. You can also opt [Keep on latest version of](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to always use the latest version of dbt.
+- Set your development and deployment [environments](/docs/dbt-cloud-environments) to use dbt [version](/docs/dbt-versions/core) 1.6 or later. You can also opt to go versionless and select [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to always get the most recent features and functionality.
- This guide uses the Jaffle Shop sample data, including `customers`, `orders`, and `payments` tables.
Follow the provided instructions to load this data into your respective data platform:
- [Snowflake](https://docs.getdbt.com/guides/snowflake?step=3)
- [Databricks](https://docs.getdbt.com/guides/databricks?step=3)
diff --git a/website/docs/guides/productionize-your-dbt-databricks-project.md b/website/docs/guides/productionize-your-dbt-databricks-project.md
index 33f25070bdb..bada787e01f 100644
--- a/website/docs/guides/productionize-your-dbt-databricks-project.md
+++ b/website/docs/guides/productionize-your-dbt-databricks-project.md
@@ -197,4 +197,4 @@ To get the most out of both tools, you can use the [persist docs config](/refere
- [Databricks + dbt Cloud Quickstart Guide](/guides/databricks)
- Reach out to your Databricks account team to get access to preview features on Databricks.
- \ No newline at end of file
+
diff --git a/website/docs/guides/sl-snowflake-qs.md b/website/docs/guides/sl-snowflake-qs.md
index 4310710383c..9fb42fe1828 100644
--- a/website/docs/guides/sl-snowflake-qs.md
+++ b/website/docs/guides/sl-snowflake-qs.md
@@ -114,7 +114,7 @@ Open a new tab and follow these quick steps for account setup and data loading i
-- Production and development environments must be on [dbt version 1.6 or higher](/docs/dbt-versions/upgrade-dbt-version-in-cloud). Alternatively, set your environment to[ Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to always remain on the latest version.
+- Production and development environments must be on [dbt version 1.6 or higher](/docs/dbt-versions/upgrade-dbt-version-in-cloud). Alternatively, set your environment to "versionless" by selecting [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) to always get the latest updates.
- Create a [trial Snowflake account](https://signup.snowflake.com/):
  - Select the Enterprise Snowflake edition with ACCOUNTADMIN access. Consider organizational questions when choosing a cloud provider, refer to Snowflake's [Introduction to Cloud Platforms](https://docs.snowflake.com/en/user-guide/intro-cloud-platforms).
  - Select a cloud provider and region. All cloud providers and regions will work so choose whichever you prefer.
@@ -698,10 +698,10 @@ semantic_models:
        type: foreign
  # Newly added
  dimensions:
-  - name: order_date
-    type: time
-    type_params:
-      time_granularity: day
+      - name: order_date
+        type: time
+        type_params:
+          time_granularity: day
```
diff --git a/website/docs/reference/artifacts/catalog-json.md b/website/docs/reference/artifacts/catalog-json.md
index 44a3f980c60..54f0c93da90 100644
--- a/website/docs/reference/artifacts/catalog-json.md
+++ b/website/docs/reference/artifacts/catalog-json.md
@@ -7,7 +7,7 @@ sidebar_label: "Catalog"
**Produced by:** [`docs generate`](/reference/commands/cmd-docs)
-This file contains information from your about the tables and views produced and defined by the resources in your project. Today, dbt uses this file to populate metadata, such as column types and statistics, in the [docs site](/docs/collaborate/documentation).
+This file contains information from your data warehouse about the tables and views produced and defined by the resources in your project. Today, dbt uses this file to populate metadata, such as column types and statistics, in the [docs site](/docs/collaborate/build-and-view-your-docs).
### Top-level keys
diff --git a/website/docs/reference/artifacts/dbt-artifacts.md b/website/docs/reference/artifacts/dbt-artifacts.md
index 5e801d31b16..8d3e1ae29e8 100644
--- a/website/docs/reference/artifacts/dbt-artifacts.md
+++ b/website/docs/reference/artifacts/dbt-artifacts.md
@@ -5,7 +5,7 @@ sidebar_label: "About dbt artifacts"
With every invocation, dbt generates and saves one or more *artifacts*. Several of these are files (`semantic_manifest.json`, `manifest.json`, `catalog.json`, `run_results.json`, and `sources.json`) that are used to power:
-- [documentation](/docs/collaborate/documentation)
+- [documentation](/docs/collaborate/build-and-view-your-docs)
- [state](/reference/node-selection/syntax#about-node-selection)
- [visualizing source freshness](/docs/build/sources#snapshotting-source-data-freshness)
diff --git a/website/docs/reference/artifacts/manifest-json.md b/website/docs/reference/artifacts/manifest-json.md
index 5a487f2f177..296b5250d5d 100644
--- a/website/docs/reference/artifacts/manifest-json.md
+++ b/website/docs/reference/artifacts/manifest-json.md
@@ -11,7 +11,7 @@ import ManifestVersions from '/snippets/_manifest-versions.md';
This single file contains a full representation of your dbt project's resources (models, tests, macros, etc), including all node configurations and resource properties. Even if you're only running some models or tests, all resources will appear in the manifest (unless they are disabled) with most of their properties. (A few node properties, such as `compiled_sql`, only appear for executed nodes.)
-Today, dbt uses this file to populate the [docs site](/docs/collaborate/documentation), and to perform [state comparison](/reference/node-selection/syntax#about-node-selection). Members of the community have used this file to run checks on how many models have descriptions and tests.
+Today, dbt uses this file to populate the [docs site](/docs/collaborate/build-and-view-your-docs), and to perform [state comparison](/reference/node-selection/syntax#about-node-selection). Members of the community have used this file to run checks on how many models have descriptions and tests.
### Top-level keys diff --git a/website/docs/reference/artifacts/other-artifacts.md b/website/docs/reference/artifacts/other-artifacts.md index c4e595782fc..75a4653d685 100644 --- a/website/docs/reference/artifacts/other-artifacts.md +++ b/website/docs/reference/artifacts/other-artifacts.md @@ -7,7 +7,7 @@ sidebar_label: "Other artifacts" **Produced by:** [`docs generate`](/reference/commands/cmd-docs) -This file is the skeleton of the [auto-generated dbt documentation website](/docs/collaborate/documentation). The contents of the site are populated by the [manifest](/reference/artifacts/manifest-json) and [catalog](catalog-json). +This file is the skeleton of the [auto-generated dbt documentation website](/docs/collaborate/build-and-view-your-docs). The contents of the site are populated by the [manifest](/reference/artifacts/manifest-json) and [catalog](catalog-json). Note: the source code for `index.json` comes from the [dbt-docs repo](https://github.com/dbt-labs/dbt-docs). Head over there if you want to make a bug report, suggestion, or contribution relating to the documentation site. diff --git a/website/docs/reference/commands/cmd-docs.md b/website/docs/reference/commands/cmd-docs.md index 176bd4106cd..60b3049ccf2 100644 --- a/website/docs/reference/commands/cmd-docs.md +++ b/website/docs/reference/commands/cmd-docs.md @@ -42,7 +42,7 @@ dbt docs generate --no-compile Use the `--empty-catalog` argument to skip running the database queries to populate `catalog.json`. When this flag is provided, `dbt docs generate` will skip step (3) described above. -This is not recommended for production environments, as it means that your documentation will be missing information gleaned from database metadata (the full set of columns in each table, and statistics about those tables). It can speed up `docs generate` in development, when you just want to visualize lineage and other information defined within your project. To learn how to build your documentation in dbt Cloud, refer to [build your docs in dbt Cloud](/docs/collaborate/build-and-view-your-docs#generating-documentation). +This is not recommended for production environments, as it means that your documentation will be missing information gleaned from database metadata (the full set of columns in each table, and statistics about those tables). It can speed up `docs generate` in development, when you just want to visualize lineage and other information defined within your project. To learn how to build your documentation in dbt Cloud, refer to [build your docs in dbt Cloud](/docs/collaborate/build-and-view-your-docs). **Example**: ``` diff --git a/website/docs/reference/dbt-jinja-functions/doc.md b/website/docs/reference/dbt-jinja-functions/doc.md index 51ca6ad2059..ee0b75b2e19 100644 --- a/website/docs/reference/dbt-jinja-functions/doc.md +++ b/website/docs/reference/dbt-jinja-functions/doc.md @@ -5,7 +5,7 @@ id: "doc" description: "Use the `doc` to reference docs blocks in description fields." --- -The `doc` function is used to reference docs blocks in the description field of schema.yml files. It is analogous to the `ref` function. For more information, consult the [Documentation guide](/docs/collaborate/documentation). +The `doc` function is used to reference docs blocks in the description field of schema.yml files. It is analogous to the `ref` function. For more information, consult the [Documentation guide](/docs/collaborate/build-and-view-your-docs). 
Usage: diff --git a/website/docs/reference/global-configs/legacy-behaviors.md b/website/docs/reference/global-configs/legacy-behaviors.md index cd898d55187..e6b93d1d2f4 100644 --- a/website/docs/reference/global-configs/legacy-behaviors.md +++ b/website/docs/reference/global-configs/legacy-behaviors.md @@ -33,7 +33,7 @@ flags: -When we use dbt Cloud in the following table, we're referring to "[Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version)." +When we use dbt Cloud in the following table, we're referring to accounts that have gone versionless by opting to "[Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version)." | Flag | dbt Cloud: Intro | dbt Cloud: Maturity | dbt Core: Intro | dbt Core: Maturity | |-----------------------------------------------------------------|------------------|---------------------|-----------------|--------------------| diff --git a/website/docs/reference/global-configs/resource-type.md b/website/docs/reference/global-configs/resource-type.md index 2d7672d108d..eb3562b5175 100644 --- a/website/docs/reference/global-configs/resource-type.md +++ b/website/docs/reference/global-configs/resource-type.md @@ -18,7 +18,7 @@ The available resource types are: - [`analysis`](/docs/build/analyses) - [`exposure`](/docs/build/exposures) -- [`metric`](/docs/build/build-metrics-intro) +- [`metric`](/docs/build/metrics-overview) - [`model`](/docs/build/models) - [`seed`](/docs/build/seeds) - [`snapshot`](/docs/build/snapshots) diff --git a/website/docs/reference/project-configs/docs-paths.md b/website/docs/reference/project-configs/docs-paths.md index 910cfbb0cce..51ff5c5ccca 100644 --- a/website/docs/reference/project-configs/docs-paths.md +++ b/website/docs/reference/project-configs/docs-paths.md @@ -13,7 +13,7 @@ docs-paths: [directorypath] ## Definition -Optionally specify a custom list of directories where [docs blocks](/docs/collaborate/documentation#docs-blocks) are located. +Optionally specify a custom list of directories where [docs blocks](/docs/build/documentation#docs-blocks) are located. ## Default diff --git a/website/docs/reference/project-configs/query-comment.md b/website/docs/reference/project-configs/query-comment.md index cf4c8dc49f0..7e654350306 100644 --- a/website/docs/reference/project-configs/query-comment.md +++ b/website/docs/reference/project-configs/query-comment.md @@ -19,7 +19,7 @@ The `query-comment` configuration also accepts a dictionary input, like so: ```yml models: my_dbt_project: - +materliazed: table + +materialized: table query-comment: comment: string diff --git a/website/docs/reference/resource-configs/fabric-configs.md b/website/docs/reference/resource-configs/fabric-configs.md index d73dc9500a4..8ab0a63a644 100644 --- a/website/docs/reference/resource-configs/fabric-configs.md +++ b/website/docs/reference/resource-configs/fabric-configs.md @@ -101,4 +101,5 @@ Not supported at this time. ## dbt-utils -Not supported at this time +Not supported at this time. However, dbt-fabric offers some utils macros. Please check out [utils macros](https://github.com/microsoft/dbt-fabric/tree/main/dbt/include/fabric/macros/utils). 
+ diff --git a/website/docs/reference/resource-properties/description.md b/website/docs/reference/resource-properties/description.md index fee1d50aaf3..ce0c7c42074 100644 --- a/website/docs/reference/resource-properties/description.md +++ b/website/docs/reference/resource-properties/description.md @@ -157,7 +157,7 @@ A user-defined description. Can be used to document: - analyses, and analysis columns - macros, and macro arguments -These descriptions are used in the documentation website rendered by dbt (refer to [the documentation guide](/docs/collaborate/documentation) or [dbt Explorer](/docs/collaborate/explore-projects)). +These descriptions are used in the documentation website rendered by dbt (refer to [the documentation guide](/docs/build/documentation) or [dbt Explorer](/docs/collaborate/explore-projects)). Descriptions can include markdown, as well as the [`doc` jinja function](/reference/dbt-jinja-functions/doc). diff --git a/website/docs/reference/resource-properties/unit-tests.md b/website/docs/reference/resource-properties/unit-tests.md index 227edf22dbb..3c6e6eef0a6 100644 --- a/website/docs/reference/resource-properties/unit-tests.md +++ b/website/docs/reference/resource-properties/unit-tests.md @@ -7,7 +7,7 @@ datatype: test :::note -This functionality is only supported in dbt Core v1.8+ or dbt Cloud accounts that have opted to ["Keep on latest version"](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version). +This functionality is only supported in dbt Core v1.8+ or dbt Cloud accounts that have gone versionless by opting to ["Keep on latest version"](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version). ::: diff --git a/website/docs/terms/dag.md b/website/docs/terms/dag.md index 0216332d953..93e2956ebb3 100644 --- a/website/docs/terms/dag.md +++ b/website/docs/terms/dag.md @@ -79,7 +79,7 @@ Instead of manually auditing your DAG for best practices, the [dbt project evalu ## dbt and DAGs -The marketing team at dbt Labs would be upset with us if we told you we think dbt actually stood for “dag build tool,” but one of the key elements of dbt is its ability to generate documentation and infer relationships between models. And one of the hallmark features of [dbt Docs](https://docs.getdbt.com/docs/collaborate/documentation) is the Lineage Graph (DAG) of your dbt project. +The marketing team at dbt Labs would be upset with us if we told you we think dbt actually stood for “dag build tool,” but one of the key elements of dbt is its ability to generate documentation and infer relationships between models. And one of the hallmark features of [dbt Docs](https://docs.getdbt.com/docs/build/documentation) is the Lineage Graph (DAG) of your dbt project. Whether you’re using dbt Core or Cloud, dbt docs and the Lineage Graph are available to all dbt developers. The Lineage Graph in dbt Docs can show a model or source’s entire lineage, all within a visual frame. Clicking within a model, you can view the Lineage Graph and adjust selectors to only show certain models within the DAG. Analyzing the DAG here is a great way to diagnose potential inefficiencies or lack of modularity in your dbt project. diff --git a/website/docs/terms/dry.md b/website/docs/terms/dry.md index ec1c9229567..04b83642a08 100644 --- a/website/docs/terms/dry.md +++ b/website/docs/terms/dry.md @@ -26,7 +26,7 @@ WET, which stands for “Write Everything Twice,” is the opposite of DRY. It's Well, how would you know if your code isn't DRY enough? 
That’s kind of subjective and will vary by the norms set within your organization. That said, a good rule of thumb is [the Rule of Three](https://en.wikipedia.org/wiki/Rule_of_three_(writing)#:~:text=The%20rule%20of%20three%20is,or%20effective%20than%20other%20numbers.). This rule states that the _third_ time you encounter a certain pattern, you should probably abstract it into some reusable unit. -There is, of course, a tradeoff between simplicity and conciseness in code. The more abstractions you create, the harder it can be for others to understand and maintain your code without proper documentation. So, the moral of the story is: DRY code is great as long as you [write great documentation.](https://docs.getdbt.com/docs/collaborate/documentation) +There is, of course, a tradeoff between simplicity and conciseness in code. The more abstractions you create, the harder it can be for others to understand and maintain your code without proper documentation. So, the moral of the story is: DRY code is great as long as you [write great documentation.](https://docs.getdbt.com/docs/build/documentation) ### Save time & energy diff --git a/website/docs/terms/primary-key.md b/website/docs/terms/primary-key.md index fde3ff44ac7..c8fc327af0d 100644 --- a/website/docs/terms/primary-key.md +++ b/website/docs/terms/primary-key.md @@ -151,7 +151,7 @@ When we talk about testing our primary keys, we really mean testing their unique 2. For databases that don’t offer support and enforcement of primary keys, you’re going to need to regularly test that primary keys aren’t violating their golden rule of uniqueness and non-nullness. To do this, we recommend implementing a tool like dbt that allows you to define version-controlled and code-based tests on your data models. Using these tests, you should create [not null](https://docs.getdbt.com/reference/resource-properties/tests#not_null) and [unique](https://docs.getdbt.com/reference/resource-properties/tests#unique) tests for every primary key field throughout your dbt project. Other methods for primary key testing may look like writing custom tests or ad hoc queries that check for uniqueness and non-nullness. :::tip Tip -You can use dbt’s [documentation](https://docs.getdbt.com/docs/collaborate/documentation) and [testing](https://docs.getdbt.com/reference/resource-properties/tests) capabilities to clearly identify and QA primary keys in your data models. For your primary key column, you should mention that the field is the unique identifier for that table and test for uniqueness and non-nullness. +You can use dbt’s [documentation](https://docs.getdbt.com/docs/build/documentation) and [testing](https://docs.getdbt.com/reference/resource-properties/tests) capabilities to clearly identify and QA primary keys in your data models. For your primary key column, you should mention that the field is the unique identifier for that table and test for uniqueness and non-nullness. 
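+
+As a sketch, that guidance maps to a `schema.yml` entry like the following (model and column names are placeholders):
+
+```yml
+models:
+  - name: customers
+    columns:
+      - name: customer_id
+        description: "The unique identifier for a customer; the primary key of this table."
+        tests:
+          - unique
+          - not_null
+```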
:::

## Conclusion

diff --git a/website/sidebars.js b/website/sidebars.js
index 37d4dae1b7c..a087c24e3f7 100644
--- a/website/sidebars.js
+++ b/website/sidebars.js
@@ -301,6 +301,7 @@ const sidebarSettings = {
        "docs/build/unit-tests",
      ],
    },
+    "docs/build/documentation",
    "docs/build/snapshots",
    "docs/build/seeds",
    "docs/build/jinja-macros",
@@ -468,7 +469,7 @@ const sidebarSettings = {
    "docs/collaborate/collaborate-with-others",
    {
      type: "category",
-      label: "Explore dbt projects",
+      label: "Discover data with dbt Explorer",
      link: { type: "doc", id: "docs/collaborate/explore-projects" },
      items: [
        "docs/collaborate/explore-projects",
@@ -476,6 +477,7 @@
        "docs/collaborate/model-performance",
        "docs/collaborate/project-recommendations",
        "docs/collaborate/explore-multiple-projects",
+        "docs/collaborate/access-from-dbt-cloud",
        "docs/collaborate/dbt-explorer-faqs",
      ],
    },
@@ -493,10 +495,9 @@
    },
    {
      type: "category",
-      label: "Document your dbt projects",
-      link: { type: "doc", id: "docs/collaborate/documentation" },
+      label: "Document your projects",
+      link: { type: "doc", id: "docs/collaborate/build-and-view-your-docs" },
      items: [
-        "docs/collaborate/documentation",
        "docs/collaborate/build-and-view-your-docs",
      ],
    },
diff --git a/website/snippets/_cloud-environments-info.md b/website/snippets/_cloud-environments-info.md
index 7b28d2a6539..5013f9763ff 100644
--- a/website/snippets/_cloud-environments-info.md
+++ b/website/snippets/_cloud-environments-info.md
@@ -35,7 +35,7 @@ Both development and deployment environments have a section called **General Set
- dbt Cloud allows users to select any dbt release. At this time, **environments must use a dbt version greater than or equal to v1.0.0;** [lower versions are no longer supported](/docs/dbt-versions/upgrade-dbt-version-in-cloud).
- If you select a current version with `(latest)` in the name, your environment will automatically install the latest stable version of the minor version selected.
-- In 2024 we are introducing **Keep on latest version**, which removes the need for manually upgrading environments in the future, while ensuring you get access to the latest fixes and features. This feature is currently in beta for select customers, rolling out to wider availability through February and March._
+- Go versionless by opting to **Keep on latest version**, which removes the need to manually upgrade environments while ensuring you are always up to date with the latest fixes and features.

:::

### Custom branch behavior
@@ -47,38 +47,42 @@ By default, all environments will use the default branch in your repository (usu

For more info, check out this [FAQ page on this topic](/faqs/Environments/custom-branch-settings)!

-
### Extended attributes

:::note
-Extended attributes are retrieved and applied only at runtime when `profiles.yml` is requested for a specific Cloud run. Extended attributes are currently _not_ taken into consideration for SSH Tunneling which does not rely on `profiles.yml` values.
+Extended attributes are currently _not_ supported for SSH tunneling.
:::

-Extended Attributes is a feature that allows users to set a flexible [profiles.yml](/docs/core/connect-data-platform/profiles.yml) snippet in their dbt Cloud Environment settings. It provides users with more control over environments (both deployment and development) and extends how dbt Cloud connects to the data platform within a given environment.
-
-Extended Attributes is a text box extension at the environment level that overrides connection or environment credentials, including any custom environment variables. You can set any YAML attributes that a dbt adapter accepts in its `profiles.yml`.
+Extended attributes allow users to set a flexible [profiles.yml](/docs/core/connect-data-platform/profiles.yml) snippet in their dbt Cloud Environment settings. This provides users with more control over environments (both deployment and development) and extends how dbt Cloud connects to the data platform within a given environment.

-Something to note, Extended Attributes don't mask secret values. We recommend avoiding setting secret values to prevent visibility in the text box and logs.
+Extended attributes are set at the environment level and can partially override connection or environment credentials, including any custom environment variables. You can set any YAML attributes that a dbt adapter accepts in its `profiles.yml`.
-If you're developing in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud), [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), or [orchestrating job runs](/docs/deploy/deployments), Extended Attributes parses through the provided YAML and extracts the `profiles.yml` attributes. For each individual attribute:
-
-- If the attribute exists in another source (such as your project settings), it will replace its value (like environment-level values) in the profile. It also overrides any custom environment variables.
-
-- If the attribute doesn't exist, it will add the attribute or value pair to the profile.
-
-Only the **top-level keys** are accepted in extended attributes. This means that if you want to change a specific sub-key value, you must provide the entire top-level key as a JSON block in your resulting YAML. For example, if you want to customize a particular field within a [service account JSON](/docs/core/connect-data-platform/bigquery-setup#service-account-json) for your BigQuery connection (like 'project_id' or 'client_email'), you need to provide an override for the entire top-level `keyfile_json` main key/attribute using extended attributes. Include the sub-fields as a nested JSON block.
-
The following code is an example of the types of attributes you can add in the **Extended Attributes** text box:

```yaml
dbname: jaffle_shop
schema: dbt_alice
threads: 4
+username: alice
+password: '{{ env_var(''DBT_ENV_SECRET_PASSWORD'') }}'
```

-### Git repository caching
+#### Extended attributes don't mask secret values
+
+We recommend avoiding setting secret values to prevent visibility in the text box and logs. A common workaround is to wrap extended attributes in [environment variables](/docs/build/environment-variables). In the earlier example, `password: '{{ env_var(''DBT_ENV_SECRET_PASSWORD'') }}'` will get its value from the `DBT_ENV_SECRET_PASSWORD` environment variable at runtime.
+
+#### How extended attributes work
+
+If you're developing in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) or [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), or [orchestrating job runs](/docs/deploy/deployments), dbt Cloud parses the extended attributes YAML you provide and extracts the `profiles.yml` attributes. For each individual attribute:
+
+- If the attribute exists in another source (such as your project settings), it will replace its value (like environment-level values) in the profile. It also overrides any custom environment variables (unless the attribute itself is set using the `env_var` syntax described for secrets above).
+
+- If the attribute doesn't exist, it will add the attribute or value pair to the profile.
+
+#### Only top-level keys are accepted in extended attributes
+
+If you want to change a specific sub-key value, you must provide the entire top-level key as a JSON block in your resulting YAML. For example, if you want to customize a particular field within a [service account JSON](/docs/core/connect-data-platform/bigquery-setup#service-account-json) for your BigQuery connection (like 'project_id' or 'client_email'), you need to provide an override for the entire top-level `keyfile_json` key using extended attributes, including the sub-fields as a nested JSON block (see the sketch below).
+
+### Git repository caching

At the start of every job run, dbt Cloud clones the project's Git repository so it has the latest versions of your project's code and runs `dbt deps` to install your dependencies.
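Circling back to extended attributes for a moment, the top-level key rule above is easier to see in YAML. The following is a hedged sketch (the one referenced earlier) of overriding a single sub-field of a BigQuery service account JSON; every value is a placeholder, and the point is that the whole `keyfile_json` block must be restated rather than patching one sub-key:

```yaml
# Hypothetical extended attributes override for a BigQuery connection.
# Only top-level keys are read, so the entire keyfile_json block is
# restated even though only project_id is being customized.
keyfile_json:
  type: service_account
  project_id: my-gcp-project        # the sub-field we actually want to change
  client_email: dbt-runner@my-gcp-project.iam.gserviceaccount.com
  private_key_id: '{{ env_var(''DBT_ENV_SECRET_PRIVATE_KEY_ID'') }}'
  private_key: '{{ env_var(''DBT_ENV_SECRET_PRIVATE_KEY'') }}'
  auth_uri: https://accounts.google.com/o/oauth2/auth
  token_uri: https://oauth2.googleapis.com/token
```

The secret-bearing fields are wired through `env_var()` per the masking caveat above.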
diff --git a/website/snippets/_config-dbt-version-check.md b/website/snippets/_config-dbt-version-check.md
index 5c21c0fa63f..b5283aae864 100644
--- a/website/snippets/_config-dbt-version-check.md
+++ b/website/snippets/_config-dbt-version-check.md
@@ -1,5 +1,5 @@
-Starting in 2024, when you select **Keep on latest version** in dbt Cloud, dbt will ignore the `require-dbt-version` config. Refer to [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) for more details.
+Starting in 2024, when you select **Keep on latest version** in dbt Cloud, dbt will ignore the `require-dbt-version` config. Refer to [Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version) for more details about going versionless.

dbt Labs is committed to zero breaking changes for code in dbt projects, with ongoing releases to dbt Cloud and new versions of dbt Core. We also recommend these best practices:

diff --git a/website/snippets/tutorial-document-your-models.md b/website/snippets/tutorial-document-your-models.md
index 9913dbcd1d7..736ce567d57 100644
--- a/website/snippets/tutorial-document-your-models.md
+++ b/website/snippets/tutorial-document-your-models.md
@@ -1,4 +1,4 @@
-Adding [documentation](/docs/collaborate/documentation) to your project allows you to describe your models in rich detail, and share that information with your team. Here, we're going to add some basic documentation to our project.
+Adding [documentation](/docs/build/documentation) to your project allows you to describe your models in rich detail, and share that information with your team. Here, we're going to add some basic documentation to our project.

1. Update your `models/schema.yml` file to include some descriptions, such as those below.

diff --git a/website/snippets/tutorial-next-steps-tests.md b/website/snippets/tutorial-next-steps-tests.md
index 39764cede0a..26fd356e87b 100644
--- a/website/snippets/tutorial-next-steps-tests.md
+++ b/website/snippets/tutorial-next-steps-tests.md
@@ -2,4 +2,4 @@ Before moving on from testing, make a change and see how it affects your results

* Write a test that fails, for example, omit one of the order statuses in the `accepted_values` list. What does a failing test look like? Can you debug the failure?
* Run the tests for one model only. If you grouped your `stg_` models into a directory, try running the tests for all the models in that directory.
-* Use a [docs block](/docs/collaborate/documentation#using-docs-blocks) to add a Markdown description to a model.
+* Use a [docs block](/docs/build/documentation#using-docs-blocks) to add a Markdown description to a model.
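As a hedged sketch of that docs-block exercise: assuming you define a hypothetical block named `order_status` in any `.md` file under your models directory (`{% docs order_status %} ... {% enddocs %}`), the `schema.yml` side references it like this:

```yaml
version: 2

models:
  - name: stg_orders
    columns:
      - name: status
        # Renders the Markdown body of the hypothetical 'order_status'
        # docs block as this column's description.
        description: '{{ doc("order_status") }}'
```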
diff --git a/website/static/img/blog/2024-05-07-unit-testing/unit-test-terminal-output.png b/website/static/img/blog/2024-05-07-unit-testing/unit-test-terminal-output.png
deleted file mode 100644
index 9e68587fa61..00000000000
Binary files a/website/static/img/blog/2024-05-07-unit-testing/unit-test-terminal-output.png and /dev/null differ
diff --git a/website/static/img/blog/2024-06-12-putting-your-dag-on-the-internet/.gitkeep b/website/static/img/blog/2024-06-12-putting-your-dag-on-the-internet/.gitkeep
new file mode 100644
index 00000000000..8b137891791
--- /dev/null
+++ b/website/static/img/blog/2024-06-12-putting-your-dag-on-the-internet/.gitkeep
@@ -0,0 +1 @@
+
diff --git a/website/static/img/blog/2024-06-12-putting-your-dag-on-the-internet/image1.png b/website/static/img/blog/2024-06-12-putting-your-dag-on-the-internet/image1.png
new file mode 100644
index 00000000000..4a1142fb497
Binary files /dev/null and b/website/static/img/blog/2024-06-12-putting-your-dag-on-the-internet/image1.png differ
diff --git a/website/static/img/blog/authors/ernesto-ongaro.png b/website/static/img/blog/authors/ernesto-ongaro.png
new file mode 100644
index 00000000000..b7bac595442
Binary files /dev/null and b/website/static/img/blog/authors/ernesto-ongaro.png differ
diff --git a/website/static/img/blog/authors/filip-eqt.png b/website/static/img/blog/authors/filip-eqt.png
new file mode 100644
index 00000000000..172f5454d7b
Binary files /dev/null and b/website/static/img/blog/authors/filip-eqt.png differ
diff --git a/website/static/img/blog/authors/sebastian-eqt.png b/website/static/img/blog/authors/sebastian-eqt.png
new file mode 100644
index 00000000000..eec385e9db3
Binary files /dev/null and b/website/static/img/blog/authors/sebastian-eqt.png differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/explorer-from-ide.jpg b/website/static/img/docs/collaborate/dbt-explorer/explorer-from-ide.jpg
new file mode 100644
index 00000000000..a9176c13675
Binary files /dev/null and b/website/static/img/docs/collaborate/dbt-explorer/explorer-from-ide.jpg differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/explorer-from-lineage.gif b/website/static/img/docs/collaborate/dbt-explorer/explorer-from-lineage.gif
new file mode 100644
index 00000000000..65449141fb3
Binary files /dev/null and b/website/static/img/docs/collaborate/dbt-explorer/explorer-from-lineage.gif differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/explorer-from-model-timing.jpg b/website/static/img/docs/collaborate/dbt-explorer/explorer-from-model-timing.jpg
new file mode 100644
index 00000000000..bfb0a006340
Binary files /dev/null and b/website/static/img/docs/collaborate/dbt-explorer/explorer-from-model-timing.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/access-explorer.gif b/website/static/img/docs/dbt-cloud/access-explorer.gif
new file mode 100644
index 00000000000..5eaf04e4170
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/access-explorer.gif differ
diff --git a/website/static/img/docs/dbt-cloud/add-permissions.png b/website/static/img/docs/dbt-cloud/add-permissions.png
new file mode 100644
index 00000000000..40bf518f005
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/add-permissions.png differ
diff --git a/website/static/img/docs/dbt-cloud/dbt-docs-generate-command.png b/website/static/img/docs/dbt-cloud/dbt-docs-generate-command.png
deleted file mode 100644
index 0ec0cae7d39..00000000000
Binary files a/website/static/img/docs/dbt-cloud/dbt-docs-generate-command.png and /dev/null differ
diff --git a/website/static/img/docs/dbt-cloud/discovery-api/dbt-dag.jpg b/website/static/img/docs/dbt-cloud/discovery-api/dbt-dag.jpg
deleted file mode 100644
index df05049d2a1..00000000000
Binary files a/website/static/img/docs/dbt-cloud/discovery-api/dbt-dag.jpg and /dev/null differ
diff --git a/website/static/img/docs/dbt-cloud/environment-options.png b/website/static/img/docs/dbt-cloud/environment-options.png
new file mode 100644
index 00000000000..0d1a342e60b
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/environment-options.png differ
diff --git a/website/static/img/docs/dbt-cloud/explore-icon.jpg b/website/static/img/docs/dbt-cloud/explore-icon.jpg
new file mode 100644
index 00000000000..d4912ecface
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/explore-icon.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/explore-nav.jpg b/website/static/img/docs/dbt-cloud/explore-nav.jpg
new file mode 100644
index 00000000000..c0d26e58c56
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/explore-nav.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/groups-and-licenses.png b/website/static/img/docs/dbt-cloud/groups-and-licenses.png
new file mode 100644
index 00000000000..93f38d2b165
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/groups-and-licenses.png differ
diff --git a/website/static/img/docs/dbt-cloud/no-option.png b/website/static/img/docs/dbt-cloud/no-option.png
new file mode 100644
index 00000000000..d540ad96e0f
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/no-option.png differ
diff --git a/website/static/img/docs/dbt-cloud/read-only-access.png b/website/static/img/docs/dbt-cloud/read-only-access.png
new file mode 100644
index 00000000000..226adf57c88
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/read-only-access.png differ
diff --git a/website/static/img/docs/dbt-cloud/using-dbt-cloud/98c05c5-Screen_Shot_2019-02-08_at_9.18.22_PM.png b/website/static/img/docs/dbt-cloud/using-dbt-cloud/98c05c5-Screen_Shot_2019-02-08_at_9.18.22_PM.png
deleted file mode 100644
index 25298f29fb3..00000000000
Binary files a/website/static/img/docs/dbt-cloud/using-dbt-cloud/98c05c5-Screen_Shot_2019-02-08_at_9.18.22_PM.png and /dev/null differ
diff --git a/website/static/img/docs/dbt-cloud/using-dbt-cloud/data-sources.png b/website/static/img/docs/dbt-cloud/using-dbt-cloud/data-sources.png
index 2c02b5e7bba..be7a96f7177 100644
Binary files a/website/static/img/docs/dbt-cloud/using-dbt-cloud/data-sources.png and b/website/static/img/docs/dbt-cloud/using-dbt-cloud/data-sources.png differ
diff --git a/website/static/img/docs/dbt-cloud/using-dbt-cloud/edit-job-generate-artifacts.png b/website/static/img/docs/dbt-cloud/using-dbt-cloud/edit-job-generate-artifacts.png
index 8741e3a66a2..8d7ab68646b 100644
Binary files a/website/static/img/docs/dbt-cloud/using-dbt-cloud/edit-job-generate-artifacts.png and b/website/static/img/docs/dbt-cloud/using-dbt-cloud/edit-job-generate-artifacts.png differ
diff --git a/website/static/img/docs/dbt-cloud/write-access.png b/website/static/img/docs/dbt-cloud/write-access.png
new file mode 100644
index 00000000000..a2512219280
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/write-access.png differ
diff --git a/website/static/img/docs/running-a-dbt-project/dbt_cloud_orchestra_trigger.png b/website/static/img/docs/running-a-dbt-project/dbt_cloud_orchestra_trigger.png
new file mode 100644
index 00000000000..032ae0fc28f
Binary files /dev/null and b/website/static/img/docs/running-a-dbt-project/dbt_cloud_orchestra_trigger.png differ
diff --git a/website/static/img/docs/running-a-dbt-project/orchestra_lineage_dbt_cloud.png b/website/static/img/docs/running-a-dbt-project/orchestra_lineage_dbt_cloud.png
new file mode 100644
index 00000000000..615d0d4aa21
Binary files /dev/null and b/website/static/img/docs/running-a-dbt-project/orchestra_lineage_dbt_cloud.png differ
diff --git a/website/static/img/repository-details-faq.jpg b/website/static/img/repository-details-faq.jpg
new file mode 100644
index 00000000000..5e3c7f28f59
Binary files /dev/null and b/website/static/img/repository-details-faq.jpg differ
diff --git a/website/vercel.json b/website/vercel.json
index d0660bb3dad..7935f0dbaca 100644
--- a/website/vercel.json
+++ b/website/vercel.json
@@ -2,6 +2,16 @@
  "cleanUrls": true,
  "trailingSlash": false,
  "redirects": [
+    {
+      "source": "/docs/collaborate/cloud-build-and-view-your-docs",
+      "destination": "/docs/collaborate/build-and-view-your-docs",
+      "permanent": true
+    },
+    {
+      "source": "/docs/collaborate/documentation",
+      "destination": "/docs/build/documentation",
+      "permanent": true
+    },
    {
      "source": "/docs/use-dbt-semantic-layer/tableau",
      "destination": "/docs/cloud-integrations/semantic-layer/tableau",
@@ -1318,7 +1328,7 @@
    },
    {
      "source": "/docs/dbt-cloud/using-dbt-cloud/cloud-generating-documentation",
-      "destination": "/docs/collaborate/build-and-view-your-docs",
+      "destination": "/docs/collaborate/explore-projects",
      "permanent": true
    },
    {