update dbt-databricks docs #3946
Conversation
✅ Deploy Preview for docs-getdbt-com ready!
Force-pushed from 70cf9b7 to 07779b7
@@ -56,9 +54,51 @@ pip is the easiest way to install the adapter:
- Use of Delta Lake for all models out of the box
- SQL macros that are optimized to run with [Photon](https://docs.databricks.com/runtime/photon.html)
### Set up a Databricks Target
### Connecting to Databricks
I would keep this as an h2 for consistency.
Suggested change:
### Connecting to Databricks
## Connecting to Databricks
### Set up a Databricks Target
### Connecting to Databricks

To connect to a data platform with dbt Core, create appropriate _profile_ and _target_ YAML keys/values in the `profiles.yml` configuration file for your Starburst/Trino clusters. This dbt YAML file lives in the `.dbt/` directory of your user/home directory. For more information, refer to [Connection profiles](/docs/core/connect-data-platform/connection-profiles) and [profiles.yml](/docs/core/connect-data-platform/profiles.yml).
@ericbaumann this references Starburst and Trino clusters -- is that correct?
Suggested change:
To connect to a data platform with dbt Core, create appropriate _profile_ and _target_ YAML keys/values in the `profiles.yml` configuration file for your Starburst/Trino clusters. This dbt YAML file lives in the `.dbt/` directory of your user/home directory. For more information, refer to [Connection profiles](/docs/core/connect-data-platform/connection-profiles) and [profiles.yml](/docs/core/connect-data-platform/profiles.yml).
To connect to a data platform with dbt Core, create the appropriate _profile_ and _target_ YAML keys/values in the `profiles.yml` configuration file for your Starburst/Trino clusters. This dbt YAML file lives in the `.dbt/` directory of your user/home directory. For more information, refer to [Connection profiles](/docs/core/connect-data-platform/connection-profiles) and [profiles.yml](/docs/core/connect-data-platform/profiles.yml).
Nope! good catch 👀
website/docs/docs/core/connect-data-platform/databricks-setup.md (outdated; resolved)
The following profile fields are always required.
| Field | Example | Description | |
@ericbaumann not sure if it's just me, but I wonder if it would be easier to have the columns set as 'field', 'description', and 'example'. I find myself scanning to the description first, before the example, so I can understand the 'thing'. Happy to change this up for you if you agree.
That's a great idea! Feels like that will decrease the cognitive load we put on the reader. Thank you @mirnawong1!!
dbt-databricks can connect to Databricks SQL warehouses and all-purpose clusters. SQL warehouses are the recommended way to get started with Databricks.
#### Example profiles.yml for token-based authentication
I'm wondering if this should be an h3 or h2 header? I also like the idea of using tabs so the user can switch between the two without much scrolling. If that makes sense, I'm happy to change this into tabs and share the preview.
Not sure if this is what you're referring to or not, but I was kind of questioning the side nav hierarchy after I saw the deploy preview. I feel like these should maybe be represented on there in some capacity (even if it's one link to the section for example profiles) -- totally defer to your judgment though.
I was also kind of wondering if it'd be appropriate to nest the '[...] Parameters' sections underneath the 'Configuring dbt-databricks' section. What do you think?
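For reference, the tabs idea above would presumably use the docs site's Docusaurus-style `<Tabs>`/`<TabItem>` components. A rough sketch of the shape (the `value` and `label` strings here are illustrative placeholders, not taken from the PR):

```mdx
<Tabs>
  <TabItem value="example-1" label="First example profile">

    {/* first example profiles.yml code block goes here */}

  </TabItem>
  <TabItem value="example-2" label="Second example profile">

    {/* second example profiles.yml code block goes here */}

  </TabItem>
</Tabs>
```

Each tab wraps one complete example, so the reader switches between them in place instead of scrolling past both.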
Yea, great shout! I'm going to make some edits, if you're ok with that, and will share the deploy.
sounds great, thanks mirna!!
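For context, the token-based example under discussion generally takes a shape like the following (a sketch only: the profile name, schema, host, and `http_path` values are placeholders, and the token is read from an environment variable rather than hardcoded):

```yaml
my_databricks_profile:  # hypothetical profile name; must match `profile` in dbt_project.yml
  target: dev
  outputs:
    dev:
      type: databricks
      catalog: main            # optional Unity Catalog name
      schema: analytics        # placeholder schema
      host: dbc-xxxxxxxx-xxxx.cloud.databricks.com    # placeholder workspace host
      http_path: /sql/1.0/warehouses/xxxxxxxxxxxxxxxx # placeholder SQL warehouse path
      token: "{{ env_var('DATABRICKS_TOKEN') }}"      # personal access token via env var
```

The `http_path` here points at a SQL warehouse, matching the recommendation above to start with SQL warehouses; an all-purpose cluster would use a different path.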
…s/docs.getdbt.com into ebaumann/databricks-docs
🚢
What are you changing in this pull request and why?
I am changing the layout of the dbt-databricks documentation to make it clearer what the available connection parameters are! These can be used by folks on both dbt Core and dbt Cloud. For extended attributes, we link to these docs.
Checklist