Add support for viewer-based credentials for Databricks & Cortex #183
base: main
Conversation
This commit brings support for Posit Connect's "viewer-based credentials" feature [0] to the Databricks and Cortex chatbots. Similar to the recent work in `odbc` [1], the way this is exposed to R users is to require them to pass a Shiny session argument to `chat_databricks()` or `chat_cortex()`.

Checks for viewer-based credentials are designed to fall back gracefully to existing authentication methods. This is intended to allow users to, for example, develop and test a Shiny app that uses Databricks or Snowflake credentials in desktop Positron/RStudio or Posit Workbench and deploy it with no code changes to Connect.

Most of the actual work is outsourced to a new shared package, `connectcreds` [2].

Note that this commit also brings the internal auth handling for Snowflake much closer to Databricks, notably by making `cortex_credentials()` internal and analogous to `databricks_token()`. Unit tests are included.

[0]: https://docs.posit.co/connect/user/oauth-integrations/
[1]: r-dbi/odbc#853
[2]: https://github.com/posit-dev/connectcreds/

Signed-off-by: Aaron Jacobs <[email protected]>
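The usage pattern described above can be sketched as follows. This is an illustrative sketch only: the surrounding Shiny app is an assumption, and the exact signature of `chat_databricks()` is the one proposed in this PR, not a released API.

```r
library(shiny)

server <- function(input, output, session) {
  # Passing the Shiny session lets the chatbot check for a Posit Connect
  # viewer token; off Connect, it falls back to ambient credentials
  # (e.g. a local Databricks PAT or CLI profile) with no code changes.
  chat <- chat_databricks(session = session)

  # ... use `chat` to respond to user input ...
}
```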
@@ -20,6 +20,16 @@ NULL
#' previous messages. Nor does it support registering tools, and attempting to
#' do so will result in an error.
#'
#' `chat_cortex()` picks up the following ambient Snowflake credentials:
I wonder if it's worth having a standard `## Auth` header?
Happy to do that.
if (!is.null(session)) {
  check_installed("connectcreds", "for viewer-based authentication")
  if (!connectcreds::has_viewer_token(session, snowflake_url(account))) {
    session <- NULL
Should this just null session out with no warning/error?
`has_viewer_token()` emits a standard warning. I don't love this design, since I think it's non-obvious for the caller, but it does ensure that the warning is consistent across packages and we can limit how often it is shown in one place.
Does that approach make sense to you? Or should I add some sort of `connectcreds::show_ignore_message()` API for this and have `has_viewer_token()` keep quiet?
What about something more like `session <- connectcreds::update_session_token(...)`? (and drop the `if`)
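The alternative floated here could look something like the sketch below. This helper is purely hypothetical (no such function exists in `connectcreds` at the time of this review); it only illustrates how the reviewer's suggestion would fold the token check and the NULL-ing into a single call.

```r
# Hypothetical helper shape, not a real connectcreds API: return the
# session only when a viewer token is available for the resource,
# otherwise NULL, so callers can drop the explicit `if`.
update_session_token <- function(session, resource) {
  if (!is.null(session) && connectcreds::has_viewer_token(session, resource)) {
    session
  } else {
    NULL
  }
}

# The caller side then collapses to a single line:
# session <- connectcreds::update_session_token(session, snowflake_url(account))
```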