Better error message when private link enabled workspaces reject requests #924
Merged
Conversation
hectorcast-db approved these changes on May 17, 2024
Codecov Report

Attention: Patch coverage is

Additional details and impacted files:

```
@@            Coverage Diff            @@
##             main     #924     +/-  ##
========================================
+ Coverage    7.12%    7.14%    +0.02%
========================================
  Files         279      281        +2
  Lines       64446    64475       +29
========================================
+ Hits         4590     4609       +19
- Misses      59546    59558       +12
+ Partials      310      308        -2
```

View full report in Codecov by Sentry.
github-merge-queue bot pushed a commit to databricks/databricks-sdk-py that referenced this pull request on May 17, 2024
…ests (#647)

## Changes

This PR ports databricks/databricks-sdk-go#924 to the Python SDK. When a user tries to access a Private Link-enabled workspace configured with no public internet access from a network other than the one the VPC endpoint belongs to, the Private Link backend redirects the user to the login page rather than outright rejecting the request. The login page, however, is not a JSON document and cannot be parsed by the SDK, resulting in this error message:

```
$ databricks current-user me
Error: unexpected error handling request: invalid character '<' looking for beginning of value. This is likely a bug in the Databricks SDK for Go or the underlying REST API. Please report this issue with the following debugging information to the SDK issue tracker at https://github.com/databricks/databricks-sdk-go/issues. Request log:
GET /login.html?error=private-link-validation-error:<WSID>
> * Host:
> * Accept: application/json
> * Authorization: REDACTED
> * Referer: https://adb-<WSID>.azuredatabricks.net/api/2.0/preview/scim/v2/Me
> * User-Agent: cli/0.0.0-dev+5ed10bb8ccc1 databricks-sdk-go/0.39.0 go/1.22.2 os/darwin cmd/current-user_me auth/pat
< HTTP/2.0 200 OK
< * Cache-Control: no-cache, no-store, must-revalidate
< * Content-Security-Policy: default-src *; font-src * data:; frame-src * blob:; img-src * blob: data:; media-src * data:; object-src 'none'; style-src * 'unsafe-inline'; worker-src * blob:; script-src 'self' 'unsafe-eval' 'unsafe-hashes' 'report-sample' https://*.databricks.com https://databricks.github.io/debug-bookmarklet/ https://widget.intercom.io https://js.intercomcdn.com https://ajax.googleapis.com/ajax/libs/jquery/2.2.0/jquery.min.js https://databricks-ui-assets.azureedge.net https://ui-serving-cdn-testing.azureedge.net https://uiserviceprodwestus-cdn-endpoint.azureedge.net https://databricks-ui-infra.s3.us-west-2.amazonaws.com 'sha256-47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU=' 'sha256-YOlue469P2BtTMZYUFLupA2aOUsgc6B/TDewH7/Qz7s=' 'sha256-Lh4yp7cr3YOJ3MOn6erNz3E3WI0JA20mWV+0RuuviFM=' 'sha256-0jMhpY6PB/BTRDLWtfcjdwiHOpE+6rFk3ohvY6fbuHU='; report-uri /ui-csp-reports; frame-ancestors *.vocareum.com *.docebosaas.com *.edx.org *.deloitte.com *.cloudlabs.ai *.databricks.com *.myteksi.net
< * Content-Type: text/html; charset=utf-8
< * Date: Fri, 17 May 2024 07:47:38 GMT
< * Server: databricks
< * Set-Cookie: enable-armeria-workspace-server-for-ui-flags=false; Max-Age=1800; Expires=Fri, 17 May 2024 08:17:38 GMT; Secure; HTTPOnly; SameSite=Strict
< * Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
< * X-Content-Type-Options: nosniff
< * X-Ui-Svc: true
< * X-Xss-Protection: 1; mode=block
< <!doctype html>
< <html>
< <head>
< <meta charset="utf-8">
< <meta http-equiv="Content-Language" content="en">
< <title>Databricks - Sign In</title>
< <meta name="viewport" content="width=960">
< <link rel="icon" type="image/png" href="https://databricks-ui-assets.azureedge.net/favicon.ico">
< <meta http-equiv="content-type" content="text/html; charset=UTF8">
< <script id="__databricks_react_script"></script>
< <script>window.__DATABRICKS_SAFE_FLAGS__={"databricks.infra.showErrorModalOnFetchError":true,"databricks.fe.infra.useReact18":true,"databricks.fe.infra.useReact18NewAPI":false,"databricks.fe.infra.fixConfigPrefetch":true},window.__DATABRICKS_CONFIG__={"publicPath":{"mlflow":"https://databricks-ui-assets.azureedge.net/","dbsql":"https://databricks-ui-assets.azureedge.net/","feature-store":"https://databricks-ui-assets.azureedge.net/","monolith":"https://databricks-ui-assets.azureedge.net/","jaws":"https://databricks-ui-assets.azureedge.net/"}}</script>
< <link rel="icon" href="https://databricks-ui-assets.azureedge.net/favicon.ico">
< <script>
<   function setNoCdnAndReload() {
<     document.cookie = `x-databricks-cdn-inaccessible=true; path=/; max-age=86400`;
<     const metric = 'cdnFallbackOccurred';
<     const browserUserAgent = navigator.userAgent;
<     const browserTabId = window.browserTabId;
<     const performanceEntry = performance.getEntriesByType('resource').filter(e => e.initiatorType === 'script').slice(-1)[0]
<     sessionStorage.setItem('databricks-cdn-fallback-telemetry-key', JSON.stringify({ tags: { browserUserAgent, browserTabId }, performanceEntry}));
<     window.location.reload();
<   }
< </script>
< <script>
<   // Set a manual timeout for dropped packets to CDN
<   function loadScriptWithTimeout(src, timeout) {
<     return new Promise((resolve, reject) => {
<       const script = document.createElement('script');
<       script.defer = true;
<       script.src = src;
<       script.onload = resolve;
<       script.onerror = reject;
<       document.head.appendChild(script);
<       setTimeout(() => {
<         reject(new Error('Script load timeout'));
<       }, timeout);
<     });
<   }
<   loadScriptWithTimeout('https://databricks-ui-assets.azureedge.net/static/js/login/login.8a983ca2.js', 10000).catch(setNoCdnAndReload);
< </script>
< </head>
< <body class="light-mode">
< <uses-legacy-bootstrap>
< <div id="login-page"></div>
< </uses-legacy-bootstrap>
< </body>
< </html>
```

To address this, I add one additional check in the error mapper logic to inspect whether the user was redirected to the login page with the private link validation error response code. If so, we return a custom error, `PrivateLinkValidationError`, with error code `PRIVATE_LINK_VALIDATION_ERROR`, that inherits from `PermissionDenied` and has a mock 403 status code. After this change, users will see an error message like this:

```
databricks.sdk.errors.private_link.PrivateLinkValidationError: The requested workspace has Azure Private Link enabled and is not accessible from the current network. Ensure that Azure Private Link is properly configured and that your device has access to the Azure Private Link endpoint. For more information, see https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link-standard#authentication-troubleshooting.
```

The error message is tuned to the specific cloud so that we can redirect users to the appropriate documentation; the cloud is inferred from the request URI.

## Tests

Unit tests cover the private link error message mapping. To manually test this, I created a private link workspace in Azure, created an access token, restricted access to the workspace, then ran the `last_job_runs.py` example using the host & token:

```
/Users/miles/databricks-cli/.venv/bin/python /Users/miles/databricks-sdk-py/examples/last_job_runs.py
2024-05-17 11:43:32,529 [databricks.sdk][INFO] loading DEFAULT profile from ~/.databrickscfg: host, token
Traceback (most recent call last):
  File "/Users/miles/databricks-sdk-py/examples/last_job_runs.py", line 20, in <module>
    for job in w.jobs.list():
  File "/Users/miles/databricks-sdk-py/databricks/sdk/service/jobs.py", line 5453, in list
    json = self._api.do('GET', '/api/2.1/jobs/list', query=query, headers=headers)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/miles/databricks-sdk-py/databricks/sdk/core.py", line 131, in do
    response = retryable(self._perform)(method,
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/miles/databricks-sdk-py/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/Users/miles/databricks-sdk-py/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/miles/databricks-sdk-py/databricks/sdk/core.py", line 245, in _perform
    raise self._make_nicer_error(response=response) from None
databricks.sdk.errors.private_link.PrivateLinkValidationError: The requested workspace has Azure Private Link enabled and is not accessible from the current network. Ensure that Azure Private Link is properly configured and that your device has access to the Azure Private Link endpoint. For more information, see https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link-standard#authentication-troubleshooting.
```

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
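The redirect check described in the commit message above can be sketched roughly as follows. This is a minimal illustration, not the SDK's actual internals: `is_private_link_redirect` and `map_error` are hypothetical names, and the simplified `PermissionDenied` base class stands in for the real one in `databricks.sdk.errors`.

```python
from urllib.parse import urlparse, parse_qs


class PermissionDenied(Exception):
    """Simplified stand-in for the SDK's PermissionDenied error."""

    def __init__(self, message, error_code=None, status_code=403):
        super().__init__(message)
        self.error_code = error_code
        self.status_code = status_code


class PrivateLinkValidationError(PermissionDenied):
    """Raised when a Private Link-enabled workspace redirects to the login page."""


def is_private_link_redirect(url: str) -> bool:
    # The backend redirects rejected requests to
    # /login.html?error=private-link-validation-error:<workspace id>
    parsed = urlparse(url)
    if parsed.path != "/login.html":
        return False
    error = parse_qs(parsed.query).get("error", [""])[0]
    return error.startswith("private-link-validation-error")


def map_error(response_url: str):
    """Return a synthetic 403 error for private link redirects, else None."""
    if is_private_link_redirect(response_url):
        # The server actually returned 200 with an HTML body; the 403 status
        # is mocked so the error behaves like a permission failure.
        return PrivateLinkValidationError(
            "The requested workspace has Private Link enabled and is not "
            "accessible from the current network.",
            error_code="PRIVATE_LINK_VALIDATION_ERROR",
            status_code=403)
    return None
```

The key design point is that the check runs on the final response URL, not the body: the login page's HTML never needs to be parsed.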
github-merge-queue bot pushed a commit to databricks/databricks-sdk-java that referenced this pull request on May 17, 2024
…ests (#290)

## Changes

Port of databricks/databricks-sdk-go#924 to the Java SDK. When a user tries to access a Private Link-enabled workspace configured with no public internet access from a network other than the one the VPC endpoint belongs to, the Private Link backend redirects the user to the login page rather than outright rejecting the request. The login page, however, is not a JSON document and cannot be parsed by the SDK, resulting in this error message:

```
$ databricks current-user me
Error: unexpected error handling request: invalid character '<' looking for beginning of value. This is likely a bug in the Databricks SDK for Go or the underlying REST API. Please report this issue with the following debugging information to the SDK issue tracker at https://github.com/databricks/databricks-sdk-go/issues. Request log:
GET /login.html?error=private-link-validation-error:<WSID>
> * Host:
> * Accept: application/json
> * Authorization: REDACTED
> * Referer: https://adb-<WSID>.azuredatabricks.net/api/2.0/preview/scim/v2/Me
> * User-Agent: cli/0.0.0-dev+5ed10bb8ccc1 databricks-sdk-go/0.39.0 go/1.22.2 os/darwin cmd/current-user_me auth/pat
< HTTP/2.0 200 OK
< * Cache-Control: no-cache, no-store, must-revalidate
< * Content-Security-Policy: default-src *; font-src * data:; frame-src * blob:; img-src * blob: data:; media-src * data:; object-src 'none'; style-src * 'unsafe-inline'; worker-src * blob:; script-src 'self' 'unsafe-eval' 'unsafe-hashes' 'report-sample' https://*.databricks.com https://databricks.github.io/debug-bookmarklet/ https://widget.intercom.io https://js.intercomcdn.com https://ajax.googleapis.com/ajax/libs/jquery/2.2.0/jquery.min.js https://databricks-ui-assets.azureedge.net https://ui-serving-cdn-testing.azureedge.net https://uiserviceprodwestus-cdn-endpoint.azureedge.net https://databricks-ui-infra.s3.us-west-2.amazonaws.com 'sha256-47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU=' 'sha256-YOlue469P2BtTMZYUFLupA2aOUsgc6B/TDewH7/Qz7s=' 'sha256-Lh4yp7cr3YOJ3MOn6erNz3E3WI0JA20mWV+0RuuviFM=' 'sha256-0jMhpY6PB/BTRDLWtfcjdwiHOpE+6rFk3ohvY6fbuHU='; report-uri /ui-csp-reports; frame-ancestors *.vocareum.com *.docebosaas.com *.edx.org *.deloitte.com *.cloudlabs.ai *.databricks.com *.myteksi.net
< * Content-Type: text/html; charset=utf-8
< * Date: Fri, 17 May 2024 07:47:38 GMT
< * Server: databricks
< * Set-Cookie: enable-armeria-workspace-server-for-ui-flags=false; Max-Age=1800; Expires=Fri, 17 May 2024 08:17:38 GMT; Secure; HTTPOnly; SameSite=Strict
< * Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
< * X-Content-Type-Options: nosniff
< * X-Ui-Svc: true
< * X-Xss-Protection: 1; mode=block
< <!doctype html>
< <html>
< <head>
< <meta charset="utf-8">
< <meta http-equiv="Content-Language" content="en">
< <title>Databricks - Sign In</title>
< <meta name="viewport" content="width=960">
< <link rel="icon" type="image/png" href="https://databricks-ui-assets.azureedge.net/favicon.ico">
< <meta http-equiv="content-type" content="text/html; charset=UTF8">
< <script id="__databricks_react_script"></script>
< <script>window.__DATABRICKS_SAFE_FLAGS__={"databricks.infra.showErrorModalOnFetchError":true,"databricks.fe.infra.useReact18":true,"databricks.fe.infra.useReact18NewAPI":false,"databricks.fe.infra.fixConfigPrefetch":true},window.__DATABRICKS_CONFIG__={"publicPath":{"mlflow":"https://databricks-ui-assets.azureedge.net/","dbsql":"https://databricks-ui-assets.azureedge.net/","feature-store":"https://databricks-ui-assets.azureedge.net/","monolith":"https://databricks-ui-assets.azureedge.net/","jaws":"https://databricks-ui-assets.azureedge.net/"}}</script>
< <link rel="icon" href="https://databricks-ui-assets.azureedge.net/favicon.ico">
< <script>
<   function setNoCdnAndReload() {
<     document.cookie = `x-databricks-cdn-inaccessible=true; path=/; max-age=86400`;
<     const metric = 'cdnFallbackOccurred';
<     const browserUserAgent = navigator.userAgent;
<     const browserTabId = window.browserTabId;
<     const performanceEntry = performance.getEntriesByType('resource').filter(e => e.initiatorType === 'script').slice(-1)[0]
<     sessionStorage.setItem('databricks-cdn-fallback-telemetry-key', JSON.stringify({ tags: { browserUserAgent, browserTabId }, performanceEntry}));
<     window.location.reload();
<   }
< </script>
< <script>
<   // Set a manual timeout for dropped packets to CDN
<   function loadScriptWithTimeout(src, timeout) {
<     return new Promise((resolve, reject) => {
<       const script = document.createElement('script');
<       script.defer = true;
<       script.src = src;
<       script.onload = resolve;
<       script.onerror = reject;
<       document.head.appendChild(script);
<       setTimeout(() => {
<         reject(new Error('Script load timeout'));
<       }, timeout);
<     });
<   }
<   loadScriptWithTimeout('https://databricks-ui-assets.azureedge.net/static/js/login/login.8a983ca2.js', 10000).catch(setNoCdnAndReload);
< </script>
< </head>
< <body class="light-mode">
< <uses-legacy-bootstrap>
< <div id="login-page"></div>
< </uses-legacy-bootstrap>
< </body>
< </html>
```

To address this, I add one additional check in the error mapper logic to inspect whether the user was redirected to the login page with the private link validation error response code. If so, we return a synthetic error with error code `PRIVATE_LINK_VALIDATION_ERROR` that inherits from `PermissionDenied` and has a mock 403 status code. After this change, users will see an error message like this:

```
Exception in thread "main" com.databricks.sdk.core.error.PrivateLinkValidationError: The requested workspace has Azure Private Link enabled and is not accessible from the current network. Ensure that Azure Private Link is properly configured and that your device has access to the Azure Private Link endpoint. For more information, see https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link-standard#authentication-troubleshooting.
	at com.databricks.sdk.core.error.PrivateLinkInfo.createPrivateLinkValidationError(PrivateLinkInfo.java:63)
	at com.databricks.sdk.core.error.AbstractErrorMapper.apply(AbstractErrorMapper.java:46)
	at com.databricks.sdk.core.error.ApiErrors.getDatabricksError(ApiErrors.java:29)
	at com.databricks.sdk.core.ApiClient.executeInner(ApiClient.java:276)
	at com.databricks.sdk.core.ApiClient.getResponse(ApiClient.java:235)
	at com.databricks.sdk.core.ApiClient.execute(ApiClient.java:227)
	at com.databricks.sdk.core.ApiClient.GET(ApiClient.java:148)
	at com.databricks.sdk.service.compute.ClustersImpl.list(ClustersImpl.java:94)
	at com.databricks.sdk.support.Paginator.flipNextPage(Paginator.java:58)
	at com.databricks.sdk.support.Paginator.<init>(Paginator.java:51)
	at com.databricks.sdk.service.compute.ClustersAPI.list(ClustersAPI.java:295)
	at com.databricks.sdk.examples.ListClustersExample.main(ListClustersExample.java:11)
```

The error message is tuned to the specific cloud so that we can redirect users to the appropriate documentation; the cloud is inferred from the request URI.

## Tests

Unit tests cover the private link error message mapping. To manually test this, I created a private link workspace in Azure, created an access token, restricted access to the workspace, then ran the `ListClustersExample` example using the host & token:

```
<SNIP>
14:41 [DEBUG] > GET /api/2.0/clusters/list
< 200 OK
< <!doctype html>
< <html>
< <head>
< <meta charset="utf-8">
< <meta http-equiv="Content-Language" content="en">
< <title>Databricks - Sign In</title>
< <meta name="viewport" content="width=960">
< <link rel="icon" type="image/png" href="https://databricks-ui-assets.azureedge.net/favicon.ico">
< <meta http-equiv="content-type" content="text/html; charset=UTF8">
< <script id="__databricks_react_script"></script>
< <script>window.__DATABRICKS_SAFE_FLAGS__={"databricks.infra.showErrorModalOnFetchError":true,"databricks.fe.infra.useReact18":true,"databricks.fe.infra.useReact18NewAPI":false,"databricks.fe.infra.fixConfigPrefetch":true},window.__DATABRICKS_CONFIG__={"publicPath":{"mlflow":"https://databricks-ui-assets.azureedge.net/","dbsql":"https://databricks-ui-assets.azureedge.net/","feature-store":"https://databricks-ui-assets.azureedge.net/","monolith":"https://databricks-ui-assets.azureedge.net/","jaws":"https://databricks-ui-assets.azureedge.net/"}}</script>
< <link rel="icon" href="https://databricks-ui-assets.... (1420 more bytes)
Exception in thread "main" com.databricks.sdk.core.error.PrivateLinkValidationError: The requested workspace has Azure Private Link enabled and is not accessible from the current network. Ensure that Azure Private Link is properly configured and that your device has access to the Azure Private Link endpoint. For more information, see https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link-standard#authentication-troubleshooting.
	at com.databricks.sdk.core.error.PrivateLinkInfo.createPrivateLinkValidationError(PrivateLinkInfo.java:63)
	at com.databricks.sdk.core.error.AbstractErrorMapper.apply(AbstractErrorMapper.java:46)
	at com.databricks.sdk.core.error.ApiErrors.getDatabricksError(ApiErrors.java:29)
	at com.databricks.sdk.core.ApiClient.executeInner(ApiClient.java:276)
	at com.databricks.sdk.core.ApiClient.getResponse(ApiClient.java:235)
	at com.databricks.sdk.core.ApiClient.execute(ApiClient.java:227)
	at com.databricks.sdk.core.ApiClient.GET(ApiClient.java:148)
	at com.databricks.sdk.service.compute.ClustersImpl.list(ClustersImpl.java:94)
	at com.databricks.sdk.support.Paginator.flipNextPage(Paginator.java:58)
	at com.databricks.sdk.support.Paginator.<init>(Paginator.java:51)
	at com.databricks.sdk.service.compute.ClustersAPI.list(ClustersAPI.java:295)
	at com.databricks.sdk.examples.ListClustersExample.main(ListClustersExample.java:11)
Disconnected from the target VM, address: '127.0.0.1:55559', transport: 'socket'

Process finished with exit code 1
exit status 2
```
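The cloud-specific tuning mentioned in the commit messages above can be illustrated with a small sketch: infer the cloud from the request URI's host, then pick the matching product name for the message. The host suffixes are the well-known per-cloud Databricks workspace domains; `infer_cloud` and `private_link_message` are hypothetical helpers written for illustration, not the SDKs' actual code, and falling back to AWS for unknown hosts is an assumption.

```python
from urllib.parse import urlparse

# Well-known per-cloud Databricks workspace host suffixes.
_CLOUD_SUFFIXES = {
    ".azuredatabricks.net": "Azure",
    ".gcp.databricks.com": "GCP",
    ".cloud.databricks.com": "AWS",
}


def infer_cloud(request_uri: str) -> str:
    """Guess the cloud from the request URI's host; assume AWS as fallback."""
    host = urlparse(request_uri).hostname or ""
    for suffix, cloud in _CLOUD_SUFFIXES.items():
        if host.endswith(suffix):
            return cloud
    return "AWS"


def private_link_message(request_uri: str) -> str:
    """Build a cloud-specific error message for a rejected request."""
    # Product names per cloud; only the Azure wording appears verbatim in the
    # logs above, the other two are illustrative.
    product = {
        "Azure": "Azure Private Link",
        "GCP": "Private Service Connect",
        "AWS": "AWS PrivateLink",
    }[infer_cloud(request_uri)]
    return (f"The requested workspace has {product} enabled and is not "
            f"accessible from the current network.")
```

Keying the message off the host rather than a config flag means the mapping works for any credentials type, since the workspace URL is always available on the failed request.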
hectorcast-db added a commit that referenced this pull request on May 21, 2024
## Backward incompatible changes

* `CredentialsProvider` has been renamed to `CredentialsStrategy`. Code that performs type checks against these types must be updated.

## Improvements and new features

* Create a method to generate OAuth tokens ([#886](#886)).
* Better error message when private link enabled workspaces reject requests ([#924](#924)).
* Update OpenAPI spec ([#926](#926)).

API Changes:

* Changed `List` method for [w.Connections](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ConnectionsAPI) workspace-level service to require request of [catalog.ListConnectionsRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListConnectionsRequest).
* Renamed [w.LakehouseMonitors](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#LakehouseMonitorsAPI) workspace-level service to [w.QualityMonitors](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#QualityMonitorsAPI).
* Renamed [catalog.DeleteLakehouseMonitorRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#DeleteLakehouseMonitorRequest) to [catalog.DeleteQualityMonitorRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#DeleteQualityMonitorRequest).
* Changed `SchemaName` field for [catalog.DisableRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#DisableRequest) to `string`.
* Removed [catalog.DisableSchemaName](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#DisableSchemaName).
* Changed `SchemaName` field for [catalog.EnableRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#EnableRequest) to `string`.
* Removed [catalog.EnableSchemaName](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#EnableSchemaName).
* Renamed [catalog.GetLakehouseMonitorRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#GetLakehouseMonitorRequest) to [catalog.GetQualityMonitorRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#GetQualityMonitorRequest).
* Added `NextPageToken` field for [catalog.ListConnectionsResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListConnectionsResponse).
* Added `DashboardId` field for [catalog.UpdateMonitor](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateMonitor).
* Added [catalog.ListConnectionsRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListConnectionsRequest).
* Added [catalog.MonitorRefreshListResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#MonitorRefreshListResponse).
* Changed `ClusterStatus` method for [w.Libraries](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibrariesAPI) workspace-level service to return [compute.ClusterLibraryStatuses](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterLibraryStatuses).
* Removed `ClusterSource` field for [compute.ClusterAttributes](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterAttributes).
* Changed `Spec` field for [compute.ClusterDetails](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterDetails) to [compute.ClusterSpec](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterSpec).
* Removed `CloneFrom` and `ClusterSource` fields for [compute.ClusterSpec](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterSpec).
* Removed [compute.ClusterStatusResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatusResponse).
* Removed `ClusterSource` field for [compute.CreateCluster](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#CreateCluster).
* Removed `CloneFrom` and `ClusterSource` fields for [compute.EditCluster](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#EditCluster).
* Renamed `SortBySpec` field to `SortBy` for [marketplace.ListListingsRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/marketplace#ListListingsRequest).
* Added `IsAscending` field for [marketplace.ListListingsRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/marketplace#ListListingsRequest).
* Added `IsAscending` field for [marketplace.SearchListingsRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/marketplace#SearchListingsRequest).
* Removed [marketplace.SortBySpec](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/marketplace#SortBySpec).
* Removed [marketplace.SortOrder](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/marketplace#SortOrder).
* Added `GatewayDefinition` field for [pipelines.CreatePipeline](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline).
* Added `GatewayDefinition` field for [pipelines.EditPipeline](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline).
* Added `TableConfiguration` field for [pipelines.ManagedIngestionPipelineDefinition](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#ManagedIngestionPipelineDefinition).
* Added `GatewayDefinition` field for [pipelines.PipelineSpec](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineSpec).
* Added `TableConfiguration` field for [pipelines.SchemaSpec](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#SchemaSpec).
* Added `TableConfiguration` field for [pipelines.TableSpec](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#TableSpec).
* Added [pipelines.IngestionGatewayPipelineDefinition](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#IngestionGatewayPipelineDefinition).
* Added [pipelines.TableSpecificConfig](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#TableSpecificConfig).
* Added [pipelines.TableSpecificConfigScdType](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#TableSpecificConfigScdType).
* Added `DeploymentArtifacts` field for [serving.AppDeployment](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppDeployment).
* Added `RouteOptimized` field for [serving.CreateServingEndpoint](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#CreateServingEndpoint).
* Added `Contents` field for [serving.ExportMetricsResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ExportMetricsResponse).
* Changed `OpenaiApiKey` field for [serving.OpenAiConfig](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#OpenAiConfig) to no longer be required.
* Added `MicrosoftEntraClientId`, `MicrosoftEntraClientSecret` and `MicrosoftEntraTenantId` fields for [serving.OpenAiConfig](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#OpenAiConfig).
* Added `EndpointUrl` and `RouteOptimized` fields for [serving.ServingEndpointDetailed](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointDetailed).
* Added [serving.AppDeploymentArtifacts](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppDeploymentArtifacts).
* Added `StorageRoot` field for [sharing.CreateShare](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#CreateShare).
* Added `StorageLocation` and `StorageRoot` fields for [sharing.ShareInfo](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#ShareInfo).
* Added `StorageRoot` field for [sharing.UpdateShare](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#UpdateShare).
* Added `ScanIndex` method for [w.VectorSearchIndexes](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#VectorSearchIndexesAPI) workspace-level service.
* Added `EmbeddingWritebackTable` field for [vectorsearch.DeltaSyncVectorIndexSpecRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#DeltaSyncVectorIndexSpecRequest).
* Added `EmbeddingWritebackTable` field for [vectorsearch.DeltaSyncVectorIndexSpecResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#DeltaSyncVectorIndexSpecResponse).
* Added [vectorsearch.ListValue](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#ListValue).
* Added [vectorsearch.MapStringValueEntry](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#MapStringValueEntry).
* Added [vectorsearch.ScanVectorIndexRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#ScanVectorIndexRequest).
* Added [vectorsearch.ScanVectorIndexResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#ScanVectorIndexResponse).
* Added [vectorsearch.Struct](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#Struct).
* Added [vectorsearch.Value](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#Value).

OpenAPI SHA: 7eb5ad9a2ed3e3f1055968a2d1014ac92c06fe92, Date: 2024-05-21
github-merge-queue bot pushed a commit that referenced this pull request on May 21, 2024
* Added `StorageLocation` and `StorageRoot` field for [sharing.ShareInfo](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#ShareInfo). * Added `StorageRoot` field for [sharing.UpdateShare](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#UpdateShare). * Added `ScanIndex` method for [w.VectorSearchIndexes](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#VectorSearchIndexesAPI) workspace-level service. * Added `EmbeddingWritebackTable` field for [vectorsearch.DeltaSyncVectorIndexSpecRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#DeltaSyncVectorIndexSpecRequest). * Added `EmbeddingWritebackTable` field for [vectorsearch.DeltaSyncVectorIndexSpecResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#DeltaSyncVectorIndexSpecResponse). * Added [vectorsearch.ListValue](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#ListValue). * Added [vectorsearch.MapStringValueEntry](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#MapStringValueEntry). * Added [vectorsearch.ScanVectorIndexRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#ScanVectorIndexRequest). * Added [vectorsearch.ScanVectorIndexResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#ScanVectorIndexResponse). * Added [vectorsearch.Struct](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#Struct). * Added [vectorsearch.Value](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#Value). OpenAPI SHA: 7eb5ad9a2ed3e3f1055968a2d1014ac92c06fe92, Date: 2024-05-21
## Changes
When a user tries to access a Private Link-enabled workspace configured with no public internet access from a different network than the VPC endpoint belongs to, the Private Link backend redirects the user to the login page, rather than outright rejecting the request. The login page, however, is not a JSON document and cannot be parsed by the SDK, resulting in this error message:
To address this, I added one additional check in the error mapper logic to inspect whether the user was redirected to the login page with the private link validation error response code. If so, we return a synthetic error with error code `PRIVATE_LINK_VALIDATION_ERROR` that inherits from `ErrPermissionsDenied` and has a mock 403 status code. After this change, users will see a descriptive error message instead of the JSON parsing failure.
The error message is tuned to the specific cloud so that we can redirect users to the appropriate documentation, the cloud being inferred from the request URI.
As part of this, I made a small refactor of environments code into a separate package so it can be used by both the config and apierr packages.
## Tests
Unit tests cover the private link error message mapping.
To manually test this, I created a private link workspace in Azure, created an access token, restricted access to the workspace, then ran the `default-auth` example using the host & token.

- `make test` passing
- `make fmt` applied