From 2aa6521231cc3007109a3902f04989a74834bda5 Mon Sep 17 00:00:00 2001
From: Renaud Hartert
Date: Mon, 20 Jan 2025 13:39:01 +0100
Subject: [PATCH] [Release] Release v0.41.0

### New Features and Improvements

 * Add `serving.http_request` to call external functions. ([#857](https://github.com/databricks/databricks-sdk-py/pull/857)).
 * Files API client: recover on download failures ([#844](https://github.com/databricks/databricks-sdk-py/pull/844)) ([#845](https://github.com/databricks/databricks-sdk-py/pull/845)).

### Bug Fixes

 * Properly pass query parameters in apps and oauth2 ([#862](https://github.com/databricks/databricks-sdk-py/pull/862)).

### Internal Changes

 * Add unit tests for external-browser authentication ([#863](https://github.com/databricks/databricks-sdk-py/pull/863)).
 * Decouple oauth2 and serving ([#855](https://github.com/databricks/databricks-sdk-py/pull/855)).
 * Migrate workflows that need write access to use hosted runners ([#850](https://github.com/databricks/databricks-sdk-py/pull/850)).
 * Stop testing Python 3.7 on Ubuntu ([#858](https://github.com/databricks/databricks-sdk-py/pull/858)).

### API Changes:

 * Added [w.access_control](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/access_control.html) workspace-level service.
 * Added `http_request()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service.
 * Added `no_compute` field for `databricks.sdk.service.apps.CreateAppRequest`.
 * Added `has_more` field for `databricks.sdk.service.jobs.BaseJob`.
 * Added `has_more` field for `databricks.sdk.service.jobs.BaseRun`.
 * Added `page_token` field for `databricks.sdk.service.jobs.GetJobRequest`.
 * Added `has_more` and `next_page_token` fields for `databricks.sdk.service.jobs.Job`.
 * Added `has_more` field for `databricks.sdk.service.jobs.Run`.
 * Added `clean_rooms_notebook_output` field for `databricks.sdk.service.jobs.RunOutput`.
 * Added `scopes` field for `databricks.sdk.service.oauth2.UpdateCustomAppIntegration`.
 * Added `run_as` field for `databricks.sdk.service.pipelines.CreatePipeline`.
 * Added `run_as` field for `databricks.sdk.service.pipelines.EditPipeline`.
 * Added `authorization_details` and `endpoint_url` fields for `databricks.sdk.service.serving.DataPlaneInfo`.
 * Added `contents` field for `databricks.sdk.service.serving.GetOpenApiResponse`.
 * Added `activated`, `activation_url`, `authentication_type`, `cloud`, `comment`, `created_at`, `created_by`, `data_recipient_global_metastore_id`, `ip_access_list`, `metastore_id`, `name`, `owner`, `properties_kvpairs`, `region`, `sharing_code`, `tokens`, `updated_at` and `updated_by` fields for `databricks.sdk.service.sharing.RecipientInfo`.
 * Added `expiration_time` field for `databricks.sdk.service.sharing.RecipientInfo`.
 * Added .
 * Added .
 * Added , , and .
 * Added .
 * Added , , , and .
 * Changed `update()` method for [a.account_federation_policy](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_federation_policy.html) account-level service with new required argument order.
 * Changed `update()` method for [a.service_principal_federation_policy](https://databricks-sdk-py.readthedocs.io/en/latest/account/service_principal_federation_policy.html) account-level service with new required argument order.
 * Changed `update()` method for [w.recipients](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/recipients.html) workspace-level service to return `databricks.sdk.service.sharing.RecipientInfo` dataclass.
 * Changed `update()` method for [w.recipients](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/recipients.html) workspace-level service return type to become non-empty.
 * Changed `update()` method for [w.recipients](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/recipients.html) workspace-level service to type `update()` method for [w.recipients](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/recipients.html) workspace-level service.
 * Changed `get_open_api()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service return type to become non-empty.
 * Changed `patch()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service to type `patch()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service.
 * Changed `patch()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service to return `databricks.sdk.service.serving.EndpointTags` dataclass.
 * Changed `databricks.sdk.service.serving.EndpointTagList` dataclass to.
 * Changed `collaborator_alias` field for `databricks.sdk.service.cleanrooms.CleanRoomCollaborator` to be required.
 * Changed `update_mask` field for `databricks.sdk.service.oauth2.UpdateAccountFederationPolicyRequest` to no longer be required.
 * Changed `update_mask` field for `databricks.sdk.service.oauth2.UpdateServicePrincipalFederationPolicyRequest` to no longer be required.
 * Changed `days_of_week` field for `databricks.sdk.service.pipelines.RestartWindow` to type `databricks.sdk.service.pipelines.DayOfWeekList` dataclass.
 * Changed `behavior` field for `databricks.sdk.service.serving.AiGatewayGuardrailPiiBehavior` to no longer be required.
 * Changed `project_id` and `region` fields for `databricks.sdk.service.serving.GoogleCloudVertexAiConfig` to be required.
 * Changed `workload_type` field for `databricks.sdk.service.serving.ServedEntityInput` to type `databricks.sdk.service.serving.ServingModelWorkloadType` dataclass.
 * Changed `workload_type` field for `databricks.sdk.service.serving.ServedEntityOutput` to type `databricks.sdk.service.serving.ServingModelWorkloadType` dataclass.
 * Changed `workload_type` field for `databricks.sdk.service.serving.ServedModelOutput` to type `databricks.sdk.service.serving.ServingModelWorkloadType` dataclass.
 * Changed .
 * Changed .
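
For the headline feature, a minimal usage sketch of the new `serving.http_request()` call. This is illustrative only: the connection name, path, headers, and query parameters below are placeholders, and it assumes a Unity Catalog connection already exists and that `ExternalFunctionRequestHttpMethod` exposes a `GET` member.

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import ExternalFunctionRequestHttpMethod

w = WorkspaceClient()

# Call an external service through a Unity Catalog connection.
# "my_connection" and "/api/v1/status" are placeholders for illustration.
response = w.serving_endpoints.http_request(
    conn="my_connection",
    method=ExternalFunctionRequestHttpMethod.GET,
    path="/api/v1/status",
    headers={"Accept": "application/json"},
    params={"limit": "10"},
)
print(response)
```

The call returns an `ExternalFunctionResponse`; if no extra headers are supplied, only the authentication headers from the connection are sent.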
OpenAPI SHA: 58905570a9928fc9ed31fba14a2edaf9a7c55b08, Date: 2025-01-20 --- .codegen/_openapi_sha | 2 +- CHANGELOG.md | 70 ++++++++ databricks/sdk/__init__.py | 13 +- databricks/sdk/service/compute.py | 4 + databricks/sdk/service/dashboards.py | 11 +- databricks/sdk/service/iam.py | 158 +++++++++++++++++++ databricks/sdk/service/sharing.py | 105 ++++++------ databricks/sdk/version.py | 2 +- docs/dbdataclasses/compute.rst | 12 ++ docs/dbdataclasses/dashboards.rst | 5 +- docs/dbdataclasses/iam.rst | 26 +++ docs/dbdataclasses/sharing.rst | 4 - docs/workspace/iam/access_control.rst | 23 +++ docs/workspace/iam/index.rst | 1 + docs/workspace/serving/serving_endpoints.rst | 17 +- docs/workspace/sharing/providers.rst | 6 +- docs/workspace/sharing/recipients.rst | 20 +-- 17 files changed, 393 insertions(+), 86 deletions(-) create mode 100644 docs/workspace/iam/access_control.rst diff --git a/.codegen/_openapi_sha b/.codegen/_openapi_sha index 431b7678a..cabc6cf48 100644 --- a/.codegen/_openapi_sha +++ b/.codegen/_openapi_sha @@ -1 +1 @@ -05a10af4ed43566968119b43605f0a7fecbe780f \ No newline at end of file +58905570a9928fc9ed31fba14a2edaf9a7c55b08 \ No newline at end of file diff --git a/CHANGELOG.md b/CHANGELOG.md index 4f7aa3cc2..d91d8de43 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,5 +1,75 @@ # Version changelog +## [Release] Release v0.41.0 + +### New Features and Improvements + + * Add `serving.http_request` to call external functions. ([#857](https://github.com/databricks/databricks-sdk-py/pull/857)). + * Files API client: recover on download failures ([#844](https://github.com/databricks/databricks-sdk-py/pull/844)) ([#845](https://github.com/databricks/databricks-sdk-py/pull/845)). + + +### Bug Fixes + + * Properly pass query parameters in apps and oauth2 ([#862](https://github.com/databricks/databricks-sdk-py/pull/862)). + + +### Internal Changes + + * Add unit tests for external-browser authentication ([#863](https://github.com/databricks/databricks-sdk-py/pull/863)). + * Decouple oauth2 and serving ([#855](https://github.com/databricks/databricks-sdk-py/pull/855)). + * Migrate workflows that need write access to use hosted runners ([#850](https://github.com/databricks/databricks-sdk-py/pull/850)). + * Stop testing Python 3.7 on Ubuntu ([#858](https://github.com/databricks/databricks-sdk-py/pull/858)). + + +### API Changes: + + * Added [w.access_control](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/access_control.html) workspace-level service. + * Added `http_request()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service. + * Added `no_compute` field for `databricks.sdk.service.apps.CreateAppRequest`. + * Added `has_more` field for `databricks.sdk.service.jobs.BaseJob`. + * Added `has_more` field for `databricks.sdk.service.jobs.BaseRun`. + * Added `page_token` field for `databricks.sdk.service.jobs.GetJobRequest`. + * Added `has_more` and `next_page_token` fields for `databricks.sdk.service.jobs.Job`. + * Added `has_more` field for `databricks.sdk.service.jobs.Run`. + * Added `clean_rooms_notebook_output` field for `databricks.sdk.service.jobs.RunOutput`. + * Added `scopes` field for `databricks.sdk.service.oauth2.UpdateCustomAppIntegration`. + * Added `run_as` field for `databricks.sdk.service.pipelines.CreatePipeline`. + * Added `run_as` field for `databricks.sdk.service.pipelines.EditPipeline`. 
+ * Added `authorization_details` and `endpoint_url` fields for `databricks.sdk.service.serving.DataPlaneInfo`. + * Added `contents` field for `databricks.sdk.service.serving.GetOpenApiResponse`. + * Added `activated`, `activation_url`, `authentication_type`, `cloud`, `comment`, `created_at`, `created_by`, `data_recipient_global_metastore_id`, `ip_access_list`, `metastore_id`, `name`, `owner`, `properties_kvpairs`, `region`, `sharing_code`, `tokens`, `updated_at` and `updated_by` fields for `databricks.sdk.service.sharing.RecipientInfo`. + * Added `expiration_time` field for `databricks.sdk.service.sharing.RecipientInfo`. + * Added . + * Added . + * Added , , and . + * Added . + * Added , , , and . + * Changed `update()` method for [a.account_federation_policy](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_federation_policy.html) account-level service with new required argument order. + * Changed `update()` method for [a.service_principal_federation_policy](https://databricks-sdk-py.readthedocs.io/en/latest/account/service_principal_federation_policy.html) account-level service with new required argument order. + * Changed `update()` method for [w.recipients](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/recipients.html) workspace-level service to return `databricks.sdk.service.sharing.RecipientInfo` dataclass. + * Changed `update()` method for [w.recipients](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/recipients.html) workspace-level service return type to become non-empty. + * Changed `update()` method for [w.recipients](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/recipients.html) workspace-level service to type `update()` method for [w.recipients](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/recipients.html) workspace-level service. + * Changed `get_open_api()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service return type to become non-empty. + * Changed `patch()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service to type `patch()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service. + * Changed `patch()` method for [w.serving_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints.html) workspace-level service to return `databricks.sdk.service.serving.EndpointTags` dataclass. + * Changed `databricks.sdk.service.serving.EndpointTagList` dataclass to. + * Changed `collaborator_alias` field for `databricks.sdk.service.cleanrooms.CleanRoomCollaborator` to be required. + * Changed `collaborator_alias` field for `databricks.sdk.service.cleanrooms.CleanRoomCollaborator` to be required. + * Changed `update_mask` field for `databricks.sdk.service.oauth2.UpdateAccountFederationPolicyRequest` to no longer be required. + * Changed `update_mask` field for `databricks.sdk.service.oauth2.UpdateServicePrincipalFederationPolicyRequest` to no longer be required. + * Changed `days_of_week` field for `databricks.sdk.service.pipelines.RestartWindow` to type `databricks.sdk.service.pipelines.DayOfWeekList` dataclass. + * Changed `behavior` field for `databricks.sdk.service.serving.AiGatewayGuardrailPiiBehavior` to no longer be required. 
+ * Changed `behavior` field for `databricks.sdk.service.serving.AiGatewayGuardrailPiiBehavior` to no longer be required. + * Changed `project_id` and `region` fields for `databricks.sdk.service.serving.GoogleCloudVertexAiConfig` to be required. + * Changed `project_id` and `region` fields for `databricks.sdk.service.serving.GoogleCloudVertexAiConfig` to be required. + * Changed `workload_type` field for `databricks.sdk.service.serving.ServedEntityInput` to type `databricks.sdk.service.serving.ServingModelWorkloadType` dataclass. + * Changed `workload_type` field for `databricks.sdk.service.serving.ServedEntityOutput` to type `databricks.sdk.service.serving.ServingModelWorkloadType` dataclass. + * Changed `workload_type` field for `databricks.sdk.service.serving.ServedModelOutput` to type `databricks.sdk.service.serving.ServingModelWorkloadType` dataclass. + * Changed . + * Changed . + +OpenAPI SHA: 58905570a9928fc9ed31fba14a2edaf9a7c55b08, Date: 2025-01-20 + ## [Release] Release v0.40.0 ### API Changes: diff --git a/databricks/sdk/__init__.py b/databricks/sdk/__init__.py index 80fe188b8..c81eb626c 100755 --- a/databricks/sdk/__init__.py +++ b/databricks/sdk/__init__.py @@ -1,5 +1,6 @@ # Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. +import logging from typing import Optional import databricks.sdk.core as client @@ -42,7 +43,8 @@ PolicyFamiliesAPI) from databricks.sdk.service.dashboards import GenieAPI, LakeviewAPI from databricks.sdk.service.files import DbfsAPI, FilesAPI -from databricks.sdk.service.iam import (AccountAccessControlAPI, +from databricks.sdk.service.iam import (AccessControlAPI, + AccountAccessControlAPI, AccountAccessControlProxyAPI, AccountGroupsAPI, AccountServicePrincipalsAPI, @@ -99,6 +101,8 @@ from databricks.sdk.service.workspace import (GitCredentialsAPI, ReposAPI, SecretsAPI, WorkspaceAPI) +_LOG = logging.getLogger(__name__) + def _make_dbutils(config: client.Config): # We try to directly check if we are in runtime, instead of @@ -118,6 +122,7 @@ def _make_dbutils(config: client.Config): def _make_files_client(apiClient: client.ApiClient, config: client.Config): if config.enable_experimental_files_api_client: + _LOG.info("Experimental Files API client is enabled") return FilesExt(apiClient, config) else: return FilesAPI(apiClient) @@ -184,6 +189,7 @@ def __init__(self, self._dbutils = _make_dbutils(self._config) self._api_client = client.ApiClient(self._config) serving_endpoints = ServingEndpointsExt(self._api_client) + self._access_control = AccessControlAPI(self._api_client) self._account_access_control_proxy = AccountAccessControlProxyAPI(self._api_client) self._alerts = AlertsAPI(self._api_client) self._alerts_legacy = AlertsLegacyAPI(self._api_client) @@ -292,6 +298,11 @@ def api_client(self) -> client.ApiClient: def dbutils(self) -> dbutils.RemoteDbUtils: return self._dbutils + @property + def access_control(self) -> AccessControlAPI: + """Rule based Access Control for Databricks Resources.""" + return self._access_control + @property def account_access_control_proxy(self) -> AccountAccessControlProxyAPI: """These APIs manage access rules on resources in an account.""" diff --git a/databricks/sdk/service/compute.py b/databricks/sdk/service/compute.py index 0afdb6f19..53240b4ad 100755 --- a/databricks/sdk/service/compute.py +++ b/databricks/sdk/service/compute.py @@ -4184,6 +4184,10 @@ class EventDetailsCause(Enum): class EventType(Enum): + ADD_NODES_FAILED = 'ADD_NODES_FAILED' + AUTOMATIC_CLUSTER_UPDATE = 
'AUTOMATIC_CLUSTER_UPDATE' + AUTOSCALING_BACKOFF = 'AUTOSCALING_BACKOFF' + AUTOSCALING_FAILED = 'AUTOSCALING_FAILED' AUTOSCALING_STATS_REPORT = 'AUTOSCALING_STATS_REPORT' CREATING = 'CREATING' DBFS_DOWN = 'DBFS_DOWN' diff --git a/databricks/sdk/service/dashboards.py b/databricks/sdk/service/dashboards.py index 34bd58995..221727230 100755 --- a/databricks/sdk/service/dashboards.py +++ b/databricks/sdk/service/dashboards.py @@ -381,8 +381,9 @@ class GenieMessage: status: Optional[MessageStatus] = None """MesssageStatus. The possible values are: * `FETCHING_METADATA`: Fetching metadata from the data sources. * `FILTERING_CONTEXT`: Running smart context step to determine relevant context. * - `ASKING_AI`: Waiting for the LLM to respond to the users question. * `EXECUTING_QUERY`: - Executing AI provided SQL query. Get the SQL query result by calling + `ASKING_AI`: Waiting for the LLM to respond to the users question. * `PENDING_WAREHOUSE`: + Waiting for warehouse before the SQL query can start executing. * `EXECUTING_QUERY`: Executing + AI provided SQL query. Get the SQL query result by calling [getMessageQueryResult](:method:genie/getMessageQueryResult) API. **Important: The message status will stay in the `EXECUTING_QUERY` until a client calls [getMessageQueryResult](:method:genie/getMessageQueryResult)**. * `FAILED`: Generating a @@ -678,8 +679,9 @@ class MessageErrorType(Enum): class MessageStatus(Enum): """MesssageStatus. The possible values are: * `FETCHING_METADATA`: Fetching metadata from the data sources. * `FILTERING_CONTEXT`: Running smart context step to determine relevant context. * - `ASKING_AI`: Waiting for the LLM to respond to the users question. * `EXECUTING_QUERY`: - Executing AI provided SQL query. Get the SQL query result by calling + `ASKING_AI`: Waiting for the LLM to respond to the users question. * `PENDING_WAREHOUSE`: + Waiting for warehouse before the SQL query can start executing. * `EXECUTING_QUERY`: Executing + AI provided SQL query. Get the SQL query result by calling [getMessageQueryResult](:method:genie/getMessageQueryResult) API. **Important: The message status will stay in the `EXECUTING_QUERY` until a client calls [getMessageQueryResult](:method:genie/getMessageQueryResult)**. 
* `FAILED`: Generating a @@ -696,6 +698,7 @@ class MessageStatus(Enum): FAILED = 'FAILED' FETCHING_METADATA = 'FETCHING_METADATA' FILTERING_CONTEXT = 'FILTERING_CONTEXT' + PENDING_WAREHOUSE = 'PENDING_WAREHOUSE' QUERY_RESULT_EXPIRED = 'QUERY_RESULT_EXPIRED' SUBMITTED = 'SUBMITTED' diff --git a/databricks/sdk/service/iam.py b/databricks/sdk/service/iam.py index 28e5247a6..2f752d06c 100755 --- a/databricks/sdk/service/iam.py +++ b/databricks/sdk/service/iam.py @@ -106,6 +106,58 @@ def from_dict(cls, d: Dict[str, any]) -> AccessControlResponse: user_name=d.get('user_name', None)) +@dataclass +class Actor: + """represents an identity trying to access a resource - user or a service principal group can be a + principal of a permission set assignment but an actor is always a user or a service principal""" + + actor_id: Optional[int] = None + + def as_dict(self) -> dict: + """Serializes the Actor into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.actor_id is not None: body['actor_id'] = self.actor_id + return body + + def as_shallow_dict(self) -> dict: + """Serializes the Actor into a shallow dictionary of its immediate attributes.""" + body = {} + if self.actor_id is not None: body['actor_id'] = self.actor_id + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> Actor: + """Deserializes the Actor from a dictionary.""" + return cls(actor_id=d.get('actor_id', None)) + + +@dataclass +class CheckPolicyResponse: + consistency_token: ConsistencyToken + + is_permitted: Optional[bool] = None + + def as_dict(self) -> dict: + """Serializes the CheckPolicyResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.consistency_token: body['consistency_token'] = self.consistency_token.as_dict() + if self.is_permitted is not None: body['is_permitted'] = self.is_permitted + return body + + def as_shallow_dict(self) -> dict: + """Serializes the CheckPolicyResponse into a shallow dictionary of its immediate attributes.""" + body = {} + if self.consistency_token: body['consistency_token'] = self.consistency_token + if self.is_permitted is not None: body['is_permitted'] = self.is_permitted + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CheckPolicyResponse: + """Deserializes the CheckPolicyResponse from a dictionary.""" + return cls(consistency_token=_from_dict(d, 'consistency_token', ConsistencyToken), + is_permitted=d.get('is_permitted', None)) + + @dataclass class ComplexValue: display: Optional[str] = None @@ -148,6 +200,28 @@ def from_dict(cls, d: Dict[str, any]) -> ComplexValue: value=d.get('value', None)) +@dataclass +class ConsistencyToken: + value: str + + def as_dict(self) -> dict: + """Serializes the ConsistencyToken into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.value is not None: body['value'] = self.value + return body + + def as_shallow_dict(self) -> dict: + """Serializes the ConsistencyToken into a shallow dictionary of its immediate attributes.""" + body = {} + if self.value is not None: body['value'] = self.value + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ConsistencyToken: + """Deserializes the ConsistencyToken from a dictionary.""" + return cls(value=d.get('value', None)) + + @dataclass class DeleteResponse: @@ -1219,6 +1293,49 @@ def from_dict(cls, d: Dict[str, any]) -> PrincipalOutput: user_name=d.get('user_name', None)) +class RequestAuthzIdentity(Enum): + """Defines the identity to be used for authZ of the request 
on the server side. See one pager for + for more information: http://go/acl/service-identity""" + + REQUEST_AUTHZ_IDENTITY_SERVICE_IDENTITY = 'REQUEST_AUTHZ_IDENTITY_SERVICE_IDENTITY' + REQUEST_AUTHZ_IDENTITY_USER_CONTEXT = 'REQUEST_AUTHZ_IDENTITY_USER_CONTEXT' + + +@dataclass +class ResourceInfo: + id: str + """Id of the current resource.""" + + legacy_acl_path: Optional[str] = None + """The legacy acl path of the current resource.""" + + parent_resource_info: Optional[ResourceInfo] = None + """Parent resource info for the current resource. The parent may have another parent.""" + + def as_dict(self) -> dict: + """Serializes the ResourceInfo into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.id is not None: body['id'] = self.id + if self.legacy_acl_path is not None: body['legacy_acl_path'] = self.legacy_acl_path + if self.parent_resource_info: body['parent_resource_info'] = self.parent_resource_info.as_dict() + return body + + def as_shallow_dict(self) -> dict: + """Serializes the ResourceInfo into a shallow dictionary of its immediate attributes.""" + body = {} + if self.id is not None: body['id'] = self.id + if self.legacy_acl_path is not None: body['legacy_acl_path'] = self.legacy_acl_path + if self.parent_resource_info: body['parent_resource_info'] = self.parent_resource_info + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ResourceInfo: + """Deserializes the ResourceInfo from a dictionary.""" + return cls(id=d.get('id', None), + legacy_acl_path=d.get('legacy_acl_path', None), + parent_resource_info=_from_dict(d, 'parent_resource_info', ResourceInfo)) + + @dataclass class ResourceMeta: resource_type: Optional[str] = None @@ -1622,6 +1739,47 @@ def from_dict(cls, d: Dict[str, any]) -> WorkspacePermissions: return cls(permissions=_repeated_dict(d, 'permissions', PermissionOutput)) +class AccessControlAPI: + """Rule based Access Control for Databricks Resources.""" + + def __init__(self, api_client): + self._api = api_client + + def check_policy(self, + actor: Actor, + permission: str, + resource: str, + consistency_token: ConsistencyToken, + authz_identity: RequestAuthzIdentity, + *, + resource_info: Optional[ResourceInfo] = None) -> CheckPolicyResponse: + """Check access policy to a resource. + + :param actor: :class:`Actor` + :param permission: str + :param resource: str + Ex: (servicePrincipal/use, accounts//servicePrincipals/) Ex: + (servicePrincipal.ruleSet/update, accounts//servicePrincipals//ruleSets/default) + :param consistency_token: :class:`ConsistencyToken` + :param authz_identity: :class:`RequestAuthzIdentity` + :param resource_info: :class:`ResourceInfo` (optional) + + :returns: :class:`CheckPolicyResponse` + """ + + query = {} + if actor is not None: query['actor'] = actor.as_dict() + if authz_identity is not None: query['authz_identity'] = authz_identity.value + if consistency_token is not None: query['consistency_token'] = consistency_token.as_dict() + if permission is not None: query['permission'] = permission + if resource is not None: query['resource'] = resource + if resource_info is not None: query['resource_info'] = resource_info.as_dict() + headers = {'Accept': 'application/json', } + + res = self._api.do('GET', '/api/2.0/access-control/check-policy-v2', query=query, headers=headers) + return CheckPolicyResponse.from_dict(res) + + class AccountAccessControlAPI: """These APIs manage access rules on resources in an account. Currently, only grant rules are supported. 
A grant rule specifies a role assigned to a set of principals. A list of rules attached to a resource is diff --git a/databricks/sdk/service/sharing.py b/databricks/sdk/service/sharing.py index 000c85e2c..1990c7c54 100755 --- a/databricks/sdk/service/sharing.py +++ b/databricks/sdk/service/sharing.py @@ -35,7 +35,8 @@ class CreateProvider: """Description about the provider.""" recipient_profile_str: Optional[str] = None - """This field is required when the __authentication_type__ is **TOKEN** or not provided.""" + """This field is required when the __authentication_type__ is **TOKEN**, + **OAUTH_CLIENT_CREDENTIALS** or not provided.""" def as_dict(self) -> dict: """Serializes the CreateProvider into a dictionary suitable for use as a JSON request body.""" @@ -76,7 +77,7 @@ class CreateRecipient: """Description about the recipient.""" data_recipient_global_metastore_id: Optional[str] = None - """The global Unity Catalog metastore id provided by the data recipient. This field is required + """The global Unity Catalog metastore id provided by the data recipient. This field is only present when the __authentication_type__ is **DATABRICKS**. The identifier is of format __cloud__:__region__:__metastore-uuid__.""" @@ -90,10 +91,12 @@ class CreateRecipient: """Username of the recipient owner.""" properties_kvpairs: Optional[SecurablePropertiesKvPairs] = None - """Recipient properties as map of string key-value pairs.""" + """Recipient properties as map of string key-value pairs. When provided in update request, the + specified properties will override the existing properties. To add and remove properties, one + would need to perform a read-modify-write.""" sharing_code: Optional[str] = None - """The one-time sharing code provided by the data recipient. This field is required when the + """The one-time sharing code provided by the data recipient. This field is only present when the __authentication_type__ is **DATABRICKS**.""" def as_dict(self) -> dict: @@ -581,7 +584,7 @@ class ProviderInfo: data_provider_global_metastore_id: Optional[str] = None """The global UC metastore id of the data provider. This field is only present when the __authentication_type__ is **DATABRICKS**. The identifier is of format - ::.""" + __cloud__:__region__:__metastore-uuid__.""" metastore_id: Optional[str] = None """UUID of the provider's UC metastore. This field is only present when the __authentication_type__ @@ -594,10 +597,12 @@ class ProviderInfo: """Username of Provider owner.""" recipient_profile: Optional[RecipientProfile] = None - """The recipient profile. This field is only present when the authentication_type is `TOKEN`.""" + """The recipient profile. This field is only present when the authentication_type is `TOKEN` or + `OAUTH_CLIENT_CREDENTIALS`.""" recipient_profile_str: Optional[str] = None - """This field is only present when the authentication_type is `TOKEN` or not provided.""" + """This field is required when the __authentication_type__ is **TOKEN**, + **OAUTH_CLIENT_CREDENTIALS** or not provided.""" region: Optional[str] = None """Cloud region of the provider's UC metastore. 
This field is only present when the @@ -607,7 +612,7 @@ class ProviderInfo: """Time at which this Provider was created, in epoch milliseconds.""" updated_by: Optional[str] = None - """Username of user who last modified Share.""" + """Username of user who last modified Provider.""" def as_dict(self) -> dict: """Serializes the ProviderInfo into a dictionary suitable for use as a JSON request body.""" @@ -704,8 +709,8 @@ class RecipientInfo: """The delta sharing authentication type.""" cloud: Optional[str] = None - """Cloud vendor of the recipient's Unity Catalog Metstore. This field is only present when the - __authentication_type__ is **DATABRICKS**`.""" + """Cloud vendor of the recipient's Unity Catalog Metastore. This field is only present when the + __authentication_type__ is **DATABRICKS**.""" comment: Optional[str] = None """Description about the recipient.""" @@ -721,12 +726,15 @@ class RecipientInfo: when the __authentication_type__ is **DATABRICKS**. The identifier is of format __cloud__:__region__:__metastore-uuid__.""" + expiration_time: Optional[int] = None + """Expiration timestamp of the token, in epoch milliseconds.""" + ip_access_list: Optional[IpAccessList] = None """IP Access List""" metastore_id: Optional[str] = None - """Unique identifier of recipient's Unity Catalog metastore. This field is only present when the - __authentication_type__ is **DATABRICKS**""" + """Unique identifier of recipient's Unity Catalog Metastore. This field is only present when the + __authentication_type__ is **DATABRICKS**.""" name: Optional[str] = None """Name of Recipient.""" @@ -735,10 +743,12 @@ class RecipientInfo: """Username of the recipient owner.""" properties_kvpairs: Optional[SecurablePropertiesKvPairs] = None - """Recipient properties as map of string key-value pairs.""" + """Recipient properties as map of string key-value pairs. When provided in update request, the + specified properties will override the existing properties. To add and remove properties, one + would need to perform a read-modify-write.""" region: Optional[str] = None - """Cloud region of the recipient's Unity Catalog Metstore. This field is only present when the + """Cloud region of the recipient's Unity Catalog Metastore. 
This field is only present when the __authentication_type__ is **DATABRICKS**.""" sharing_code: Optional[str] = None @@ -766,6 +776,7 @@ def as_dict(self) -> dict: if self.created_by is not None: body['created_by'] = self.created_by if self.data_recipient_global_metastore_id is not None: body['data_recipient_global_metastore_id'] = self.data_recipient_global_metastore_id + if self.expiration_time is not None: body['expiration_time'] = self.expiration_time if self.ip_access_list: body['ip_access_list'] = self.ip_access_list.as_dict() if self.metastore_id is not None: body['metastore_id'] = self.metastore_id if self.name is not None: body['name'] = self.name @@ -790,6 +801,7 @@ def as_shallow_dict(self) -> dict: if self.created_by is not None: body['created_by'] = self.created_by if self.data_recipient_global_metastore_id is not None: body['data_recipient_global_metastore_id'] = self.data_recipient_global_metastore_id + if self.expiration_time is not None: body['expiration_time'] = self.expiration_time if self.ip_access_list: body['ip_access_list'] = self.ip_access_list if self.metastore_id is not None: body['metastore_id'] = self.metastore_id if self.name is not None: body['name'] = self.name @@ -813,6 +825,7 @@ def from_dict(cls, d: Dict[str, any]) -> RecipientInfo: created_at=d.get('created_at', None), created_by=d.get('created_by', None), data_recipient_global_metastore_id=d.get('data_recipient_global_metastore_id', None), + expiration_time=d.get('expiration_time', None), ip_access_list=_from_dict(d, 'ip_access_list', IpAccessList), metastore_id=d.get('metastore_id', None), name=d.get('name', None), @@ -869,7 +882,7 @@ class RecipientTokenInfo: retrieved.""" created_at: Optional[int] = None - """Time at which this recipient Token was created, in epoch milliseconds.""" + """Time at which this recipient token was created, in epoch milliseconds.""" created_by: Optional[str] = None """Username of recipient token creator.""" @@ -881,10 +894,10 @@ class RecipientTokenInfo: """Unique ID of the recipient token.""" updated_at: Optional[int] = None - """Time at which this recipient Token was updated, in epoch milliseconds.""" + """Time at which this recipient token was updated, in epoch milliseconds.""" updated_by: Optional[str] = None - """Username of recipient Token updater.""" + """Username of recipient token updater.""" def as_dict(self) -> dict: """Serializes the RecipientTokenInfo into a dictionary suitable for use as a JSON request body.""" @@ -973,7 +986,7 @@ class RotateRecipientToken: expire the existing token immediately, negative number will return an error.""" name: Optional[str] = None - """The name of the recipient.""" + """The name of the Recipient.""" def as_dict(self) -> dict: """Serializes the RotateRecipientToken into a dictionary suitable for use as a JSON request body.""" @@ -1023,9 +1036,6 @@ def from_dict(cls, d: Dict[str, any]) -> SecurablePropertiesKvPairs: return cls(properties=d.get('properties', None)) -SecurablePropertiesMap = Dict[str, str] - - @dataclass class ShareInfo: comment: Optional[str] = None @@ -1346,7 +1356,8 @@ class UpdateProvider: """Username of Provider owner.""" recipient_profile_str: Optional[str] = None - """This field is required when the __authentication_type__ is **TOKEN** or not provided.""" + """This field is required when the __authentication_type__ is **TOKEN**, + **OAUTH_CLIENT_CREDENTIALS** or not provided.""" def as_dict(self) -> dict: """Serializes the UpdateProvider into a dictionary suitable for use as a JSON request body.""" @@ -1393,7 
+1404,7 @@ class UpdateRecipient: """Name of the recipient.""" new_name: Optional[str] = None - """New name for the recipient.""" + """New name for the recipient. .""" owner: Optional[str] = None """Username of the recipient owner.""" @@ -1439,25 +1450,6 @@ def from_dict(cls, d: Dict[str, any]) -> UpdateRecipient: properties_kvpairs=_from_dict(d, 'properties_kvpairs', SecurablePropertiesKvPairs)) -@dataclass -class UpdateResponse: - - def as_dict(self) -> dict: - """Serializes the UpdateResponse into a dictionary suitable for use as a JSON request body.""" - body = {} - return body - - def as_shallow_dict(self) -> dict: - """Serializes the UpdateResponse into a shallow dictionary of its immediate attributes.""" - body = {} - return body - - @classmethod - def from_dict(cls, d: Dict[str, any]) -> UpdateResponse: - """Deserializes the UpdateResponse from a dictionary.""" - return cls() - - @dataclass class UpdateShare: comment: Optional[str] = None @@ -1583,7 +1575,8 @@ def create(self, :param comment: str (optional) Description about the provider. :param recipient_profile_str: str (optional) - This field is required when the __authentication_type__ is **TOKEN** or not provided. + This field is required when the __authentication_type__ is **TOKEN**, **OAUTH_CLIENT_CREDENTIALS** + or not provided. :returns: :class:`ProviderInfo` """ @@ -1735,7 +1728,8 @@ def update(self, :param owner: str (optional) Username of Provider owner. :param recipient_profile_str: str (optional) - This field is required when the __authentication_type__ is **TOKEN** or not provided. + This field is required when the __authentication_type__ is **TOKEN**, **OAUTH_CLIENT_CREDENTIALS** + or not provided. :returns: :class:`ProviderInfo` """ @@ -1830,7 +1824,7 @@ def create(self, """Create a share recipient. Creates a new recipient with the delta sharing authentication type in the metastore. The caller must - be a metastore admin or has the **CREATE_RECIPIENT** privilege on the metastore. + be a metastore admin or have the **CREATE_RECIPIENT** privilege on the metastore. :param name: str Name of Recipient. @@ -1839,8 +1833,8 @@ def create(self, :param comment: str (optional) Description about the recipient. :param data_recipient_global_metastore_id: str (optional) - The global Unity Catalog metastore id provided by the data recipient. This field is required when - the __authentication_type__ is **DATABRICKS**. The identifier is of format + The global Unity Catalog metastore id provided by the data recipient. This field is only present + when the __authentication_type__ is **DATABRICKS**. The identifier is of format __cloud__:__region__:__metastore-uuid__. :param expiration_time: int (optional) Expiration timestamp of the token, in epoch milliseconds. @@ -1849,9 +1843,11 @@ def create(self, :param owner: str (optional) Username of the recipient owner. :param properties_kvpairs: :class:`SecurablePropertiesKvPairs` (optional) - Recipient properties as map of string key-value pairs. + Recipient properties as map of string key-value pairs. When provided in update request, the + specified properties will override the existing properties. To add and remove properties, one would + need to perform a read-modify-write. :param sharing_code: str (optional) - The one-time sharing code provided by the data recipient. This field is required when the + The one-time sharing code provided by the data recipient. This field is only present when the __authentication_type__ is **DATABRICKS**. 
:returns: :class:`RecipientInfo` @@ -1957,7 +1953,7 @@ def rotate_token(self, name: str, existing_token_expire_in_seconds: int) -> Reci The caller must be the owner of the recipient. :param name: str - The name of the recipient. + The name of the Recipient. :param existing_token_expire_in_seconds: int The expiration time of the bearer token in ISO 8601 format. This will set the expiration_time of existing token only to a smaller timestamp, it cannot extend the expiration_time. Use 0 to expire @@ -2021,7 +2017,7 @@ def update(self, ip_access_list: Optional[IpAccessList] = None, new_name: Optional[str] = None, owner: Optional[str] = None, - properties_kvpairs: Optional[SecurablePropertiesKvPairs] = None): + properties_kvpairs: Optional[SecurablePropertiesKvPairs] = None) -> RecipientInfo: """Update a share recipient. Updates an existing recipient in the metastore. The caller must be a metastore admin or the owner of @@ -2037,7 +2033,7 @@ def update(self, :param ip_access_list: :class:`IpAccessList` (optional) IP Access List :param new_name: str (optional) - New name for the recipient. + New name for the recipient. . :param owner: str (optional) Username of the recipient owner. :param properties_kvpairs: :class:`SecurablePropertiesKvPairs` (optional) @@ -2045,7 +2041,7 @@ def update(self, specified properties will override the existing properties. To add and remove properties, one would need to perform a read-modify-write. - + :returns: :class:`RecipientInfo` """ body = {} if comment is not None: body['comment'] = comment @@ -2056,7 +2052,8 @@ def update(self, if properties_kvpairs is not None: body['properties_kvpairs'] = properties_kvpairs.as_dict() headers = {'Accept': 'application/json', 'Content-Type': 'application/json', } - self._api.do('PATCH', f'/api/2.1/unity-catalog/recipients/{name}', body=body, headers=headers) + res = self._api.do('PATCH', f'/api/2.1/unity-catalog/recipients/{name}', body=body, headers=headers) + return RecipientInfo.from_dict(res) class SharesAPI: diff --git a/databricks/sdk/version.py b/databricks/sdk/version.py index eb9b6f12e..9f86a39e2 100644 --- a/databricks/sdk/version.py +++ b/databricks/sdk/version.py @@ -1 +1 @@ -__version__ = '0.40.0' +__version__ = '0.41.0' diff --git a/docs/dbdataclasses/compute.rst b/docs/dbdataclasses/compute.rst index 9c628c476..6a9a06671 100644 --- a/docs/dbdataclasses/compute.rst +++ b/docs/dbdataclasses/compute.rst @@ -495,6 +495,18 @@ These dataclasses are used in the SDK to represent API requests and responses fo .. py:class:: EventType + .. py:attribute:: ADD_NODES_FAILED + :value: "ADD_NODES_FAILED" + + .. py:attribute:: AUTOMATIC_CLUSTER_UPDATE + :value: "AUTOMATIC_CLUSTER_UPDATE" + + .. py:attribute:: AUTOSCALING_BACKOFF + :value: "AUTOSCALING_BACKOFF" + + .. py:attribute:: AUTOSCALING_FAILED + :value: "AUTOSCALING_FAILED" + .. py:attribute:: AUTOSCALING_STATS_REPORT :value: "AUTOSCALING_STATS_REPORT" diff --git a/docs/dbdataclasses/dashboards.rst b/docs/dbdataclasses/dashboards.rst index 22a3ea95d..6d0e847ba 100644 --- a/docs/dbdataclasses/dashboards.rst +++ b/docs/dbdataclasses/dashboards.rst @@ -254,7 +254,7 @@ These dataclasses are used in the SDK to represent API requests and responses fo .. py:class:: MessageStatus - MesssageStatus. The possible values are: * `FETCHING_METADATA`: Fetching metadata from the data sources. * `FILTERING_CONTEXT`: Running smart context step to determine relevant context. * `ASKING_AI`: Waiting for the LLM to respond to the users question. 
* `EXECUTING_QUERY`: Executing AI provided SQL query. Get the SQL query result by calling [getMessageQueryResult](:method:genie/getMessageQueryResult) API. **Important: The message status will stay in the `EXECUTING_QUERY` until a client calls [getMessageQueryResult](:method:genie/getMessageQueryResult)**. * `FAILED`: Generating a response or the executing the query failed. Please see `error` field. * `COMPLETED`: Message processing is completed. Results are in the `attachments` field. Get the SQL query result by calling [getMessageQueryResult](:method:genie/getMessageQueryResult) API. * `SUBMITTED`: Message has been submitted. * `QUERY_RESULT_EXPIRED`: SQL result is not available anymore. The user needs to execute the query again. * `CANCELLED`: Message has been cancelled. + MesssageStatus. The possible values are: * `FETCHING_METADATA`: Fetching metadata from the data sources. * `FILTERING_CONTEXT`: Running smart context step to determine relevant context. * `ASKING_AI`: Waiting for the LLM to respond to the users question. * `PENDING_WAREHOUSE`: Waiting for warehouse before the SQL query can start executing. * `EXECUTING_QUERY`: Executing AI provided SQL query. Get the SQL query result by calling [getMessageQueryResult](:method:genie/getMessageQueryResult) API. **Important: The message status will stay in the `EXECUTING_QUERY` until a client calls [getMessageQueryResult](:method:genie/getMessageQueryResult)**. * `FAILED`: Generating a response or the executing the query failed. Please see `error` field. * `COMPLETED`: Message processing is completed. Results are in the `attachments` field. Get the SQL query result by calling [getMessageQueryResult](:method:genie/getMessageQueryResult) API. * `SUBMITTED`: Message has been submitted. * `QUERY_RESULT_EXPIRED`: SQL result is not available anymore. The user needs to execute the query again. * `CANCELLED`: Message has been cancelled. .. py:attribute:: ASKING_AI :value: "ASKING_AI" @@ -277,6 +277,9 @@ These dataclasses are used in the SDK to represent API requests and responses fo .. py:attribute:: FILTERING_CONTEXT :value: "FILTERING_CONTEXT" + .. py:attribute:: PENDING_WAREHOUSE + :value: "PENDING_WAREHOUSE" + .. py:attribute:: QUERY_RESULT_EXPIRED :value: "QUERY_RESULT_EXPIRED" diff --git a/docs/dbdataclasses/iam.rst b/docs/dbdataclasses/iam.rst index 643da3d47..6df58ae4e 100644 --- a/docs/dbdataclasses/iam.rst +++ b/docs/dbdataclasses/iam.rst @@ -12,10 +12,22 @@ These dataclasses are used in the SDK to represent API requests and responses fo :members: :undoc-members: +.. autoclass:: Actor + :members: + :undoc-members: + +.. autoclass:: CheckPolicyResponse + :members: + :undoc-members: + .. autoclass:: ComplexValue :members: :undoc-members: +.. autoclass:: ConsistencyToken + :members: + :undoc-members: + .. autoclass:: DeleteResponse :members: :undoc-members: @@ -242,6 +254,20 @@ These dataclasses are used in the SDK to represent API requests and responses fo :members: :undoc-members: +.. py:class:: RequestAuthzIdentity + + Defines the identity to be used for authZ of the request on the server side. See one pager for for more information: http://go/acl/service-identity + + .. py:attribute:: REQUEST_AUTHZ_IDENTITY_SERVICE_IDENTITY + :value: "REQUEST_AUTHZ_IDENTITY_SERVICE_IDENTITY" + + .. py:attribute:: REQUEST_AUTHZ_IDENTITY_USER_CONTEXT + :value: "REQUEST_AUTHZ_IDENTITY_USER_CONTEXT" + +.. autoclass:: ResourceInfo + :members: + :undoc-members: + .. 
autoclass:: ResourceMeta :members: :undoc-members: diff --git a/docs/dbdataclasses/sharing.rst b/docs/dbdataclasses/sharing.rst index 2db59fcbe..ed4a4c006 100644 --- a/docs/dbdataclasses/sharing.rst +++ b/docs/dbdataclasses/sharing.rst @@ -343,10 +343,6 @@ These dataclasses are used in the SDK to represent API requests and responses fo :members: :undoc-members: -.. autoclass:: UpdateResponse - :members: - :undoc-members: - .. autoclass:: UpdateShare :members: :undoc-members: diff --git a/docs/workspace/iam/access_control.rst b/docs/workspace/iam/access_control.rst new file mode 100644 index 000000000..a5f1feeda --- /dev/null +++ b/docs/workspace/iam/access_control.rst @@ -0,0 +1,23 @@ +``w.access_control``: RbacService +================================= +.. currentmodule:: databricks.sdk.service.iam + +.. py:class:: AccessControlAPI + + Rule based Access Control for Databricks Resources. + + .. py:method:: check_policy(actor: Actor, permission: str, resource: str, consistency_token: ConsistencyToken, authz_identity: RequestAuthzIdentity [, resource_info: Optional[ResourceInfo]]) -> CheckPolicyResponse + + Check access policy to a resource. + + :param actor: :class:`Actor` + :param permission: str + :param resource: str + Ex: (servicePrincipal/use, accounts//servicePrincipals/) Ex: + (servicePrincipal.ruleSet/update, accounts//servicePrincipals//ruleSets/default) + :param consistency_token: :class:`ConsistencyToken` + :param authz_identity: :class:`RequestAuthzIdentity` + :param resource_info: :class:`ResourceInfo` (optional) + + :returns: :class:`CheckPolicyResponse` + \ No newline at end of file diff --git a/docs/workspace/iam/index.rst b/docs/workspace/iam/index.rst index 2a98cc9ae..00a7f1fe7 100644 --- a/docs/workspace/iam/index.rst +++ b/docs/workspace/iam/index.rst @@ -7,6 +7,7 @@ Manage users, service principals, groups and their permissions in Accounts and W .. toctree:: :maxdepth: 1 + access_control account_access_control_proxy current_user groups diff --git a/docs/workspace/serving/serving_endpoints.rst b/docs/workspace/serving/serving_endpoints.rst index c0cd774a3..d9c806489 100644 --- a/docs/workspace/serving/serving_endpoints.rst +++ b/docs/workspace/serving/serving_endpoints.rst @@ -135,24 +135,23 @@ :returns: :class:`ServingEndpointPermissions` - .. py:method:: http_request(connection_name: str, method: ExternalFunctionRequestHttpMethod, path: str [, headers: Optional[str], json: Optional[str], params: Optional[str]]) -> ExternalFunctionResponse + .. py:method:: http_request(conn: str, method: ExternalFunctionRequestHttpMethod, path: str [, headers: typing.Dict[str, str], json: typing.Dict[str, str], params: typing.Dict[str, str]]) -> ExternalFunctionResponse Make external services call using the credentials stored in UC Connection. - - :param connection_name: str + **NOTE:** Experimental: This API may change or be removed in a future release without warning. + :param conn: str The connection name to use. This is required to identify the external connection. :param method: :class:`ExternalFunctionRequestHttpMethod` - The HTTP method to use (e.g., 'GET', 'POST'). + The HTTP method to use (e.g., 'GET', 'POST'). This is required. :param path: str The relative path for the API endpoint. This is required. - :param headers: str (optional) + :param headers: Dict[str,str] (optional) Additional headers for the request. If not provided, only auth headers from connections would be passed. - :param json: str (optional) - The JSON payload to send in the request body. 
- :param params: str (optional) + :param json: Dict[str,str] (optional) + JSON payload for the request. + :param params: Dict[str,str] (optional) Query parameters for the request. - :returns: :class:`ExternalFunctionResponse` diff --git a/docs/workspace/sharing/providers.rst b/docs/workspace/sharing/providers.rst index 7cf398ac0..7d27acc3d 100644 --- a/docs/workspace/sharing/providers.rst +++ b/docs/workspace/sharing/providers.rst @@ -44,7 +44,8 @@ :param comment: str (optional) Description about the provider. :param recipient_profile_str: str (optional) - This field is required when the __authentication_type__ is **TOKEN** or not provided. + This field is required when the __authentication_type__ is **TOKEN**, **OAUTH_CLIENT_CREDENTIALS** + or not provided. :returns: :class:`ProviderInfo` @@ -228,7 +229,8 @@ :param owner: str (optional) Username of Provider owner. :param recipient_profile_str: str (optional) - This field is required when the __authentication_type__ is **TOKEN** or not provided. + This field is required when the __authentication_type__ is **TOKEN**, **OAUTH_CLIENT_CREDENTIALS** + or not provided. :returns: :class:`ProviderInfo` \ No newline at end of file diff --git a/docs/workspace/sharing/recipients.rst b/docs/workspace/sharing/recipients.rst index 44f2042bb..76e1da171 100644 --- a/docs/workspace/sharing/recipients.rst +++ b/docs/workspace/sharing/recipients.rst @@ -39,7 +39,7 @@ Create a share recipient. Creates a new recipient with the delta sharing authentication type in the metastore. The caller must - be a metastore admin or has the **CREATE_RECIPIENT** privilege on the metastore. + be a metastore admin or have the **CREATE_RECIPIENT** privilege on the metastore. :param name: str Name of Recipient. @@ -48,8 +48,8 @@ :param comment: str (optional) Description about the recipient. :param data_recipient_global_metastore_id: str (optional) - The global Unity Catalog metastore id provided by the data recipient. This field is required when - the __authentication_type__ is **DATABRICKS**. The identifier is of format + The global Unity Catalog metastore id provided by the data recipient. This field is only present + when the __authentication_type__ is **DATABRICKS**. The identifier is of format __cloud__:__region__:__metastore-uuid__. :param expiration_time: int (optional) Expiration timestamp of the token, in epoch milliseconds. @@ -58,9 +58,11 @@ :param owner: str (optional) Username of the recipient owner. :param properties_kvpairs: :class:`SecurablePropertiesKvPairs` (optional) - Recipient properties as map of string key-value pairs. + Recipient properties as map of string key-value pairs. When provided in update request, the + specified properties will override the existing properties. To add and remove properties, one would + need to perform a read-modify-write. :param sharing_code: str (optional) - The one-time sharing code provided by the data recipient. This field is required when the + The one-time sharing code provided by the data recipient. This field is only present when the __authentication_type__ is **DATABRICKS**. :returns: :class:`RecipientInfo` @@ -174,7 +176,7 @@ The caller must be the owner of the recipient. :param name: str - The name of the recipient. + The name of the Recipient. :param existing_token_expire_in_seconds: int The expiration time of the bearer token in ISO 8601 format. This will set the expiration_time of existing token only to a smaller timestamp, it cannot extend the expiration_time. 
Use 0 to expire @@ -224,7 +226,7 @@ :returns: :class:`GetRecipientSharePermissionsResponse` - .. py:method:: update(name: str [, comment: Optional[str], expiration_time: Optional[int], ip_access_list: Optional[IpAccessList], new_name: Optional[str], owner: Optional[str], properties_kvpairs: Optional[SecurablePropertiesKvPairs]]) + .. py:method:: update(name: str [, comment: Optional[str], expiration_time: Optional[int], ip_access_list: Optional[IpAccessList], new_name: Optional[str], owner: Optional[str], properties_kvpairs: Optional[SecurablePropertiesKvPairs]]) -> RecipientInfo Usage: @@ -259,7 +261,7 @@ :param ip_access_list: :class:`IpAccessList` (optional) IP Access List :param new_name: str (optional) - New name for the recipient. + New name for the recipient. . :param owner: str (optional) Username of the recipient owner. :param properties_kvpairs: :class:`SecurablePropertiesKvPairs` (optional) @@ -267,5 +269,5 @@ specified properties will override the existing properties. To add and remove properties, one would need to perform a read-modify-write. - + :returns: :class:`RecipientInfo` \ No newline at end of file
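
As a companion to the `http_request` sketch above, a hedged sketch of the new `w.access_control.check_policy()` API added in this release. The call shape follows the signature added in `databricks/sdk/service/iam.py`; the actor ID, resource string, and consistency token below are placeholder values, not real identifiers.

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import iam

w = WorkspaceClient()

# Placeholder actor, resource, and token values; substitute real ones.
resp = w.access_control.check_policy(
    actor=iam.Actor(actor_id=1234),
    permission="servicePrincipal/use",
    resource="accounts/<account-id>/servicePrincipals/<application-id>",
    consistency_token=iam.ConsistencyToken(value="<opaque-token>"),
    authz_identity=iam.RequestAuthzIdentity.REQUEST_AUTHZ_IDENTITY_USER_CONTEXT,
)
print(resp.is_permitted)
```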