Comparing changes
base repository: databricks/databricks-sdk-py
base: v0.12.0
head repository: databricks/databricks-sdk-py
compare: v0.13.0
- 10 commits
- 78 files changed
- 6 contributors
Commits on Oct 26, 2023
SHA: 2401940
Commits on Nov 1, 2023
[JOBS-11439] Add taskValues support in remoteDbUtils (#406)
## Changes
Support `dbutils.jobs.taskValues` methods in `RemoteDbUtils` and add an additional `checkTaskValue` to allow local testing.

## Tests
Added three test cases in `test_dbutils.py`. Integration tests are not needed, since the changes apply only to remote mode.
- [x] `make test` run locally
- [x] `make fmt` applied
- [x] relevant integration tests applied
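For readers unfamiliar with task values, their semantics can be sketched with a small in-memory stand-in. This is illustrative only: the `TaskValues` class below is hypothetical and is not the SDK's `RemoteDbUtils` implementation.

```python
# Toy model of dbutils.jobs.taskValues semantics: set() records a value
# under the current task, get() reads a value set by another task, and
# debugValue provides a fallback when running outside a real job run.
# Hypothetical sketch; the SDK's actual behavior may differ in detail.
class TaskValues:
    def __init__(self, current_task: str = "__current__"):
        self._current = current_task
        self._values = {}  # (task_key, key) -> value

    def set(self, key: str, value) -> None:
        self._values[(self._current, key)] = value

    def get(self, taskKey: str, key: str, default=None, debugValue=None):
        if (taskKey, key) in self._values:
            return self._values[(taskKey, key)]
        # Outside a job run, dbutils falls back to debugValue if provided
        return debugValue if debugValue is not None else default
```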
SHA: 9955c08
Commits on Nov 3, 2023
Add more detailed error message on default credentials not found error (#419)
## Changes
- Added a more detailed error message for the default-credentials-not-found error; the original message was difficult to follow.

## Tests
Manually tried default auth locally without a `.databrickscfg` file and verified that the improved message is printed.
- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
SHA: 5bd7f6b
Commits on Nov 6, 2023
Make test_auth no longer fail if you have a default profile setup (#426)
## Changes
- Previously, several test cases in `test_auth.py` would fail if you had a default profile locally, because the SDK always looks for the `~/.databrickscfg` file, which exists on your machine but is not expected by the tests.
- The affected test cases now use a fake file system instead of the local disk to bypass this issue.

## Tests
- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
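The isolation technique can be approximated with only the standard library. `load_default_profile` below is a hypothetical stand-in for the SDK's config loading, not its actual code, and the real fix in #426 may use a different mechanism:

```python
import configparser
import os
import tempfile
import unittest.mock

def load_default_profile(config_path: str = None) -> dict:
    # Stand-in for config loading that, like the SDK, defaults to
    # ~/.databrickscfg -- the source of the unwanted test coupling.
    if config_path is None:
        config_path = os.path.expanduser("~/.databrickscfg")
    parser = configparser.ConfigParser()
    parser.read(config_path)  # silently yields no sections if missing
    return dict(parser.defaults())

def test_no_default_profile():
    # Point HOME at an empty temporary directory so the developer's real
    # ~/.databrickscfg is never consulted (POSIX; Windows uses USERPROFILE).
    with tempfile.TemporaryDirectory() as fake_home:
        with unittest.mock.patch.dict(os.environ, {"HOME": fake_home}):
            assert load_default_profile() == {}
```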
SHA: ee54c3c
Commits on Nov 8, 2023
Request management token via Azure CLI only for Service Principals and not human users (#408)
## Changes
This PR removes the call to `az account get-access-token` for the management endpoint when the machine is authenticated as a human user; the management token is only requested for service principals.

## Tests
- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
SHA: a862adc
Commits on Nov 13, 2023
Introduce more specific exceptions, like `NotFound`, `AlreadyExists`, `BadRequest`, `PermissionDenied`, `InternalError`, and others (#376)
Improve the ergonomics of the SDK: instead of `except DatabricksError as err: if err.error_code != 'NOT_FOUND': raise err else: do_stuff()` we could do `except NotFound: do_stuff()`, as in [this example](https://github.com/databrickslabs/ucx/blob/main/src/databricks/labs/ucx/workspace_access/generic.py#L71-L84). Additionally, stack traces become easier to read, as they contain the specific exception class name. Examples of unclear stack traces: databrickslabs/ucx#359, databrickslabs/ucx#353, databrickslabs/ucx#347.

# First principles
- ~~do not override `builtins.NotImplemented` for the `NOT_IMPLEMENTED` error code~~
- assume that the platform's error_code/HTTP status code mapping is not perfect and is in a state of transition
- represent a reasonable subset of error codes as specific exceptions
- it is still possible to access `error_code` from specific exceptions like `NotFound` or `AlreadyExists`

# Proposal
- have hierarchical exceptions, also inheriting from Python's built-in exceptions
- more specific error codes override more generic HTTP status codes
- more generic HTTP status codes are matched after more specific error codes; there is a default exception class per HTTP status code, and we rely on the Databricks platform exception mapper to do the right thing
- have backward-compatible error creation for cases like using older versions of the SDK against newer releases of the platform

### Naming conflict resolution
We have four sources of naming; this is the proposed order of conflict resolution:
1. Databricks `error_code`, surfaced in our API documentation and known to Databricks users
2. HTTP status codes, known by some developers
3. Python built-in exceptions, known by some developers
4. gRPC error codes (https://github.com/googleapis/googleapis/blob/master/google/rpc/code.proto#L38-L185), known by some developers

Co-authored-by: Miles Yucht <[email protected]>
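The proposal above (specific error codes matched first, then a per-status default) can be sketched as follows. The class names mirror those mentioned in the PR, but the mapping tables and constructor are illustrative assumptions, not the SDK's actual definitions:

```python
# Hedged sketch of the error-mapping idea: specific Databricks error
# codes take priority; otherwise fall back to a default exception class
# per HTTP status code, keeping error_code accessible on the instance.
class DatabricksError(Exception):
    def __init__(self, message, *, error_code=None, status_code=None):
        super().__init__(message)
        self.error_code = error_code
        self.status_code = status_code

class NotFound(DatabricksError, LookupError): ...
class AlreadyExists(DatabricksError): ...
class BadRequest(DatabricksError, ValueError): ...
class PermissionDenied(DatabricksError): ...
class InternalError(DatabricksError): ...

# 1) specific error codes win ...
_BY_ERROR_CODE = {
    "RESOURCE_DOES_NOT_EXIST": NotFound,
    "ALREADY_EXISTS": AlreadyExists,
    "PERMISSION_DENIED": PermissionDenied,
}
# 2) ... then a default class per HTTP status code
_BY_STATUS_CODE = {400: BadRequest, 403: PermissionDenied,
                   404: NotFound, 500: InternalError}

def error_from_response(status_code, error_code, message) -> DatabricksError:
    cls = _BY_ERROR_CODE.get(error_code) \
        or _BY_STATUS_CODE.get(status_code, DatabricksError)
    return cls(message, error_code=error_code, status_code=status_code)
```

Note how an unrecognized error code on a 403 still maps to `PermissionDenied` via the status-code fallback, which is the backward-compatibility property the PR describes for older SDKs talking to newer platform releases.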
SHA: 93a622d
Regenerate Python SDK after bumping Deco tool (#438)
## Changes
Several changes were made to our codegen tool:
1. Path parameters will always precede body parameters in methods. This ensures that path parameters, i.e. the ID of the object, always come first. This is a pretty bad breaking change for customers, but fortunately only a handful of APIs, primarily update APIs, are affected:
   * AlertsAPI.update
   * DashboardWidgetsAPI.update
   * RecipientsAPI.rotate_token
   * AccountIPAccessListsAPI.update and AccountsIPAccessListsAPI.replace
   * IPAccessListsAPI.update and IPAccessListsAPI.replace
   * ServingEndpointsAPI.update_config and ServingEndpointsAPI.update_config_and_wait
   * PrivateAccessAPI.replace
   * WorkspaceAssignmentsAPI.update
   * GlobalInitScriptsAPI.update
   * MetastoresAPI.assign
   * ConnectionsAPI.update
   * ArtifactAllowlistsAPI.update
   * BudgetsAPI.update
   * LogDeliveryAPI.patch_status
2. New integration tests were added, so there are new autogenerated examples in the SDK.

## Tests
- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
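To illustrate why the reorder is breaking for positional callers, here is a hypothetical before/after pair; the signatures are illustrative, not the SDK's actual method definitions:

```python
# Before the codegen change: body parameters preceded the path parameter.
def update_before(name: str, query_id: str, alert_id: str) -> dict:
    return {"alert_id": alert_id, "name": name, "query_id": query_id}

# After: the path parameter (the object's ID) comes first.
def update_after(alert_id: str, name: str, query_id: str) -> dict:
    return {"alert_id": alert_id, "name": name, "query_id": query_id}

# Keyword call sites survive the reorder unchanged:
assert update_before(name="n", query_id="q", alert_id="a") == \
       update_after(name="n", query_id="q", alert_id="a")
# Positional call sites silently shift arguments, which is why this is a
# breaking change: update_before("n", "q", "a") != update_after("n", "q", "a")
```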
SHA: 69aa629
Commits on Nov 14, 2023
Update OpenAPI spec for Python SDK (#439)
## Changes
This PR updates the OpenAPI spec for the Python SDK to d136ad0541f036372601bad9a4382db06c3c912d. As part of this, we add some protection against fields or methods in our spec that conflict with reserved keywords in Python.

## Tests
- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
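One common way to guard generated identifiers against reserved keywords is to suffix them with a trailing underscore, PEP 8's convention for such clashes. This is a sketch of the general idea, not the codegen tool's actual rule:

```python
import keyword

def escape_reserved(name: str) -> str:
    # Names like `lambda`, `global`, or `import` cannot be used as field
    # names in generated dataclasses, so append a trailing underscore.
    return name + "_" if keyword.iskeyword(name) else name
```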
SHA: 50c71a1
Paginate all SCIM list requests in the SDK (#440)
## Changes
This PR incorporates two hard-coded changes for the SCIM API in the Python SDK:
1. `startIndex` starts at 1 for SCIM APIs, not 0. However, the existing `.Pagination.Increment` extension controls both the start index and whether the pagination is per-page or per-resource. Later, we should replace this extension with two independent OpenAPI options: `one_indexed` (defaulting to `false`) and `pagination_basis` (defaulting to `resource`, but overridable to `page`).
2. If users don't specify a limit, the SDK includes a hard-coded limit of 100 resources per request. We could add this to the OpenAPI spec as an option `default_limit`, which would be useful for any non-paginated APIs that later expose pagination options, allowing the SDK to support them gracefully. However, we don't want to encourage folks to use this pattern: all new list APIs are required to be paginated from the start.

## Tests
- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied

Co-authored-by: Xinjie Zheng <[email protected]>
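The resulting client-side loop can be sketched as follows; `fetch_page` stands in for the SDK's HTTP call and is an assumption for illustration:

```python
from typing import Callable, Iterator

DEFAULT_LIMIT = 100  # hard-coded default when the caller gives no limit

def list_scim(fetch_page: Callable[[int, int], list],
              limit: int = DEFAULT_LIMIT) -> Iterator[dict]:
    # SCIM's startIndex is 1-based, not 0-based, per change (1) above.
    start_index = 1
    while True:
        page = fetch_page(start_index, limit)
        yield from page
        if len(page) < limit:
            return  # a short page means we have reached the end
        start_index += len(page)
```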
SHA: 9ba48cc
* Introduce more specific exceptions, like `NotFound`, `AlreadyExists`, `BadRequest`, `PermissionDenied`, `InternalError`, and others ([#376](#376)). This makes it easier to handle errors thrown by the Databricks API. Instead of catching `DatabricksError` and checking the `error_code` field, you can catch one of these subtypes of `DatabricksError`, which is more ergonomic and removes the need to rethrow exceptions that you don't want to catch. For example:

  ```python
  try:
      return self._ws.permissions.get(object_type, object_id)
  except DatabricksError as e:
      if e.error_code in [
              "RESOURCE_DOES_NOT_EXIST", "RESOURCE_NOT_FOUND",
              "PERMISSION_DENIED", "FEATURE_DISABLED", "BAD_REQUEST"]:
          logger.warning(...)
          return None
      raise RetryableError(...) from e
  ```

  can be replaced with

  ```python
  try:
      return self._ws.permissions.get(object_type, object_id)
  except (PermissionDenied, FeatureDisabled):
      logger.warning(...)
      return None
  except NotFound:
      raise RetryableError(...)
  ```

* Paginate all SCIM list requests in the SDK ([#440](#440)). This change ensures that SCIM `list()` APIs use a default limit of 100 resources, leveraging SCIM's offset + limit pagination to batch requests to the Databricks API.
* Added taskValues support in remoteDbUtils ([#406](#406)).
* Added more detailed error message on default credentials not found error ([#419](#419)).
* Request management token via Azure CLI only for Service Principals and not human users ([#408](#408)).

API Changes:

* Fixed `create()` method for [w.functions](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/functions.html) workspace-level service and corresponding `databricks.sdk.service.catalog.CreateFunction` and `databricks.sdk.service.catalog.FunctionInfo` dataclasses.
* Changed `create()` method for [w.metastores](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/metastores.html) workspace-level service with new required argument order.
* Changed `storage_root` field for `databricks.sdk.service.catalog.CreateMetastore` to be optional.
* Added `skip_validation` field for `databricks.sdk.service.catalog.UpdateExternalLocation`.
* Added `libraries` field for `databricks.sdk.service.compute.CreatePolicy`, `databricks.sdk.service.compute.EditPolicy` and `databricks.sdk.service.compute.Policy`.
* Added `init_scripts` field for `databricks.sdk.service.compute.EventDetails`.
* Added `file` field for `databricks.sdk.service.compute.InitScriptInfo`.
* Added `zone_id` field for `databricks.sdk.service.compute.InstancePoolGcpAttributes`.
* Added several dataclasses related to init scripts.
* Added `databricks.sdk.service.compute.LocalFileInfo` dataclass.
* Replaced `ui_state` field with `edit_mode` for `databricks.sdk.service.jobs.CreateJob` and `databricks.sdk.service.jobs.JobSettings`.
* Replaced `databricks.sdk.service.jobs.CreateJobUiState` dataclass with `databricks.sdk.service.jobs.CreateJobEditMode`.
* Added `include_resolved_values` field for `databricks.sdk.service.jobs.GetRunRequest`.
* Replaced `databricks.sdk.service.jobs.JobSettingsUiState` dataclass with `databricks.sdk.service.jobs.JobSettingsEditMode`.
* Removed [a.o_auth_enrollment](https://databricks-sdk-py.readthedocs.io/en/latest/account/o_auth_enrollment.html) account-level service. This was only used to aid in OAuth enablement during the public preview of OAuth. OAuth is now enabled for all AWS E2 accounts, so usage of this API is no longer needed.
* Added `network_connectivity_config_id` field for `databricks.sdk.service.provisioning.UpdateWorkspaceRequest`.
* Added [a.network_connectivity](https://databricks-sdk-py.readthedocs.io/en/latest/account/network_connectivity.html) account-level service.
* Added `string_shared_as` field for `databricks.sdk.service.sharing.SharedDataObject`.

Internal changes:

* Added regression question to issue template ([#414](#414)).
* Made test_auth no longer fail if you have a default profile setup ([#426](#426)).

OpenAPI SHA: d136ad0541f036372601bad9a4382db06c3c912d, Date: 2023-11-14
SHA: 86ca043
You can try running this command locally to see the comparison on your machine:
git diff v0.12.0...v0.13.0