
Comparing changes

base repository: databricks/databricks-sdk-py
base: v0.12.0
head repository: databricks/databricks-sdk-py
compare: v0.13.0
  • 10 commits
  • 78 files changed
  • 6 contributors

Commits on Oct 26, 2023

  1. Commit 2401940

Commits on Nov 1, 2023

  1. [JOBS-11439] Add taskValues support in remoteDbUtils (#406)

    ## Changes
    Support `dbutils.jobs.taskValues` methods in `RemoteDbUtils` and add
    additional `checkTaskValue` to allow local testing
    
    ## Tests
    Added three test cases in `test_dbutils.py`. Integration tests are not
    needed since the changes only affect remote mode.
    - [x] `make test` run locally
    - [x] `make fmt` applied
    - [x] relevant integration tests applied
    vsamoilov authored Nov 1, 2023
    Commit 9955c08
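    The semantics this PR describes can be sketched with a hypothetical
    stand-in (not the SDK's actual implementation; only the
    `get(taskKey, key, default, debugValue)`/`set(key, value)` shape mirrors
    `dbutils.jobs.taskValues`): `set()` records a value for a task, and
    `get()` reads another task's value, falling back to `debugValue` when
    running locally instead of inside a job.

```python
# Hypothetical stand-in for dbutils.jobs.taskValues behavior, for
# illustration only: `in_job` simulates whether code runs inside a job.
class FakeTaskValues:
    def __init__(self, in_job: bool):
        self.in_job = in_job
        self.store = {}  # (task_key, key) -> value

    def set(self, key, value, task_key="current"):
        self.store[(task_key, key)] = value

    def get(self, taskKey, key, default=None, debugValue=None):
        if not self.in_job:
            # Local run: there is no upstream task, so the caller must
            # provide a debugValue to test against.
            if debugValue is None:
                raise TypeError("debugValue is required when running locally")
            return debugValue
        return self.store.get((taskKey, key), default)
```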

Commits on Nov 3, 2023

  1. Add more detailed error message on default credentials not found error
    (#419)
    
    ## Changes
    - Added a more detailed error message for the default-credentials-not-found
    error; the original message was difficult to follow.
    
    ## Tests
    Manually tried using default auth locally without a `.databrickscfg` file
    and verified that the new message is printed.
    
    - [ ] `make test` run locally
    - [ ] `make fmt` applied
    - [ ] relevant integration tests applied
    edwardfeng-db authored Nov 3, 2023
    Commit 5bd7f6b

Commits on Nov 6, 2023

  1. Make test_auth no longer fail if you have a default profile setup (#426)

    ## Changes
    - Previously, several test cases in `test_auth.py` would fail if you had a
    default profile locally, because the SDK code always looks for the
    `~/.databrickscfg` file, which exists on your machine but is not expected
    by the tests.
    - Made the test cases affected by this behavior use a fake file system
    instead of the local disk to bypass this issue.
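    The isolation idea can be sketched without a fake file system by pointing
    the SDK at a config path that is guaranteed not to exist, via
    `DATABRICKS_CONFIG_FILE` (an environment variable the SDK honors); the PR
    itself uses a fake file system, so this is only an illustration of the
    same effect.

```python
import os
import tempfile

# Point the SDK's config lookup at a path that is never created, so a
# developer's real ~/.databrickscfg cannot leak into the test.
with tempfile.TemporaryDirectory() as d:
    missing_cfg = os.path.join(d, "databrickscfg")  # intentionally absent
    os.environ["DATABRICKS_CONFIG_FILE"] = missing_cfg
    assert not os.path.exists(missing_cfg)
```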
    
    ## Tests
    
    - [ ] `make test` run locally
    - [ ] `make fmt` applied
    - [ ] relevant integration tests applied
    edwardfeng-db authored Nov 6, 2023
    Commit ee54c3c

Commits on Nov 8, 2023

  1. Request management token via Azure CLI only for Service Principals and
    not human users (#408)
    
    ## Changes
    This PR removes the call to `az account get-access-token` to retrieve
    the management token when the machine is authenticated as a human user
    rather than a service principal.
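    The distinction the PR relies on can be sketched from the JSON that
    `az account show` prints: the signed-in identity appears under a `user`
    object whose `type` is `"user"` for humans and `"servicePrincipal"` for
    service principals (the function name here is illustrative, not SDK code).

```python
def is_human_user(az_account: dict) -> bool:
    # `az account show -o json` reports the signed-in identity under "user";
    # its "type" distinguishes humans from service principals.
    return az_account.get("user", {}).get("type") == "user"
```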
    
    
    ## Tests
    
    - [x] `make test` run locally
    - [x] `make fmt` applied
    - [ ] relevant integration tests applied
    nfx authored Nov 8, 2023
    Commit a862adc

Commits on Nov 13, 2023

  1. Introduce more specific exceptions, like `NotFound`, `AlreadyExists`,
    `BadRequest`, `PermissionDenied`, `InternalError`, and others (#376)
    
    Improve the ergonomics of the SDK: instead of `except DatabricksError
    as err: if err.error_code != 'NOT_FOUND': raise err else: do_stuff()`, we
    can now write `except NotFound: do_stuff()`, as in [this
    example](https://github.com/databrickslabs/ucx/blob/main/src/databricks/labs/ucx/workspace_access/generic.py#L71-L84).
    
    Additionally, it makes stack traces easier to read, as they will contain
    a specific exception class name. Examples of unclear stack traces:
    databrickslabs/ucx#359,
    databrickslabs/ucx#353,
    databrickslabs/ucx#347.
    
    # First principles
    - ~~do not override `builtins.NotImplemented` for `NOT_IMPLEMENTED`
    error code~~
    - assume that the platform's error_code/HTTP status code mapping is not
    perfect and is in a state of transition
    - we represent a reasonable subset of error codes as specific
    exceptions
    - it's still possible to access `error_code` from specific exceptions
    like `NotFound` or `AlreadyExists`.
    
    # Proposal
    - have hierarchical exceptions, also inheriting from Python's built-in
    exceptions
    - more specific error codes override more generic HTTP status codes
    - more generic HTTP status codes are matched after more specific error
    codes: there's a default exception class per HTTP status code, and
    we rely on the Databricks platform's exception mapper to do the right
    thing.
    - have backward-compatible error creation for cases like using older
    versions of the SDK with newer releases of the platform.
    
    
    ![image](https://github.com/databricks/databricks-sdk-py/assets/259697/a4519f76-0778-468c-9bf5-6133984b5af7)
    
    ### Naming conflict resolution
    
    We have four sources of naming and this is a proposed order of naming
    conflict resolution:
    1. Databricks `error_code`, that is surfaced in our API documentation,
    known by Databricks users
    2. HTTP Status codes, known by some developers
    3. Python builtin exceptions, known by some developers
    4. grpc error codes
    https://github.com/googleapis/googleapis/blob/master/google/rpc/code.proto#L38-L185,
    known by some developers
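    The two-level lookup described in the proposal can be sketched as follows
    (class and dict names are illustrative, not the SDK's internals): a
    specific `error_code` wins over the generic HTTP status, and
    `DatabricksError` is the fallback when neither is known.

```python
# Minimal exception hierarchy and mapper, for illustration only.
class DatabricksError(Exception): ...
class NotFound(DatabricksError): ...
class BadRequest(DatabricksError): ...
class PermissionDenied(DatabricksError): ...

_ERROR_CODES = {
    "RESOURCE_DOES_NOT_EXIST": NotFound,
    "PERMISSION_DENIED": PermissionDenied,
}
_STATUS_CODES = {400: BadRequest, 403: PermissionDenied, 404: NotFound}

def error_for(status: int, error_code: str, message: str) -> DatabricksError:
    # Specific error_code first, then the status-code default, then the base.
    cls = _ERROR_CODES.get(error_code) or _STATUS_CODES.get(status, DatabricksError)
    return cls(message)
```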
    
    ---------
    
    Co-authored-by: Miles Yucht <[email protected]>
    nfx and mgyucht authored Nov 13, 2023
    Commit 93a622d
  2. Regenerate Python SDK after bumping Deco tool (#438)

    ## Changes
    Several changes were made to our codegen tool:
    1. Path parameters will always precede body parameters in methods. This
    ensures that path parameters always come first, i.e. the ID of the
    object. This is a pretty bad breaking change for customers, but
    fortunately only a handful of APIs, primarily update APIs, are affected,
    namely:
      * AlertsAPI.update
      * DashboardWidgetsAPI.update
      * RecipientsAPI.rotate_token
      * AccountIPAccessListsAPI.update and AccountsIPAccessListsAPI.replace
      * IPAccessListsAPI.update and IPAccessListsAPI.replace
      * ServingEndpointsAPI.update_config and
        ServingEndpointsAPI.update_config_and_wait
      * PrivateAccessAPI.replace
      * WorkspaceAssignmentsAPI.update
      * GlobalInitScriptsAPI.update
      * MetastoresAPI.assign
      * ConnectionsAPI.update
      * ArtifactAllowlistsAPI.update
      * BudgetsAPI.update
      * LogDeliveryAPI.patch_status
    2. New integration tests were added, so there are new autogenerated
    examples in the SDK.
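    Why the reorder is breaking can be shown with hypothetical stand-ins (not
    real SDK signatures): positional call sites written against the old order
    bind arguments to the wrong parameters, while keyword call sites keep
    working.

```python
def update_v0(label, ip_access_list_id):   # old order: body param first
    return (ip_access_list_id, label)

def update_v1(ip_access_list_id, label):   # new order: path param (the ID) first
    return (ip_access_list_id, label)

# A positional caller written against v0 silently swaps meaning under v1;
# keyword arguments are unaffected by the reorder.
assert update_v0("allow-office", "list-42") == ("list-42", "allow-office")
assert update_v1("allow-office", "list-42") == ("allow-office", "list-42")
assert update_v1(ip_access_list_id="list-42", label="allow-office") == ("list-42", "allow-office")
```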
    
    ## Tests
    
    - [ ] `make test` run locally
    - [ ] `make fmt` applied
    - [ ] relevant integration tests applied
    mgyucht authored Nov 13, 2023
    Commit 69aa629

Commits on Nov 14, 2023

  1. Update OpenAPI spec for Python SDK (#439)

    ## Changes
    This PR updates the OpenAPI spec for the Python SDK to
    d136ad0541f036372601bad9a4382db06c3c912d. As part of this, we add some
    protection against fields or methods in our spec that conflict with
    reserved keywords in Python.
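    One plausible protection scheme (an assumption, not necessarily the
    generator's exact rule) is to suffix an underscore when a spec field
    collides with a Python keyword:

```python
import keyword

def safe_field_name(name: str) -> str:
    # e.g. a spec field named "lambda" cannot be a Python identifier,
    # so it would be emitted as "lambda_".
    return name + "_" if keyword.iskeyword(name) else name
```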
    
    ## Tests
    
    - [ ] `make test` run locally
    - [ ] `make fmt` applied
    - [ ] relevant integration tests applied
    mgyucht authored Nov 14, 2023
    Commit 50c71a1
  2. Paginate all SCIM list requests in the SDK (#440)

    ## Changes
    This PR incorporates two hard-coded changes for the SCIM API in the
    Python SDK:
    1. startIndex starts at 1 for SCIM APIs, not 0. However, the existing
    .Pagination.Increment controls both the start index as well as whether
    the pagination is per-page or per-resource. Later, we should replace
    this extension with two independent OpenAPI options: `one_indexed`
    (defaulting to `false`) and `pagination_basis` (defaulting to `resource`
    but can be overridden to `page`).
    2. If users don't specify a limit, the SDK will include a hard-coded
    limit of 100 resources per request. We could add this to the OpenAPI
    spec as an option `default_limit`, which is useful for any non-paginated
    APIs that later expose pagination options and allow the SDK to
    gracefully support those. However, we don't want to encourage folks to
    use this pattern: all new list APIs are required to be paginated from
    the start.
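    The 1-indexed offset+limit loop described above can be sketched like this;
    `fetch` is a hypothetical callable standing in for one SCIM list request.

```python
def paginate_scim(fetch, limit=100):
    start = 1  # SCIM's startIndex is 1-based, not 0-based
    while True:
        page = fetch(start_index=start, count=limit)
        if not page:
            return
        yield from page
        if len(page) < limit:
            return  # short page: nothing left to fetch
        start += len(page)
```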
    
    ## Tests
    
    - [ ] `make test` run locally
    - [ ] `make fmt` applied
    - [ ] relevant integration tests applied
    
    ---------
    
    Co-authored-by: Xinjie Zheng <[email protected]>
    mgyucht and xinjiezhen-db authored Nov 14, 2023
    Commit 9ba48cc
  3. Release v0.13.0 (#442)

    * Introduce more specific exceptions, like `NotFound`, `AlreadyExists`,
    `BadRequest`, `PermissionDenied`, `InternalError`, and others
    ([#376](#376)). This
    makes it easier to handle errors thrown by the Databricks API. Instead
    of catching `DatabricksError` and checking the error_code field, you can
    catch one of these subtypes of `DatabricksError`, which is more
    ergonomic and removes the need to rethrow exceptions that you don't want
    to catch. For example:
    ```python
    try:
      return (self._ws
        .permissions
        .get(object_type, object_id))
    except DatabricksError as e:
      if e.error_code in [
        "RESOURCE_DOES_NOT_EXIST",
        "RESOURCE_NOT_FOUND",
        "PERMISSION_DENIED",
        "FEATURE_DISABLED",
        "BAD_REQUEST"]:
        logger.warning(...)
        return None
      raise RetryableError(...) from e
    ```
    can be replaced with
    ```python
    try:
      return (self._ws
        .permissions
        .get(object_type, object_id))
    except (PermissionDenied, FeatureDisabled):
      logger.warning(...)
      return None
    except NotFound:
      raise RetryableError(...)
    ```
    * Paginate all SCIM list requests in the SDK
    ([#440](#440)). This
    change ensures that SCIM list() APIs use a default limit of 100
    resources, leveraging SCIM's offset + limit pagination to batch requests
    to the Databricks API.
    * Added taskValues support in remoteDbUtils
    ([#406](#406)).
    * Added more detailed error message on default credentials not found
    error
    ([#419](#419)).
    * Request management token via Azure CLI only for Service Principals and
    not human users
    ([#408](#408)).
    
    API Changes:
    
    * Fixed `create()` method for
    [w.functions](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/functions.html)
    workspace-level service and corresponding
    `databricks.sdk.service.catalog.CreateFunction` and
    `databricks.sdk.service.catalog.FunctionInfo` dataclasses.
    * Changed `create()` method for
    [w.metastores](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/metastores.html)
    workspace-level service with new required argument order.
    * Changed `storage_root` field for
    `databricks.sdk.service.catalog.CreateMetastore` to be optional.
    * Added `skip_validation` field for
    `databricks.sdk.service.catalog.UpdateExternalLocation`.
    * Added `libraries` field for
    `databricks.sdk.service.compute.CreatePolicy`,
    `databricks.sdk.service.compute.EditPolicy` and
    `databricks.sdk.service.compute.Policy`.
    * Added `init_scripts` field for
    `databricks.sdk.service.compute.EventDetails`.
    * Added `file` field for
    `databricks.sdk.service.compute.InitScriptInfo`.
    * Added `zone_id` field for
    `databricks.sdk.service.compute.InstancePoolGcpAttributes`.
    * Added several dataclasses related to init scripts.
    * Added `databricks.sdk.service.compute.LocalFileInfo` dataclass.
    * Replaced `ui_state` field with `edit_mode` for
    `databricks.sdk.service.jobs.CreateJob` and
    `databricks.sdk.service.jobs.JobSettings`.
    * Replaced `databricks.sdk.service.jobs.CreateJobUiState` dataclass with
    `databricks.sdk.service.jobs.CreateJobEditMode`.
    * Added `include_resolved_values` field for
    `databricks.sdk.service.jobs.GetRunRequest`.
    * Replaced `databricks.sdk.service.jobs.JobSettingsUiState` dataclass
    with `databricks.sdk.service.jobs.JobSettingsEditMode`.
    * Removed
    [a.o_auth_enrollment](https://databricks-sdk-py.readthedocs.io/en/latest/account/o_auth_enrollment.html)
    account-level service. This was only used to aid in OAuth enablement
    during the public preview of OAuth. OAuth is now enabled for all AWS E2
    accounts, so usage of this API is no longer needed.
    * Added `network_connectivity_config_id` field for
    `databricks.sdk.service.provisioning.UpdateWorkspaceRequest`.
    * Added
    [a.network_connectivity](https://databricks-sdk-py.readthedocs.io/en/latest/account/network_connectivity.html)
    account-level service.
    * Added `string_shared_as` field for
    `databricks.sdk.service.sharing.SharedDataObject`.
    
    Internal changes:
    
    * Added regression question to issue template
    ([#414](#414)).
    * Made test_auth no longer fail if you have a default profile setup
    ([#426](#426)).
    
    OpenAPI SHA: d136ad0541f036372601bad9a4382db06c3c912d, Date: 2023-11-14
    mgyucht authored Nov 14, 2023
    Commit 86ca043