
Comparing changes

base repository: databricks/databricks-sdk-py
base: v0.20.0
head repository: databricks/databricks-sdk-py
compare: v0.21.0
  • 15 commits
  • 85 files changed
  • 8 contributors

Commits on Feb 21, 2024

  1. Fix get_workspace_client in GCP (#532)

    ## Changes
    The current implementation of get_workspace_client copies the entire
    config, critically reusing the cached header factory so as to use the
    same auth mechanism when getting a workspace client. However, at least
    in GCP, account-level OAuth tokens can't be used to authenticate to a
    workspace (probably because the audience for the account-level and
    workspace-level tokens is different).
    
    This PR fixes this by only copying the exported fields and not copying
    the header factory. Subsequent use of the config in WorkspaceClient will
    trigger config resolution. For GCP, this means creating a new token
    source using the correct host as the audience.
    
    This ports databricks/databricks-sdk-go#803 to
    the Python SDK.
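
    The fix can be sketched as follows. This is a minimal illustration with a hypothetical, pared-down `Config`; the field names here are assumptions, not the SDK's actual attributes:

```python
from dataclasses import dataclass, field, fields
from typing import Callable, Dict, Optional

@dataclass
class Config:
    # hypothetical, pared-down stand-in for the SDK's Config
    host: Optional[str] = None          # exported field
    account_id: Optional[str] = None    # exported field
    # cached auth state; reusing it pins tokens to the account-level audience
    _header_factory: Optional[Callable[[], Dict[str, str]]] = field(default=None, repr=False)

def copy_exported(cfg: Config) -> Config:
    # copy only the public (exported) fields and drop cached auth state,
    # so the workspace client re-resolves credentials for the new host
    kwargs = {f.name: getattr(cfg, f.name)
              for f in fields(cfg)
              if not f.name.startswith("_")}
    return Config(**kwargs)
```

    Because the copy has no cached header factory, the first request through the workspace client triggers full config resolution against the workspace host.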
    
    ## Tests
    Manually ran this integration test in all non-UC (Azure, AWS, GCP) and
    UC (AWS, GCP) account-level environments.
    
    - [ ] `make test` run locally
    - [ ] `make fmt` applied
    - [ ] relevant integration tests applied
    mgyucht authored Feb 21, 2024
    Commit: f8e64b2
  2. Fix integer deserialization for headers (#553)

    ## Changes
    Fix integer deserialization for headers. HTTP serializes header values
    as strings, so integer-typed fields must be cast back on deserialization.
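
    A minimal sketch of the idea; the function and type dispatch here are hypothetical, not the SDK's actual deserializer:

```python
from typing import Any

def deserialize_header(raw: str, target_type: type) -> Any:
    # HTTP carries all header values as strings, so integer-typed fields
    # (e.g. a content length) must be cast back on the way in
    if target_type is int:
        return int(raw)
    if target_type is bool:
        return raw.lower() == "true"
    return raw
```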
    
    ## Tests
    Used the new tests for files API
    https://github.com/databricks/databricks-sdk-py/pull/552/files
    hectorcast-db authored Feb 21, 2024
    Commit: 0d3e9da

Commits on Feb 22, 2024

  1. Add Files API docs to the SDK Documentation (#556)

    ## Changes
    Files API docs were initially hidden from SDK documentation. This PR
    unhides them.
    
    See
    https://databricks-sdk-py--556.org.readthedocs.build/en/556/workspace/files/index.html
    
    ## Tests
    
    - [ ] `make test` run locally
    - [ ] `make fmt` applied
    - [ ] relevant integration tests applied
    mgyucht authored Feb 22, 2024
    Commit: b67f1c4
  2. Add back enums to docs (#557)

    ## Changes
    Enums went missing from the documentation. This PR adds them back in.
    
    See
    https://databricks-sdk-py--557.org.readthedocs.build/en/557/dbdataclasses/catalog.html#databricks.sdk.service.catalog.ValidationResultOperation
    for the missing enum.
    
    ## Tests
    
    - [ ] `make test` run locally
    - [ ] `make fmt` applied
    - [ ] relevant integration tests applied
    mgyucht authored Feb 22, 2024
    Commit: 326088f
  3. Sort index pages by name in docs (#560)

    ## Changes
    The indices under the APIs and Dataclasses pages were not sorted, so
    entries appeared in an unexpected, effectively random order. This change
    sorts them alphabetically.
    
    ## Tests
    
    - [ ] `make test` run locally
    - [ ] `make fmt` applied
    - [ ] relevant integration tests applied
    mgyucht authored Feb 22, 2024
    Commit: 8537cd3
  4. Add integration tests for Files API (#552)

    ## Changes
    Adding integration tests for the Files API.
    
    I'm following the same structure as for the go-sdk integration tests:
    https://github.com/databricks/databricks-sdk-go/blob/main/internal/files_test.go
    
    ## Tests
    These are the tests.
    
    <img width="567" alt="image"
    src="https://github.com/databricks/databricks-sdk-py/assets/36550428/ed391bba-f53d-4327-a018-d87afdd69c63">
    
    
    - [x] `make test` run locally
    - [x] `make fmt` applied
    - [x] relevant integration tests applied
    bonnetn authored Feb 22, 2024
    Commit: 3a2798f

Commits on Feb 23, 2024

  1. Distinguish between empty types and fields that can take any value (#561)

    ## Changes
    Ports databricks/databricks-sdk-go#821 to the
    Python SDK.
    
    This also bumps the API spec to the same version as in the above PR.
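
    Conceptually, the distinction looks like this (hypothetical marker types and deserializer, for illustration only):

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class Empty:
    """An empty type: the wire payload carries no information."""

class AnyValue:
    """Marker for fields that may hold any JSON value."""

def from_wire(field_type: type, payload: Any) -> Any:
    # an empty type discards the payload entirely, while an any-value
    # field passes the raw JSON through untouched; everything else is
    # deserialized into its declared dataclass
    if field_type is AnyValue:
        return payload
    if field_type is Empty:
        return Empty()
    return field_type(**payload)
```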
    
    ## Tests
    
    - [ ] `make test` run locally
    - [ ] `make fmt` applied
    - [ ] relevant integration tests applied
    mgyucht authored Feb 23, 2024
    Commit: 0cb9e42
  2. Support subservices (#559)

    ## Changes
    Support subservices.
    Test PR with a custom OpenAPI spec: #558
    
    Docs for linked PR
    
    https://databricks-sdk-py--558.org.readthedocs.build/en/558/workspace/settings/index.html#
    
    ## Tests
    
    - [X] `make test` run locally
    - [X] `make fmt` applied
    - [X] relevant integration tests applied
    hectorcast-db authored Feb 23, 2024
    Commit: 2a91e8d
  3. Use all-apis scope with external-browser (#563)

    ## Changes
    For all OAuth flows, we effectively only support the `all-apis` scope
    now. This fixes the `external-browser` credentials provider to use that
    scope, since the other scopes it offered no longer work.
    
    ## Tests
    Ran the local_browser_oauth.py example, which worked.
    
    - [ ] `make test` run locally
    - [ ] `make fmt` applied
    - [ ] relevant integration tests applied
    mgyucht authored Feb 23, 2024
    Commit: 5255760

Commits on Feb 28, 2024

  1. Make a best effort attempt to initialise all Databricks globals (#562)

    ## Changes
    We only initialise dbutils locally when using `from
    databricks.sdk.runtime import *`. Users in the webui are guided to use
    this import in all library code.
    <img width="812" alt="image"
    src="https://github.com/databricks/databricks-sdk-py/assets/88345179/3f28bc3c-0ba9-41ac-b990-3c4f5bf138aa">
    
    The local (for people outside DBR) solution so far was to initialise
    spark manually. But this can be tedious for deeply nested libraries
    (which is the reason this import was introduced in the first place).
    
    Now, we make a best-effort attempt to initialise as many globals as
    possible locally, so that users can build and debug libraries using
    Databricks Connect.
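
    The "best effort" pattern amounts to something like this (a generic sketch; the actual globals and initialisers in the SDK differ):

```python
from typing import Any, Callable, Dict

def best_effort_globals(initializers: Dict[str, Callable[[], Any]]) -> Dict[str, Any]:
    # run every initialiser, keeping only the ones that succeed, so an
    # unavailable global (no Spark session, not on DBR) never breaks the import
    resolved = {}
    for name, init in initializers.items():
        try:
            resolved[name] = init()
        except Exception:
            pass  # leave this global undefined rather than failing
    return resolved
```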
    
    ## Tests
    * integration test
    
    - [ ] `make test` run locally
    - [x] `make fmt` applied
    - [ ] relevant integration tests applied
    
    ---------
    
    Signed-off-by: Kartik Gupta <[email protected]>
    Co-authored-by: Miles Yucht <[email protected]>
    kartikgupta-db and mgyucht authored Feb 28, 2024
    Commit: f0fe023
  2. Commit: 9813180

Commits on Mar 4, 2024

  1. New example to list compute resource for SUBMIT_RUN job runs (#572)

    ## Changes
    From a discussion with @mgyucht on a request for getting a list of jobs
    submitted via jobs/submit and the associated compute resource(s). There
    are multiple compute-related fields returned in the response (both on
    the run, and on the task) so this example aims to disambiguate them.
    
    Ran against AWS, Azure, GCP workspaces.
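
    The gist of the disambiguation can be sketched like this. The run/task shapes are loosely modeled on the Jobs API runs response; the exact keys are illustrative assumptions, not the example's actual code:

```python
from typing import Any, Dict

def summarize_compute(run: Dict[str, Any]) -> Dict[str, Any]:
    # compute can be reported on the run itself (cluster_instance) and
    # again per task (an existing cluster vs. an ephemeral new_cluster spec)
    summary = {
        "run_id": run.get("run_id"),
        "run_cluster": (run.get("cluster_instance") or {}).get("cluster_id"),
        "tasks": [],
    }
    for task in run.get("tasks", []):
        if "existing_cluster_id" in task:
            compute = ("existing", task["existing_cluster_id"])
        elif "new_cluster" in task:
            compute = ("ephemeral", task["new_cluster"].get("spark_version"))
        else:
            compute = ("unknown", None)
        summary["tasks"].append({"task_key": task.get("task_key"), "compute": compute})
    return summary
```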
    
    ## Tests
    
    - [ ] `make test` run locally
    - [ ] `make fmt` applied
    - [ ] relevant integration tests applied
    kimberlyma authored Mar 4, 2024
    Commit: a282eee

Commits on Mar 7, 2024

  1. Update SDK to latest OpenAPI spec (#576)

    ## Changes
    - Update SDK to the latest OpenAPI spec.
    - Also fix the naming convention: we use Pascal case (`.PascalName`)
    instead of camel case (`.Name`), so we now generate `CspEnablement`
    instead of `CSPEnablement`.
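
    For illustration, a sketch of the casing rule (this helper is hypothetical, not the generator's actual template function):

```python
import re

def pascal_case(name: str) -> str:
    # split into words: an all-caps run that precedes a normal word
    # (e.g. "CSP" in "CSPEnablement") becomes its own word, then every
    # word is capitalized, turning CSPEnablement into CspEnablement
    words = re.findall(r"[A-Z]+(?=[A-Z][a-z])|[A-Z][a-z]*|[a-z]+|\d+", name)
    return "".join(w.capitalize() for w in words)
```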
    
    ## Tests
    Unit tests; integration tests will run in the release PR.
    - [ ] `make test` run locally
    - [ ] `make fmt` applied
    - [ ] relevant integration tests applied
    tanmay-db authored Mar 7, 2024
    Commit: e861140
  2. Fix type issue with widgets.getArgument (#581)

    ## Changes
    - The `str | None` annotation used previously doesn't work on older
    Python versions; fix it to unblock the integration tests and the release.
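
    The fix boils down to using `typing.Optional` instead of the PEP 604 union, which only works at runtime on Python 3.10+. The widget lookup below is a hypothetical stand-in for the real implementation:

```python
from typing import Optional

def get_argument(name: str, default_value: Optional[str] = None) -> Optional[str]:
    # Optional[str] is equivalent to `str | None` but evaluates fine
    # on the older Python versions the SDK still supports
    widgets = {"env": "dev"}  # hypothetical widget store
    return widgets.get(name, default_value)
```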
    
    ## Tests
    
    - [x] `make test` run locally
    - [x] `make fmt` applied
    - [x] relevant integration tests applied
    edwardfeng-db authored Mar 7, 2024
    Commit: ddf6493
  3. Release v0.21.0 (#578)

    ### New Features and Improvements
    * Fixed get_workspace_client in GCP
    ([#532](#532)).
    * Use all-apis scope with external-browser
    ([#563](#563)).
    * Make a best effort attempt to initialise all Databricks globals
    ([#562](#562)).
    * Fixed type issue with widgets.getArgument
    ([#581](#581))
    * Note: backwards-incompatible change: settings are now nested; see
    the API changes below.
    
    ### Documentation
    * Added Files API docs to the SDK Documentation
    ([#556](#556)).
    * Added new example to list compute resource for SUBMIT_RUN job runs
    ([#572](#572)).
    * Sorted index pages by name in docs
    ([#560](#560)).
    * Added back enums to docs
    ([#557](#557)).
    
    ### API Changes
    #### Added
    Services:
    -
    [w.permission_migration](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/permission_migration.html)
    workspace-level service.
    -
    [w.settings.automatic_cluster_update](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/settings/automatic_cluster_update.html)
    workspace-level service.
    -
    [w.settings.csp_enablement](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/settings/csp_enablement.html)
    workspace-level service.
    -
    [a.settings.csp_enablement_account](https://databricks-sdk-py.readthedocs.io/en/latest/account/settings/csp_enablement_account.html)
    account-level service.
    -
    [w.settings.default_namespace](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/settings/default_namespace.html)
    workspace-level service.
    -
    [w.settings.esm_enablement](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/settings/esm_enablement.html)
    workspace-level service.
    -
    [a.settings.esm_enablement_account](https://databricks-sdk-py.readthedocs.io/en/latest/account/settings/esm_enablement_account.html)
    account-level service.
    -
    [a.settings.personal_compute](https://databricks-sdk-py.readthedocs.io/en/latest/account/settings/personal_compute.html)
    account-level service.
    -
    [w.settings.restrict_workspace_admins](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/settings/restrict_workspace_admins.html)
    workspace-level service.
    
    Dataclasses:
    - `databricks.sdk.service.settings.AutomaticClusterUpdateSetting`
    - `databricks.sdk.service.settings.ClusterAutoRestartMessage`
    -
    `databricks.sdk.service.settings.ClusterAutoRestartMessageEnablementDetails`
    -
    `databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindow`
    -
    `databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindowDayOfWeek`
    -
    `databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindowWeekDayBasedSchedule`
    -
    `databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindowWeekDayFrequency`
    -
    `databricks.sdk.service.settings.ClusterAutoRestartMessageMaintenanceWindowWindowStartTime`
    - `databricks.sdk.service.settings.ComplianceStandard`
    - `databricks.sdk.service.settings.CspEnablement`
    - `databricks.sdk.service.settings.CspEnablementAccount`
    - `databricks.sdk.service.settings.CspEnablementAccountSetting`
    - `databricks.sdk.service.settings.CspEnablementSetting`
    - `databricks.sdk.service.settings.DeleteDefaultNamespaceRequest`
    - `databricks.sdk.service.settings.DeletePersonalComputeRequest`
    - `databricks.sdk.service.settings.DeleteRestrictWorkspaceAdminRequest`
    - `databricks.sdk.service.settings.EsmEnablement`
    - `databricks.sdk.service.settings.EsmEnablementAccount`
    - `databricks.sdk.service.settings.EsmEnablementAccountSetting`
    - `databricks.sdk.service.settings.EsmEnablementSetting`
    - `databricks.sdk.service.settings.GetAutomaticClusterUpdateRequest`
    - `databricks.sdk.service.settings.GetCspEnablementAccountRequest`
    - `databricks.sdk.service.settings.GetCspEnablementRequest`
    - `databricks.sdk.service.settings.GetDefaultNamespaceRequest`
    - `databricks.sdk.service.settings.GetEsmEnablementAccountRequest`
    - `databricks.sdk.service.settings.GetEsmEnablementRequest`
    - `databricks.sdk.service.settings.GetPersonalComputeRequest`
    - `databricks.sdk.service.settings.GetRestrictWorkspaceAdminRequest`
    - `databricks.sdk.service.settings.NccAwsStableIpRule`
    -
    `databricks.sdk.service.settings.UpdateAutomaticClusterUpdateSettingRequest`
    -
    `databricks.sdk.service.settings.UpdateCspEnablementAccountSettingRequest`
    - `databricks.sdk.service.settings.UpdateCspEnablementSettingRequest`
    -
    `databricks.sdk.service.settings.UpdateEsmEnablementAccountSettingRequest`
    - `databricks.sdk.service.settings.UpdateEsmEnablementSettingRequest`
    -
    `databricks.sdk.service.vectorsearch.ClusterAutoRestartMessageMaintenanceWindow`
    -
    `databricks.sdk.service.vectorsearch.ClusterAutoRestartMessageMaintenanceWindowDayOfWeek`
    -
    `databricks.sdk.service.vectorsearch.ClusterAutoRestartMessageMaintenanceWindowWeekDayBasedSchedule`
    -
    `databricks.sdk.service.vectorsearch.ClusterAutoRestartMessageMaintenanceWindowWeekDayFrequency`
    -
    `databricks.sdk.service.vectorsearch.ClusterAutoRestartMessageMaintenanceWindowWindowStartTime`
    - `databricks.sdk.service.vectorsearch.ComplianceStandard`
    - `databricks.sdk.service.vectorsearch.CspEnablement`
    - `databricks.sdk.service.vectorsearch.CspEnablementAccount`
    - `databricks.sdk.service.vectorsearch.CspEnablementAccountSetting`
    - `databricks.sdk.service.vectorsearch.CspEnablementSetting`
    - `databricks.sdk.service.vectorsearch.DeleteDefaultNamespaceRequest`
    - `databricks.sdk.service.vectorsearch.DeletePersonalComputeRequest`
    -
    `databricks.sdk.service.vectorsearch.DeleteRestrictWorkspaceAdminRequest`
    - `databricks.sdk.service.vectorsearch.EsmEnablement`
    - `databricks.sdk.service.vectorsearch.EsmEnablementAccount`
    - `databricks.sdk.service.vectorsearch.EsmEnablementAccountSetting`
    - `databricks.sdk.service.vectorsearch.EsmEnablementSetting`
    - `databricks.sdk.service.vectorsearch.GetAutomaticClusterUpdateRequest`
    - `databricks.sdk.service.vectorsearch.GetCspEnablementAccountRequest`
    - `databricks.sdk.service.vectorsearch.GetCspEnablementRequest`
    - `databricks.sdk.service.vectorsearch.GetDefaultNamespaceRequest`
    - `databricks.sdk.service.vectorsearch.GetEsmEnablementAccountRequest`
    - `databricks.sdk.service.vectorsearch.GetEsmEnablementRequest`
    - `databricks.sdk.service.vectorsearch.GetPersonalComputeRequest`
    - `databricks.sdk.service.vectorsearch.GetRestrictWorkspaceAdminRequest`
    - `databricks.sdk.service.vectorsearch.NccAwsStableIpRule`
    -
    `databricks.sdk.service.vectorsearch.UpdateAutomaticClusterUpdateSettingRequest`
    -
    `databricks.sdk.service.vectorsearch.UpdateCspEnablementAccountSettingRequest`
    -
    `databricks.sdk.service.vectorsearch.UpdateCspEnablementSettingRequest`
    -
    `databricks.sdk.service.vectorsearch.UpdateEsmEnablementAccountSettingRequest`
    -
    `databricks.sdk.service.vectorsearch.UpdateEsmEnablementSettingRequest`
    - `databricks.sdk.service.iam.PermissionMigrationRequest`
    - `databricks.sdk.service.iam.PermissionMigrationResponse` 
    
    #### Changed
    - `version` field for `databricks.sdk.service.serving.AppManifest` to
    `databricks.sdk.service.serving.AnyValue` dataclass.
    - `delete_endpoint()` method for
    [w.vector_search_endpoints](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/vector_search_endpoints.html)
    workspace-level service with new required argument order.
    - `create_index()` method for
    [w.vector_search_indexes](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/vector_search_indexes.html)
    workspace-level service with new required argument order.
    - `delete_data_vector_index()` method for
    [w.vector_search_indexes](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/vector_search_indexes.html)
    workspace-level service with new required argument order.
    - `upsert_data_vector_index()` method for
    [w.vector_search_indexes](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/vector_search_indexes.html)
    workspace-level service with new required argument order.
    - `endpoint_name` field for
    `databricks.sdk.service.vectorsearch.CreateVectorIndexRequest` to be
    required.
    
    #### Removed
    - `delete_personal_compute_setting()` method for
    [a.account_settings](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_settings.html)
    account-level service.
    - `get_personal_compute_setting()` method for
    [a.account_settings](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_settings.html)
    account-level service.
    - `update_personal_compute_setting()` method for
    [a.account_settings](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_settings.html)
    account-level service.
    - `delete_default_namespace_setting()` method for
    [w.settings](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/settings.html)
    workspace-level service.
    - `delete_restrict_workspace_admins_setting()` method for
    [w.settings](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/settings.html)
    workspace-level service.
    - `get_default_namespace_setting()` method for
    [w.settings](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/settings.html)
    workspace-level service.
    - `get_restrict_workspace_admins_setting()` method for
    [w.settings](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/settings.html)
    workspace-level service.
    - `update_default_namespace_setting()` method for
    [w.settings](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/settings.html)
    workspace-level service.
    - `update_restrict_workspace_admins_setting()` method for
    [w.settings](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/settings.html)
    workspace-level service.
    - `databricks.sdk.service.settings.DeleteDefaultNamespaceSettingRequest`
    dataclass.
    - `databricks.sdk.service.settings.DeletePersonalComputeSettingRequest`
    dataclass.
    -
    `databricks.sdk.service.settings.DeleteRestrictWorkspaceAdminsSettingRequest`
    dataclass.
    - `databricks.sdk.service.settings.GetDefaultNamespaceSettingRequest`
    dataclass.
    - `databricks.sdk.service.settings.GetPersonalComputeSettingRequest`
    dataclass.
    -
    `databricks.sdk.service.settings.GetRestrictWorkspaceAdminsSettingRequest`
    dataclass.
    - `databricks.sdk.service.vectorsearch.EmbeddingConfig` dataclass.
    - `embedding_config` field for
    `databricks.sdk.service.vectorsearch.EmbeddingSourceColumn`.
    - `name` field for
    `databricks.sdk.service.vectorsearch.DeleteDataVectorIndexRequest`.
    - `name` field for
    `databricks.sdk.service.vectorsearch.DeleteEndpointRequest`.
    - `planning_phases` field for `databricks.sdk.service.sql.QueryMetrics`.
    - `delta_sync_vector_index_spec` field for
    `databricks.sdk.service.vectorsearch.VectorIndex`.
    - `direct_access_vector_index_spec` field for
    `databricks.sdk.service.vectorsearch.VectorIndex`.
    
    ### Internal Changes
    * Added tokei.rs badge
    ([#567](#567)).
    * Update SDK to latest OpenAPI spec
    ([#576](#576)).
    * Add integration tests for Files API
    ([#552](#552)).
    * Fix integer deserialization for headers
    ([#553](#553)).
    * Support subservices
    ([#559](#559)).
    * Distinguish between empty types and fields that can take any value
    ([#561](#561)).
    
    OpenAPI SHA: 1026b998b14fba1b8317528f47778240dc4e9a5d, Date: 2024-03-06
    tanmay-db authored Mar 7, 2024
    Commit: 898b57d