Comparing changes
base repository: databricks/databricks-sdk-py
base: v0.27.0
head repository: databricks/databricks-sdk-py
compare: v0.29.0
- 16 commits
- 397 files changed
- 9 contributors
Commits on May 16, 2024
Fix null body response to empty in ApiClient (#579)
## Changes

Some delete endpoints return null instead of an empty body; we fix that in the SDK client, since the APIs are not available publicly and support isn't available for them yet.

## Tests

Thanks to @sodle-splunk for contributing the unit test (#641).

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied

Co-authored-by: Scott Odle <[email protected]>
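The shape of the fix can be sketched with a minimal, hypothetical helper (not the SDK's actual code): normalize a `null` or absent response body to an empty dict before handing it back to callers.

```python
import json


def normalize_response_body(raw: bytes) -> dict:
    """Illustrative sketch of the fix: some DELETE endpoints return the
    JSON literal `null` (or no body at all) instead of `{}`, which would
    otherwise surface as `None` to callers expecting a mapping."""
    if not raw:
        # No body at all: treat as an empty JSON object.
        return {}
    parsed = json.loads(raw)
    # A literal `null` parses to None; normalize it to {} as well.
    return parsed if parsed is not None else {}
```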
SHA: 53e99a5

SHA: fabe7c4
Commits on May 17, 2024
-
Better error message when private link enabled workspaces reject requests (#647)
## Changes

This PR ports databricks/databricks-sdk-go#924 to the Python SDK. When a user tries to access a Private Link-enabled workspace configured with no public internet access from a different network than the VPC endpoint belongs to, the Private Link backend redirects the user to the login page, rather than outright rejecting the request. The login page, however, is not a JSON document and cannot be parsed by the SDK, resulting in this error message:

```
$ databricks current-user me
Error: unexpected error handling request: invalid character '<' looking for beginning of value. This is likely a bug in the Databricks SDK for Go or the underlying REST API. Please report this issue with the following debugging information to the SDK issue tracker at https://github.com/databricks/databricks-sdk-go/issues.

Request log:
GET /login.html?error=private-link-validation-error:<WSID>
> * Host:
> * Accept: application/json
> * Authorization: REDACTED
> * Referer: https://adb-<WSID>.azuredatabricks.net/api/2.0/preview/scim/v2/Me
> * User-Agent: cli/0.0.0-dev+5ed10bb8ccc1 databricks-sdk-go/0.39.0 go/1.22.2 os/darwin cmd/current-user_me auth/pat
< HTTP/2.0 200 OK
< * Cache-Control: no-cache, no-store, must-revalidate
< * Content-Security-Policy: default-src *; font-src * data:; frame-src * blob:; img-src * blob: data:; media-src * data:; object-src 'none'; style-src * 'unsafe-inline'; worker-src * blob:; script-src 'self' 'unsafe-eval' 'unsafe-hashes' 'report-sample' https://*.databricks.com https://databricks.github.io/debug-bookmarklet/ https://widget.intercom.io https://js.intercomcdn.com https://ajax.googleapis.com/ajax/libs/jquery/2.2.0/jquery.min.js https://databricks-ui-assets.azureedge.net https://ui-serving-cdn-testing.azureedge.net https://uiserviceprodwestus-cdn-endpoint.azureedge.net https://databricks-ui-infra.s3.us-west-2.amazonaws.com 'sha256-47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU=' 'sha256-YOlue469P2BtTMZYUFLupA2aOUsgc6B/TDewH7/Qz7s=' 'sha256-Lh4yp7cr3YOJ3MOn6erNz3E3WI0JA20mWV+0RuuviFM=' 'sha256-0jMhpY6PB/BTRDLWtfcjdwiHOpE+6rFk3ohvY6fbuHU='; report-uri /ui-csp-reports; frame-ancestors *.vocareum.com *.docebosaas.com *.edx.org *.deloitte.com *.cloudlabs.ai *.databricks.com *.myteksi.net
< * Content-Type: text/html; charset=utf-8
< * Date: Fri, 17 May 2024 07:47:38 GMT
< * Server: databricks
< * Set-Cookie: enable-armeria-workspace-server-for-ui-flags=false; Max-Age=1800; Expires=Fri, 17 May 2024 08:17:38 GMT; Secure; HTTPOnly; SameSite=Strict
< * Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
< * X-Content-Type-Options: nosniff
< * X-Ui-Svc: true
< * X-Xss-Protection: 1; mode=block
< <!doctype html>
< <html>
< <head>
< <meta charset="utf-8">
< <meta http-equiv="Content-Language" content="en">
< <title>Databricks - Sign In</title>
< <meta name="viewport" content="width=960">
< <link rel="icon" type="image/png" href="https://databricks-ui-assets.azureedge.net/favicon.ico">
< <meta http-equiv="content-type" content="text/html; charset=UTF8">
< <script id="__databricks_react_script"></script>
< <script>window.__DATABRICKS_SAFE_FLAGS__={"databricks.infra.showErrorModalOnFetchError":true,"databricks.fe.infra.useReact18":true,"databricks.fe.infra.useReact18NewAPI":false,"databricks.fe.infra.fixConfigPrefetch":true},window.__DATABRICKS_CONFIG__={"publicPath":{"mlflow":"https://databricks-ui-assets.azureedge.net/","dbsql":"https://databricks-ui-assets.azureedge.net/","feature-store":"https://databricks-ui-assets.azureedge.net/","monolith":"https://databricks-ui-assets.azureedge.net/","jaws":"https://databricks-ui-assets.azureedge.net/"}}</script>
< <link rel="icon" href="https://databricks-ui-assets.azureedge.net/favicon.ico">
< <script>
< function setNoCdnAndReload() {
< document.cookie = `x-databricks-cdn-inaccessible=true; path=/; max-age=86400`;
< const metric = 'cdnFallbackOccurred';
< const browserUserAgent = navigator.userAgent;
< const browserTabId = window.browserTabId;
< const performanceEntry = performance.getEntriesByType('resource').filter(e => e.initiatorType === 'script').slice(-1)[0]
< sessionStorage.setItem('databricks-cdn-fallback-telemetry-key', JSON.stringify({ tags: { browserUserAgent, browserTabId }, performanceEntry}));
< window.location.reload();
< }
< </script>
< <script>
< // Set a manual timeout for dropped packets to CDN
< function loadScriptWithTimeout(src, timeout) {
< return new Promise((resolve, reject) => {
< const script = document.createElement('script');
< script.defer = true;
< script.src = src;
< script.onload = resolve;
< script.onerror = reject;
< document.head.appendChild(script);
< setTimeout(() => {
< reject(new Error('Script load timeout'));
< }, timeout);
< });
< }
< loadScriptWithTimeout('https://databricks-ui-assets.azureedge.net/static/js/login/login.8a983ca2.js', 10000).catch(setNoCdnAndReload);
< </script>
< </head>
< <body class="light-mode">
< <uses-legacy-bootstrap>
< <div id="login-page"></div>
< </uses-legacy-bootstrap>
< </body>
< </html>
```

To address this, I add one additional check in the error mapper logic to inspect whether the user was redirected to the login page with the private link validation error response code. If so, we return a custom error, `PrivateLinkValidationError`, with error code `PRIVATE_LINK_VALIDATION_ERROR`, that inherits from PermissionDenied and has a mock 403 status code. After this change, users will see an error message like this:

```
databricks.sdk.errors.private_link.PrivateLinkValidationError: The requested workspace has Azure Private Link enabled and is not accessible from the current network. Ensure that Azure Private Link is properly configured and that your device has access to the Azure Private Link endpoint. For more information, see https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link-standard#authentication-troubleshooting.
```

The error message is tuned to the specific cloud so that we can redirect users to the appropriate documentation, the cloud being inferred from the request URI.

## Tests

Unit tests cover the private link error message mapping. To manually test this, I created a private link workspace in Azure, created an access token, restricted access to the workspace, then ran the `last_job_runs.py` example using the host & token:

```
/Users/miles/databricks-cli/.venv/bin/python /Users/miles/databricks-sdk-py/examples/last_job_runs.py
2024-05-17 11:43:32,529 [databricks.sdk][INFO] loading DEFAULT profile from ~/.databrickscfg: host, token
Traceback (most recent call last):
  File "/Users/miles/databricks-sdk-py/examples/last_job_runs.py", line 20, in <module>
    for job in w.jobs.list():
  File "/Users/miles/databricks-sdk-py/databricks/sdk/service/jobs.py", line 5453, in list
    json = self._api.do('GET', '/api/2.1/jobs/list', query=query, headers=headers)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/miles/databricks-sdk-py/databricks/sdk/core.py", line 131, in do
    response = retryable(self._perform)(method,
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/miles/databricks-sdk-py/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/Users/miles/databricks-sdk-py/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/miles/databricks-sdk-py/databricks/sdk/core.py", line 245, in _perform
    raise self._make_nicer_error(response=response) from None
databricks.sdk.errors.private_link.PrivateLinkValidationError: The requested workspace has Azure Private Link enabled and is not accessible from the current network. Ensure that Azure Private Link is properly configured and that your device has access to the Azure Private Link endpoint. For more information, see https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link-standard#authentication-troubleshooting.
```

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
SHA: b13042b
Commits on May 21, 2024
-
Create a method to generate OAuth tokens (#644)
## Changes

Add method to get OAuth tokens

## Tests

- [X] `make test` run locally
- [X] `make fmt` applied
- [ ] relevant integration tests applied
- [X] Manual test (cannot be run as integration tests due to limitations in the current infrastructure setup)

```
def test():
    w = WorkspaceClient(profile='DEFAULT')
    auth_details = f'"type":"workspace_permission","object_type":"serving-endpoints","object_path":"/serving-endpoints/REDACTED","actions":["query_inference_endpoint"]'
    auth_details = "[{" + auth_details + "}]"
    t = w.api_client.get_oauth_token(auth_details)
    print(t)
```

Result:

```
Token(access_token='REDACTED', token_type='Bearer', refresh_token=None, expiry=datetime.datetime(2024, 5, 16, 11, 9, 8, 221008))
```

SHA: db1f4ae
Commits on May 23, 2024
Revert "Create a method to generate OAuth tokens (#644)" (#653)
## Changes

This reverts commit db1f4ae. Azure CredentialsProvider is impacted by the change: https://github.com/databricks-eng/eng-dev-ecosystem/actions/runs/9187156296/job/25264245409

Reverting to unblock the release.

## Tests

- [X] `make test` run locally
- [X] `make fmt` applied
- [X] relevant integration tests applied
SHA: 9c5fae7
### Improvements and new features

* Better error message when private link enabled workspaces reject requests ([#647](#647)).

### API Changes

* Renamed [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service to [w.quality_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/quality_monitors.html).
* Added `databricks.sdk.service.vectorsearch.ListValue` dataclass.
* Added `databricks.sdk.service.vectorsearch.MapStringValueEntry` dataclass.
* Added `databricks.sdk.service.vectorsearch.ScanVectorIndexRequest` dataclass.
* Added `databricks.sdk.service.vectorsearch.ScanVectorIndexResponse` dataclass.
* Added `databricks.sdk.service.vectorsearch.Struct` dataclass.
* Added `databricks.sdk.service.vectorsearch.Value` dataclass.
* Added `databricks.sdk.service.catalog.ListConnectionsRequest` dataclass.
* Added `databricks.sdk.service.catalog.MonitorRefreshListResponse` dataclass.
* Added `databricks.sdk.service.pipelines.IngestionGatewayPipelineDefinition` dataclass.
* Added `databricks.sdk.service.pipelines.TableSpecificConfig` dataclass.
* Added `databricks.sdk.service.pipelines.TableSpecificConfigScdType` dataclass.
* Added `databricks.sdk.service.serving.AppDeploymentArtifacts` dataclass.
* Removed `databricks.sdk.service.catalog.EnableSchemaName` dataclass.
* Removed `databricks.sdk.service.catalog.DisableSchemaName` dataclass.
* Removed `databricks.sdk.service.marketplace.SortBySpec` dataclass.
* Removed `databricks.sdk.service.marketplace.SortOrder` dataclass.
* Renamed `databricks.sdk.service.catalog.DeleteLakehouseMonitorRequest` dataclass to `databricks.sdk.service.catalog.DeleteQualityMonitorRequest`.
* Renamed `databricks.sdk.service.catalog.GetLakehouseMonitorRequest` dataclass to `databricks.sdk.service.catalog.GetQualityMonitorRequest`.
* Added `next_page_token` field for `databricks.sdk.service.catalog.ListConnectionsResponse`.
* Added `dashboard_id` field for `databricks.sdk.service.catalog.UpdateMonitor`.
* Added `is_ascending` and `sort_by` fields for `databricks.sdk.service.marketplace.ListListingsRequest`.
* Added `is_ascending` field for `databricks.sdk.service.marketplace.SearchListingsRequest`.
* Added `gateway_definition` field for `databricks.sdk.service.pipelines.CreatePipeline`.
* Added `gateway_definition` field for `databricks.sdk.service.pipelines.EditPipeline`.
* Added `table_configuration` field for `databricks.sdk.service.pipelines.ManagedIngestionPipelineDefinition`.
* Added `gateway_definition` field for `databricks.sdk.service.pipelines.PipelineSpec`.
* Added `table_configuration` field for `databricks.sdk.service.pipelines.SchemaSpec`.
* Added `table_configuration` field for `databricks.sdk.service.pipelines.TableSpec`.
* Added `deployment_artifacts` field for `databricks.sdk.service.serving.AppDeployment`.
* Added `route_optimized` field for `databricks.sdk.service.serving.CreateServingEndpoint`.
* Added `contents` field for `databricks.sdk.service.serving.ExportMetricsResponse`.
* Added `microsoft_entra_client_id`, `microsoft_entra_client_secret` and `microsoft_entra_tenant_id` fields for `databricks.sdk.service.serving.OpenAiConfig`.
* Added `endpoint_url` and `route_optimized` fields for `databricks.sdk.service.serving.ServingEndpointDetailed`.
* Added `storage_root` field for `databricks.sdk.service.sharing.CreateShare`.
* Added `storage_location` and `storage_root` fields for `databricks.sdk.service.sharing.ShareInfo`.
* Added `storage_root` field for `databricks.sdk.service.sharing.UpdateShare`.
* Added `embedding_writeback_table` field for `databricks.sdk.service.vectorsearch.DeltaSyncVectorIndexSpecRequest`.
* Added `embedding_writeback_table` field for `databricks.sdk.service.vectorsearch.DeltaSyncVectorIndexSpecResponse`.
* Changed `schema_name` field for `databricks.sdk.service.catalog.DisableRequest` to `str` dataclass.
* Changed `schema_name` field for `databricks.sdk.service.catalog.EnableRequest` to `str` dataclass.
* Changed `cluster_status()` method for [w.libraries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/libraries.html) workspace-level service to return `databricks.sdk.service.compute.ClusterLibraryStatuses` dataclass.
* Changed `spec` and `cluster_source` fields for `databricks.sdk.service.compute.ClusterDetails` to `databricks.sdk.service.compute.ClusterSpec` dataclass.
* Changed `openai_api_key` field for `databricks.sdk.service.serving.OpenAiConfig` to no longer be required.
* Removed `cluster_source` field for `databricks.sdk.service.compute.ClusterAttributes`.
* Removed `cluster_source` field for `databricks.sdk.service.compute.ClusterSpec`.
* Removed `databricks.sdk.service.compute.ClusterStatusResponse` dataclass.
* Removed `cluster_source` field for `databricks.sdk.service.compute.CreateCluster`.
* Removed `clone_from` and `cluster_source` fields for `databricks.sdk.service.compute.EditCluster`.
* Removed `sort_by_spec` field for `databricks.sdk.service.marketplace.ListListingsRequest`.
* Added `scan_index()` method for [w.vector_search_indexes](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/vector_search_indexes.html) workspace-level service.
* Changed `list()` method for [w.connections](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/connections.html) workspace-level service to require request of `databricks.sdk.service.catalog.ListConnectionsRequest` dataclass.

OpenAPI SHA: 7eb5ad9a2ed3e3f1055968a2d1014ac92c06fe92, Date: 2024-05-21
SHA: 3a9b82a
Commits on May 24, 2024
Revert "Revert "Create a method to generate OAuth tokens (#644)" (#653)…
SHA: 2bafb95
Commits on May 29, 2024
Patch `dbutils.notebook.entry_point...` to return current local notebook path from env var (#618)
## Changes

* Users are using `dbutils.notebook.entry_point.getDbutils().notebook().getContext().notebookPath().get()` to get the current notebook path in notebooks (referring to https://stackoverflow.com/questions/53523560/databricks-how-do-i-get-path-of-current-notebook).
* This does not work from our current dbutils proxy. This PR patches this to read from a DATABRICKS_SOURCE_FILE env var when using local dbutils.

## Proposal

* Allow users to add their own patches to make it easier for users using dbutils from the SDK to work around such issues in the future.

## Tests

* Unit tests

- [x] `make test` run locally
- [x] `make fmt` applied
- [x] relevant integration tests applied

Signed-off-by: Kartik Gupta <[email protected]>
Co-authored-by: Miles Yucht <[email protected]>
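The patched fallback can be sketched as a tiny standalone helper (the function name is illustrative; the `DATABRICKS_SOURCE_FILE` env var is the one named in the PR):

```python
import os


def current_notebook_path() -> str:
    """Illustrative sketch: outside a real workspace, resolve the
    "current notebook" path from the DATABRICKS_SOURCE_FILE environment
    variable instead of the notebook execution context."""
    path = os.environ.get("DATABRICKS_SOURCE_FILE")
    if not path:
        raise ValueError("DATABRICKS_SOURCE_FILE is not set")
    return path
```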
SHA: 7f71835
Commits on May 31, 2024
Ignore DataPlane Services during generation (#663)
## Changes

Ignore DataPlane Services during generation until proper support is implemented.

## Tests

Generated SDK using local universe (which contains DataPlane services).

- [X] `make test` passing
- [X] `make fmt` applied
- [ ] relevant integration tests applied
SHA: a714146
Commits on Jun 4, 2024
## Changes

Notable changes:

* Pagination of account-level storage credentials
* Add batch get APIs for marketplace listings
* Rename app deployment method

## Tests

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
SHA: a6e3710
Commits on Jun 7, 2024
Fixed Interactive OAuth on Azure & updated documentation (#669)
## Changes

- Removed Azure check in interactive OAuth
- Updated Flask with OAuth example
- Updated OAuth documentation

## Tests

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
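For context on what interactive (browser-based) OAuth involves, here is a generic PKCE sketch using only the standard library. This is an illustration of the authorization-code-with-PKCE pieces such a flow relies on, not the SDK's internal implementation:

```python
import base64
import hashlib
import secrets


def make_pkce_pair() -> tuple[str, str]:
    """Generic PKCE sketch: the client keeps the verifier secret and
    sends the S256 challenge in the authorization request; the token
    exchange later proves possession of the verifier."""
    # 32 random bytes, base64url-encoded without padding (43 chars).
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # S256 challenge: base64url(sha256(verifier)), also unpadded.
    digest = hashlib.sha256(verifier.encode()).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge
```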
SHA: 455a14c
Commits on Jun 12, 2024
Retry failed integration tests (#674)
## Changes

Retry failed integration tests.

## Tests

- [X] `make test` run locally
- [X] `make fmt` applied
- [X] relevant integration tests applied
- [X] `make dev integration` applied
SHA: f50af60
Commits on Jun 17, 2024
Fix documentation examples (#676)
## Changes

Split examples into workspace and account examples to avoid workspace examples showing in account services which share a name.

## Tests

- [X] Generated documentation
- [X] `make test` run locally
- [X] `make fmt` applied
- [ ] relevant integration tests applied
SHA: 9eaaf1e
Commits on Jun 19, 2024
Add `serverless_compute_id` field to the config (#685)
## Changes

We propose adding a new `serverless_compute_id` attribute in the config that will be used to enable serverless compute in the downstream applications. The default value to enable serverless is `serverless_compute_id = auto`; other values might be used in the future if needed. The config doesn't have to validate the value in `serverless_compute_id`; this will be done in the downstream applications.

## Tests

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
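A minimal sketch of how such a config attribute typically resolves (the function and the `DATABRICKS_SERVERLESS_COMPUTE_ID` env var name are assumptions based on the SDK's usual attribute-to-env-var convention, not confirmed by this PR):

```python
import os
from typing import Optional


def resolve_serverless_compute_id(explicit: Optional[str] = None) -> Optional[str]:
    """Illustrative sketch: an explicitly configured value wins,
    otherwise fall back to an environment variable. The value is
    deliberately not validated here -- per the PR, downstream
    applications interpret it (e.g. "auto")."""
    return explicit or os.environ.get("DATABRICKS_SERVERLESS_COMPUTE_ID")
```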
SHA: 4e751f8
Commits on Jun 20, 2024
Added `with_product(...)` and `with_user_agent_extra(...)` public functions to improve telemetry for mid-stream libraries (#679)
## Changes

This PR solves three tracking challenges:

1. Adds a mechanism consistent with the Go SDK public API. See https://github.com/databricks/databricks-sdk-go/blob/00b1d09b24aa9fb971bcf23f3db3e80bf2bec6fe/useragent/user_agent.go#L20-L31 and https://github.com/databricks/databricks-sdk-go/blob/00b1d09b24aa9fb971bcf23f3db3e80bf2bec6fe/useragent/user_agent.go#L49-L54
2. Some of our products, like UCX and Remorph, are used not only as standalone CLIs, but also as mid-stream libraries in other products, where developers don't specify a product name and product version within the WorkspaceClient. This results in missing tracking information for those integrations.
3. Mid-stream libraries, like blueprint, pytester, and lsql, do have their own versions, but they don't create or manage sdk.WorkspaceClient and/or core.Config themselves, so we currently lack traffic attribution for those libraries. Technically, Databricks Connect falls into the same use-case.

## Tests

Moved unit tests that are relevant to User-Agent verification from test_core.py to test_config.py to bring back consistency.
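The mechanism can be sketched as a module-level registry that feeds the User-Agent string. The function names mirror the public API described above; the internals here are a simplification, not the SDK's actual implementation:

```python
# Illustrative sketch: mid-stream libraries register themselves once,
# and every request's User-Agent then carries their name/version.
_product = ("unknown", "0.0.0")
_extras: list = []


def with_product(name: str, version: str) -> None:
    """Set the product name/version (normally called by the end product)."""
    global _product
    _product = (name, version)


def with_user_agent_extra(key: str, value: str) -> None:
    """Append an extra key/version pair (normally called by a library)."""
    _extras.append((key, value))


def to_user_agent() -> str:
    """Render the registered pairs as a User-Agent-style string."""
    pairs = [f"{_product[0]}/{_product[1]}"] + [f"{k}/{v}" for k, v in _extras]
    return " ".join(pairs)
```

A product like UCX would call `with_product("ucx", "1.2.3")` once at import time, while a helper library could call `with_user_agent_extra("blueprint", "0.4.0")` without ever constructing a WorkspaceClient itself (versions here are made up for illustration).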
SHA: 58574b8
Commits on Jun 24, 2024
### Breaking Changes

* Create a method to generate OAuth tokens ([#644](#644)). **NOTE**: this change renames `@credentials_provider`/`CredentialsProvider` to `@credentials_strategy`/`CredentialsStrategy`.

### Improvements and Bug Fixes

* Patch `dbutils.notebook.entry_point...` to return current local notebook path from env var ([#618](#618)).
* Add `serverless_compute_id` field to the config ([#685](#685)).
* Added `with_product(...)` and `with_user_agent_extra(...)` public functions to improve telemetry for mid-stream libraries ([#679](#679)).
* Fixed Interactive OAuth on Azure & updated documentation ([#669](#669)).

### Documentation

* Fix documentation examples ([#676](#676)).

### Internal Changes

* Ignore DataPlane Services during generation ([#663](#663)).
* Update OpenAPI spec ([#667](#667)).
* Retry failed integration tests ([#674](#674)).

### API Changes

* Changed `list()` method for [a.account_storage_credentials](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_storage_credentials.html) account-level service to return `databricks.sdk.service.catalog.ListAccountStorageCredentialsResponse` dataclass.
* Changed `isolation_mode` field for `databricks.sdk.service.catalog.CatalogInfo` to `databricks.sdk.service.catalog.CatalogIsolationMode` dataclass.
* Added `isolation_mode` field for `databricks.sdk.service.catalog.ExternalLocationInfo`.
* Added `max_results` and `page_token` fields for `databricks.sdk.service.catalog.ListCatalogsRequest`.
* Added `next_page_token` field for `databricks.sdk.service.catalog.ListCatalogsResponse`.
* Added `table_serving_url` field for `databricks.sdk.service.catalog.OnlineTable`.
* Added `isolation_mode` field for `databricks.sdk.service.catalog.StorageCredentialInfo`.
* Changed `isolation_mode` field for `databricks.sdk.service.catalog.UpdateCatalog` to `databricks.sdk.service.catalog.CatalogIsolationMode` dataclass.
* Added `isolation_mode` field for `databricks.sdk.service.catalog.UpdateExternalLocation`.
* Added `isolation_mode` field for `databricks.sdk.service.catalog.UpdateStorageCredential`.
* Added `databricks.sdk.service.catalog.CatalogIsolationMode` and `databricks.sdk.service.catalog.ListAccountStorageCredentialsResponse` dataclasses.
* Added `create_schedule()`, `create_subscription()`, `delete_schedule()`, `delete_subscription()`, `get_schedule()`, `get_subscription()`, `list()`, `list_schedules()`, `list_subscriptions()` and `update_schedule()` methods for [w.lakeview](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakeview.html) workspace-level service.
* Added `databricks.sdk.service.dashboards.CreateScheduleRequest`, `databricks.sdk.service.dashboards.CreateSubscriptionRequest`, `databricks.sdk.service.dashboards.CronSchedule`, `databricks.sdk.service.dashboards.DashboardView`, `databricks.sdk.service.dashboards.DeleteScheduleRequest`, `databricks.sdk.service.dashboards.DeleteSubscriptionRequest`, `databricks.sdk.service.dashboards.GetScheduleRequest`, `databricks.sdk.service.dashboards.GetSubscriptionRequest`, `databricks.sdk.service.dashboards.ListDashboardsRequest`, `databricks.sdk.service.dashboards.ListDashboardsResponse`, `databricks.sdk.service.dashboards.ListSchedulesRequest`, `databricks.sdk.service.dashboards.ListSchedulesResponse`, `databricks.sdk.service.dashboards.ListSubscriptionsRequest`, `databricks.sdk.service.dashboards.ListSubscriptionsResponse`, `databricks.sdk.service.dashboards.Schedule`, `databricks.sdk.service.dashboards.SchedulePauseStatus`, `databricks.sdk.service.dashboards.Subscriber`, `databricks.sdk.service.dashboards.Subscription`, `databricks.sdk.service.dashboards.SubscriptionSubscriberDestination`, `databricks.sdk.service.dashboards.SubscriptionSubscriberUser` and `databricks.sdk.service.dashboards.UpdateScheduleRequest` dataclasses.
* Added `termination_category` field for `databricks.sdk.service.jobs.ForEachTaskErrorMessageStats`.
* Added `on_streaming_backlog_exceeded` field for `databricks.sdk.service.jobs.JobEmailNotifications`.
* Added `environment_key` field for `databricks.sdk.service.jobs.RunTask`.
* Removed `condition_task`, `dbt_task`, `notebook_task`, `pipeline_task`, `python_wheel_task`, `run_job_task`, `spark_jar_task`, `spark_python_task`, `spark_submit_task` and `sql_task` fields for `databricks.sdk.service.jobs.SubmitRun`.
* Added `environments` field for `databricks.sdk.service.jobs.SubmitRun`.
* Added `dbt_task` field for `databricks.sdk.service.jobs.SubmitTask`.
* Added `environment_key` field for `databricks.sdk.service.jobs.SubmitTask`.
* Added `on_streaming_backlog_exceeded` field for `databricks.sdk.service.jobs.TaskEmailNotifications`.
* Added `periodic` field for `databricks.sdk.service.jobs.TriggerSettings`.
* Added `on_streaming_backlog_exceeded` field for `databricks.sdk.service.jobs.WebhookNotifications`.
* Added `databricks.sdk.service.jobs.PeriodicTriggerConfiguration` dataclass.
* Added `databricks.sdk.service.jobs.PeriodicTriggerConfigurationTimeUnit` dataclass.
* Added `batch_get()` method for [w.consumer_listings](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/consumer_listings.html) workspace-level service.
* Added `batch_get()` method for [w.consumer_providers](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/consumer_providers.html) workspace-level service.
* Added `provider_summary` field for `databricks.sdk.service.marketplace.Listing`.
* Added `databricks.sdk.service.marketplace.BatchGetListingsRequest`, `databricks.sdk.service.marketplace.BatchGetListingsResponse`, `databricks.sdk.service.marketplace.BatchGetProvidersRequest`, `databricks.sdk.service.marketplace.BatchGetProvidersResponse`, `databricks.sdk.service.marketplace.ProviderIconFile`, `databricks.sdk.service.marketplace.ProviderIconType`, `databricks.sdk.service.marketplace.ProviderListingSummaryInfo` and `databricks.sdk.service.oauth2.DataPlaneInfo` dataclasses.
* Removed `create_deployment()` method for [w.apps](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/apps.html) workspace-level service.
* Added `deploy()` and `start()` methods for [w.apps](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/apps.html) workspace-level service.
* Added [w.serving_endpoints_data_plane](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints_data_plane.html) workspace-level service.
* Added `service_principal_id` and `service_principal_name` fields for `databricks.sdk.service.serving.App`.
* Added `mode` field for `databricks.sdk.service.serving.AppDeployment`.
* Added `mode` field for `databricks.sdk.service.serving.CreateAppDeploymentRequest`.
* Added `data_plane_info` field for `databricks.sdk.service.serving.ServingEndpointDetailed`.
* Added `databricks.sdk.service.serving.AppDeploymentMode`, `databricks.sdk.service.serving.ModelDataPlaneInfo` and `databricks.sdk.service.serving.StartAppRequest` dataclasses.
* Added `query_next_page()` method for [w.vector_search_indexes](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/vector_search_indexes.html) workspace-level service.
* Added `query_type` field for `databricks.sdk.service.vectorsearch.QueryVectorIndexRequest`.
* Added `next_page_token` field for `databricks.sdk.service.vectorsearch.QueryVectorIndexResponse`.

OpenAPI SHA: 7437dabb9dadee402c1fc060df4c1ce8cc5369f0, Date: 2024-06-24
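Several of the endpoints above gained `page_token`/`next_page_token` fields. The convention they follow can be sketched generically (the `fetch` callable and the `items` key are stand-ins for a concrete API call, not the SDK's actual response shape):

```python
from typing import Callable, Iterator, Optional


def paginate(fetch: Callable[[Optional[str]], dict]) -> Iterator[dict]:
    """Generic sketch of token-based pagination: call the endpoint with
    no token first, then keep passing back the returned
    next_page_token until the server stops sending one."""
    token = None
    while True:
        page = fetch(token)
        yield from page.get("items", [])
        token = page.get("next_page_token")
        if not token:
            return
```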
SHA: 228cc8f
You can run this command locally to see the full comparison on your machine:

```
git diff v0.27.0...v0.29.0
```