Databricks SQL¶
These dataclasses are used in the SDK to represent API requests and responses for services in the databricks.sdk.service.sql module.
- class databricks.sdk.service.sql.AccessControl(group_name: 'Optional[str]' = None, permission_level: 'Optional[PermissionLevel]' = None, user_name: 'Optional[str]' = None)¶
- group_name: str | None = None¶
- permission_level: PermissionLevel | None = None¶
* CAN_VIEW: Can view the query
* CAN_RUN: Can run the query
* CAN_EDIT: Can edit the query
* CAN_MANAGE: Can manage the query
- user_name: str | None = None¶
- as_dict() dict¶
Serializes the AccessControl into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AccessControl into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AccessControl¶
Deserializes the AccessControl from a dictionary.
- class databricks.sdk.service.sql.Aggregation¶
- AVG = "AVG"¶
- COUNT = "COUNT"¶
- COUNT_DISTINCT = "COUNT_DISTINCT"¶
- MAX = "MAX"¶
- MEDIAN = "MEDIAN"¶
- MIN = "MIN"¶
- STDDEV = "STDDEV"¶
- SUM = "SUM"¶
- class databricks.sdk.service.sql.Alert(condition: 'Optional[AlertCondition]' = None, create_time: 'Optional[str]' = None, custom_body: 'Optional[str]' = None, custom_subject: 'Optional[str]' = None, display_name: 'Optional[str]' = None, id: 'Optional[str]' = None, lifecycle_state: 'Optional[LifecycleState]' = None, notify_on_ok: 'Optional[bool]' = None, owner_user_name: 'Optional[str]' = None, parent_path: 'Optional[str]' = None, query_id: 'Optional[str]' = None, seconds_to_retrigger: 'Optional[int]' = None, state: 'Optional[AlertState]' = None, trigger_time: 'Optional[str]' = None, update_time: 'Optional[str]' = None)¶
- condition: AlertCondition | None = None¶
Trigger conditions of the alert.
- create_time: str | None = None¶
The timestamp indicating when the alert was created.
- custom_body: str | None = None¶
Custom body of alert notification, if it exists. See [here] for custom templating instructions.
- custom_subject: str | None = None¶
Custom subject of alert notification, if it exists. This can include email subject entries and Slack notification headers, for example. See [here] for custom templating instructions.
- display_name: str | None = None¶
The display name of the alert.
- id: str | None = None¶
UUID identifying the alert.
- lifecycle_state: LifecycleState | None = None¶
The workspace state of the alert. Used for tracking trashed status.
- notify_on_ok: bool | None = None¶
Whether to notify alert subscribers when alert returns back to normal.
- owner_user_name: str | None = None¶
The owner’s username. This field is set to “Unavailable” if the user has been deleted.
- parent_path: str | None = None¶
The workspace path of the folder containing the alert.
- query_id: str | None = None¶
UUID of the query attached to the alert.
- seconds_to_retrigger: int | None = None¶
Number of seconds an alert must wait after being triggered to rearm itself. After rearming, it can be triggered again. If 0 or not specified, the alert will not be triggered again.
- state: AlertState | None = None¶
Current state of the alert’s trigger status. This field is set to UNKNOWN if the alert has not yet been evaluated or encountered an error during the last evaluation.
- trigger_time: str | None = None¶
Timestamp when the alert was last triggered, if the alert has been triggered before.
- update_time: str | None = None¶
The timestamp indicating when the alert was updated.
- as_dict() dict¶
Serializes the Alert into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the Alert into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.sql.AlertCondition(empty_result_state: 'Optional[AlertState]' = None, op: 'Optional[AlertOperator]' = None, operand: 'Optional[AlertConditionOperand]' = None, threshold: 'Optional[AlertConditionThreshold]' = None)¶
- empty_result_state: AlertState | None = None¶
Alert state if result is empty.
- op: AlertOperator | None = None¶
Operator used for comparison in alert evaluation.
- operand: AlertConditionOperand | None = None¶
Name of the column from the query result to use for comparison in alert evaluation.
- threshold: AlertConditionThreshold | None = None¶
Threshold value used for comparison in alert evaluation.
- as_dict() dict¶
Serializes the AlertCondition into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertCondition into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AlertCondition¶
Deserializes the AlertCondition from a dictionary.
- class databricks.sdk.service.sql.AlertConditionOperand(column: 'Optional[AlertOperandColumn]' = None)¶
- column: AlertOperandColumn | None = None¶
- as_dict() dict¶
Serializes the AlertConditionOperand into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertConditionOperand into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AlertConditionOperand¶
Deserializes the AlertConditionOperand from a dictionary.
- class databricks.sdk.service.sql.AlertConditionThreshold(value: 'Optional[AlertOperandValue]' = None)¶
- value: AlertOperandValue | None = None¶
- as_dict() dict¶
Serializes the AlertConditionThreshold into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertConditionThreshold into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AlertConditionThreshold¶
Deserializes the AlertConditionThreshold from a dictionary.
- class databricks.sdk.service.sql.AlertEvaluationState¶
* UNSPECIFIED - default proto enum value; do not use it in code
* UNKNOWN - alert not yet evaluated
* TRIGGERED - alert is triggered
* OK - alert is not triggered
* ERROR - alert evaluation failed
- ERROR = "ERROR"¶
- OK = "OK"¶
- TRIGGERED = "TRIGGERED"¶
- UNKNOWN = "UNKNOWN"¶
- class databricks.sdk.service.sql.AlertOperandColumn(name: 'Optional[str]' = None)¶
- name: str | None = None¶
- as_dict() dict¶
Serializes the AlertOperandColumn into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertOperandColumn into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AlertOperandColumn¶
Deserializes the AlertOperandColumn from a dictionary.
- class databricks.sdk.service.sql.AlertOperandValue(bool_value: 'Optional[bool]' = None, double_value: 'Optional[float]' = None, string_value: 'Optional[str]' = None)¶
- bool_value: bool | None = None¶
- double_value: float | None = None¶
- string_value: str | None = None¶
- as_dict() dict¶
Serializes the AlertOperandValue into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertOperandValue into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AlertOperandValue¶
Deserializes the AlertOperandValue from a dictionary.
- class databricks.sdk.service.sql.AlertOperator¶
- EQUAL = "EQUAL"¶
- GREATER_THAN = "GREATER_THAN"¶
- GREATER_THAN_OR_EQUAL = "GREATER_THAN_OR_EQUAL"¶
- IS_NULL = "IS_NULL"¶
- LESS_THAN = "LESS_THAN"¶
- LESS_THAN_OR_EQUAL = "LESS_THAN_OR_EQUAL"¶
- NOT_EQUAL = "NOT_EQUAL"¶
- class databricks.sdk.service.sql.AlertOptions(column: str, op: str, value: Any, custom_body: str | None = None, custom_subject: str | None = None, empty_result_state: AlertOptionsEmptyResultState | None = None, muted: bool | None = None)¶
Alert configuration options.
- column: str¶
Name of column in the query result to compare in alert evaluation.
- op: str¶
Operator used to compare in alert evaluation: >, >=, <, <=, ==, !=
- value: Any¶
Value used for comparison in alert evaluation. Supported types include strings (e.g. ‘foobar’), floats (e.g. 123.4), and booleans (true).
- custom_body: str | None = None¶
Custom body of alert notification, if it exists. See [here] for custom templating instructions.
- custom_subject: str | None = None¶
Custom subject of alert notification, if it exists. This includes email subject, Slack notification header, etc. See [here] for custom templating instructions.
- empty_result_state: AlertOptionsEmptyResultState | None = None¶
State that alert evaluates to when query result is empty.
- muted: bool | None = None¶
Whether or not the alert is muted. If an alert is muted, it will not notify users and notification destinations when triggered.
- as_dict() dict¶
Serializes the AlertOptions into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertOptions into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AlertOptions¶
Deserializes the AlertOptions from a dictionary.
- class databricks.sdk.service.sql.AlertOptionsEmptyResultState¶
State that alert evaluates to when query result is empty.
- OK = "OK"¶
- TRIGGERED = "TRIGGERED"¶
- UNKNOWN = "UNKNOWN"¶
- class databricks.sdk.service.sql.AlertQuery(created_at: 'Optional[str]' = None, data_source_id: 'Optional[str]' = None, description: 'Optional[str]' = None, id: 'Optional[str]' = None, is_archived: 'Optional[bool]' = None, is_draft: 'Optional[bool]' = None, is_safe: 'Optional[bool]' = None, name: 'Optional[str]' = None, options: 'Optional[QueryOptions]' = None, query: 'Optional[str]' = None, tags: 'Optional[List[str]]' = None, updated_at: 'Optional[str]' = None, user_id: 'Optional[int]' = None)¶
- created_at: str | None = None¶
The timestamp when this query was created.
- data_source_id: str | None = None¶
Data source ID maps to the ID of the data source used by the resource and is distinct from the warehouse ID. [Learn more]
[Learn more]: https://docs.databricks.com/api/workspace/datasources/list
- description: str | None = None¶
General description that conveys additional information about this query such as usage notes.
- id: str | None = None¶
Query ID.
- is_archived: bool | None = None¶
Indicates whether the query is trashed. Trashed queries can’t be used in dashboards and don’t appear in search results. If this boolean is true, the options property for this query includes a moved_to_trash_at timestamp. Trashed queries are permanently deleted after 30 days.
- is_draft: bool | None = None¶
Whether the query is a draft. Draft queries only appear in list views for their owners. Visualizations from draft queries cannot appear on dashboards.
- is_safe: bool | None = None¶
Text parameter types are not safe from SQL injection for all types of data source. Set this Boolean parameter to true if a query either does not use any text type parameters or uses a data source type where text type parameters are handled safely.
- name: str | None = None¶
The title of this query that appears in list views, widget headings, and on the query page.
- options: QueryOptions | None = None¶
- query: str | None = None¶
The text of the query to be run.
- tags: List[str] | None = None¶
- updated_at: str | None = None¶
The timestamp at which this query was last updated.
- user_id: int | None = None¶
The ID of the user who owns the query.
- as_dict() dict¶
Serializes the AlertQuery into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertQuery into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AlertQuery¶
Deserializes the AlertQuery from a dictionary.
- class databricks.sdk.service.sql.AlertState¶
- OK = "OK"¶
- TRIGGERED = "TRIGGERED"¶
- UNKNOWN = "UNKNOWN"¶
- class databricks.sdk.service.sql.AlertV2(display_name: 'str', query_text: 'str', warehouse_id: 'str', evaluation: 'AlertV2Evaluation', schedule: 'CronSchedule', create_time: 'Optional[str]' = None, custom_description: 'Optional[str]' = None, custom_summary: 'Optional[str]' = None, effective_run_as: 'Optional[AlertV2RunAs]' = None, id: 'Optional[str]' = None, lifecycle_state: 'Optional[AlertLifecycleState]' = None, owner_user_name: 'Optional[str]' = None, parent_path: 'Optional[str]' = None, run_as: 'Optional[AlertV2RunAs]' = None, run_as_user_name: 'Optional[str]' = None, update_time: 'Optional[str]' = None)¶
- display_name: str¶
The display name of the alert.
- query_text: str¶
Text of the query to be run.
- warehouse_id: str¶
ID of the SQL warehouse attached to the alert.
- evaluation: AlertV2Evaluation¶
- schedule: CronSchedule¶
- create_time: str | None = None¶
The timestamp indicating when the alert was created.
- custom_description: str | None = None¶
Custom description for the alert. Supports Mustache templates.
- custom_summary: str | None = None¶
Custom summary for the alert. Supports Mustache templates.
- effective_run_as: AlertV2RunAs | None = None¶
The actual identity that will be used to execute the alert. This is an output-only field that shows the resolved run-as identity after applying permissions and defaults.
- id: str | None = None¶
UUID identifying the alert.
- lifecycle_state: AlertLifecycleState | None = None¶
Indicates whether the alert is trashed.
- owner_user_name: str | None = None¶
The owner’s username. This field is set to “Unavailable” if the user has been deleted.
- parent_path: str | None = None¶
The workspace path of the folder containing the alert. Can only be set on create, and cannot be updated.
- run_as: AlertV2RunAs | None = None¶
Specifies the identity that will be used to run the alert. This field allows you to configure alerts to run as a specific user or service principal.
* For user identity: set user_name to the email of an active workspace user. Users can only set this to their own email.
* For service principal: set service_principal_name to the application ID. Requires the servicePrincipal/user role.
If not specified, the alert will run as the request user.
- run_as_user_name: str | None = None¶
The run as username or application ID of service principal. On Create and Update, this field can be set to application ID of an active service principal. Setting this field requires the servicePrincipal/user role. Deprecated: Use run_as field instead. This field will be removed in a future release.
- update_time: str | None = None¶
The timestamp indicating when the alert was updated.
- as_dict() dict¶
Serializes the AlertV2 into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertV2 into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.sql.AlertV2Evaluation(source: 'AlertV2OperandColumn', comparison_operator: 'ComparisonOperator', empty_result_state: 'Optional[AlertEvaluationState]' = None, last_evaluated_at: 'Optional[str]' = None, notification: 'Optional[AlertV2Notification]' = None, state: 'Optional[AlertEvaluationState]' = None, threshold: 'Optional[AlertV2Operand]' = None)¶
- source: AlertV2OperandColumn¶
Source column from the query result used to evaluate the alert.
- comparison_operator: ComparisonOperator¶
Operator used for comparison in alert evaluation.
- empty_result_state: AlertEvaluationState | None = None¶
Alert state if the result is empty. Avoid setting this field to UNKNOWN, as the UNKNOWN state is planned for deprecation.
- last_evaluated_at: str | None = None¶
Timestamp of the last evaluation.
- notification: AlertV2Notification | None = None¶
User or Notification Destination to notify when alert is triggered.
- state: AlertEvaluationState | None = None¶
Latest state of alert evaluation.
- threshold: AlertV2Operand | None = None¶
Threshold to use for alert evaluation; can be a column or a value.
- as_dict() dict¶
Serializes the AlertV2Evaluation into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertV2Evaluation into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AlertV2Evaluation¶
Deserializes the AlertV2Evaluation from a dictionary.
- class databricks.sdk.service.sql.AlertV2Notification(notify_on_ok: 'Optional[bool]' = None, retrigger_seconds: 'Optional[int]' = None, subscriptions: 'Optional[List[AlertV2Subscription]]' = None)¶
- notify_on_ok: bool | None = None¶
Whether to notify alert subscribers when alert returns back to normal.
- retrigger_seconds: int | None = None¶
Number of seconds an alert waits after being triggered before it is allowed to send another notification. If set to 0 or omitted, the alert will not send any further notifications after the first trigger. Setting this value to 1 allows the alert to send a notification on every evaluation where the condition is met, effectively making it always retrigger for notification purposes.
- subscriptions: List[AlertV2Subscription] | None = None¶
- as_dict() dict¶
Serializes the AlertV2Notification into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertV2Notification into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AlertV2Notification¶
Deserializes the AlertV2Notification from a dictionary.
- class databricks.sdk.service.sql.AlertV2Operand(column: 'Optional[AlertV2OperandColumn]' = None, value: 'Optional[AlertV2OperandValue]' = None)¶
- column: AlertV2OperandColumn | None = None¶
- value: AlertV2OperandValue | None = None¶
- as_dict() dict¶
Serializes the AlertV2Operand into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertV2Operand into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AlertV2Operand¶
Deserializes the AlertV2Operand from a dictionary.
- class databricks.sdk.service.sql.AlertV2OperandColumn(name: 'str', aggregation: 'Optional[Aggregation]' = None, display: 'Optional[str]' = None)¶
- name: str¶
- aggregation: Aggregation | None = None¶
If not set, the behavior is equivalent to using First row in the UI.
- display: str | None = None¶
- as_dict() dict¶
Serializes the AlertV2OperandColumn into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertV2OperandColumn into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AlertV2OperandColumn¶
Deserializes the AlertV2OperandColumn from a dictionary.
- class databricks.sdk.service.sql.AlertV2OperandValue(bool_value: 'Optional[bool]' = None, double_value: 'Optional[float]' = None, string_value: 'Optional[str]' = None)¶
- bool_value: bool | None = None¶
- double_value: float | None = None¶
- string_value: str | None = None¶
- as_dict() dict¶
Serializes the AlertV2OperandValue into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertV2OperandValue into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AlertV2OperandValue¶
Deserializes the AlertV2OperandValue from a dictionary.
- class databricks.sdk.service.sql.AlertV2RunAs(service_principal_name: 'Optional[str]' = None, user_name: 'Optional[str]' = None)¶
- service_principal_name: str | None = None¶
Application ID of an active service principal. Setting this field requires the servicePrincipal/user role.
- user_name: str | None = None¶
The email of an active workspace user. Users can only set this field to their own email.
- as_dict() dict¶
Serializes the AlertV2RunAs into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertV2RunAs into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AlertV2RunAs¶
Deserializes the AlertV2RunAs from a dictionary.
- class databricks.sdk.service.sql.AlertV2Subscription(destination_id: 'Optional[str]' = None, user_email: 'Optional[str]' = None)¶
- destination_id: str | None = None¶
- user_email: str | None = None¶
- as_dict() dict¶
Serializes the AlertV2Subscription into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the AlertV2Subscription into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) AlertV2Subscription¶
Deserializes the AlertV2Subscription from a dictionary.
- class databricks.sdk.service.sql.BaseChunkInfo(byte_count: 'Optional[int]' = None, chunk_index: 'Optional[int]' = None, row_count: 'Optional[int]' = None, row_offset: 'Optional[int]' = None)¶
- byte_count: int | None = None¶
The number of bytes in the result chunk. This field is not available when using INLINE disposition.
- chunk_index: int | None = None¶
The position within the sequence of result set chunks.
- row_count: int | None = None¶
The number of rows within the result chunk.
- row_offset: int | None = None¶
The starting row offset within the result set.
- as_dict() dict¶
Serializes the BaseChunkInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the BaseChunkInfo into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) BaseChunkInfo¶
Deserializes the BaseChunkInfo from a dictionary.
- class databricks.sdk.service.sql.Channel(dbsql_version: str | None = None, name: ChannelName | None = None)¶
Configures the channel name and DBSQL version of the warehouse. CHANNEL_NAME_CUSTOM should be chosen only when dbsql_version is specified.
- dbsql_version: str | None = None¶
- name: ChannelName | None = None¶
- as_dict() dict¶
Serializes the Channel into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the Channel into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.sql.ChannelInfo(dbsql_version: str | None = None, name: ChannelName | None = None)¶
Details about a Channel.
- dbsql_version: str | None = None¶
DBSQL version the channel is mapped to.
- name: ChannelName | None = None¶
Name of the channel.
- as_dict() dict¶
Serializes the ChannelInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ChannelInfo into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ChannelInfo¶
Deserializes the ChannelInfo from a dictionary.
- class databricks.sdk.service.sql.ChannelName¶
- CHANNEL_NAME_CURRENT = "CHANNEL_NAME_CURRENT"¶
- CHANNEL_NAME_CUSTOM = "CHANNEL_NAME_CUSTOM"¶
- CHANNEL_NAME_PREVIEW = "CHANNEL_NAME_PREVIEW"¶
- CHANNEL_NAME_PREVIOUS = "CHANNEL_NAME_PREVIOUS"¶
- class databricks.sdk.service.sql.ClientConfig(allow_custom_js_visualizations: 'Optional[bool]' = None, allow_downloads: 'Optional[bool]' = None, allow_external_shares: 'Optional[bool]' = None, allow_subscriptions: 'Optional[bool]' = None, date_format: 'Optional[str]' = None, date_time_format: 'Optional[str]' = None, disable_publish: 'Optional[bool]' = None, enable_legacy_autodetect_types: 'Optional[bool]' = None, feature_show_permissions_control: 'Optional[bool]' = None, hide_plotly_mode_bar: 'Optional[bool]' = None)¶
- allow_custom_js_visualizations: bool | None = None¶
allow_custom_js_visualizations is not supported/implemented.
- allow_downloads: bool | None = None¶
- allow_subscriptions: bool | None = None¶
- date_format: str | None = None¶
- date_time_format: str | None = None¶
- disable_publish: bool | None = None¶
- enable_legacy_autodetect_types: bool | None = None¶
- feature_show_permissions_control: bool | None = None¶
- hide_plotly_mode_bar: bool | None = None¶
- as_dict() dict¶
Serializes the ClientConfig into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ClientConfig into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ClientConfig¶
Deserializes the ClientConfig from a dictionary.
- class databricks.sdk.service.sql.ColumnInfo(name: 'Optional[str]' = None, position: 'Optional[int]' = None, type_interval_type: 'Optional[str]' = None, type_name: 'Optional[ColumnInfoTypeName]' = None, type_precision: 'Optional[int]' = None, type_scale: 'Optional[int]' = None, type_text: 'Optional[str]' = None)¶
- name: str | None = None¶
The name of the column.
- position: int | None = None¶
The ordinal position of the column (starting at position 0).
- type_interval_type: str | None = None¶
The format of the interval type.
- type_name: ColumnInfoTypeName | None = None¶
The name of the base data type. This doesn’t include details for complex types such as STRUCT, MAP or ARRAY.
- type_precision: int | None = None¶
Specifies the number of digits in a number. This applies to the DECIMAL type.
- type_scale: int | None = None¶
Specifies the number of digits to the right of the decimal point in a number. This applies to the DECIMAL type.
- type_text: str | None = None¶
The full SQL type specification.
- as_dict() dict¶
Serializes the ColumnInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ColumnInfo into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ColumnInfo¶
Deserializes the ColumnInfo from a dictionary.
- class databricks.sdk.service.sql.ColumnInfoTypeName¶
The name of the base data type. This doesn’t include details for complex types such as STRUCT, MAP or ARRAY.
- ARRAY = "ARRAY"¶
- BINARY = "BINARY"¶
- BOOLEAN = "BOOLEAN"¶
- BYTE = "BYTE"¶
- CHAR = "CHAR"¶
- DATE = "DATE"¶
- DECIMAL = "DECIMAL"¶
- DOUBLE = "DOUBLE"¶
- FLOAT = "FLOAT"¶
- INT = "INT"¶
- INTERVAL = "INTERVAL"¶
- LONG = "LONG"¶
- MAP = "MAP"¶
- NULL = "NULL"¶
- SHORT = "SHORT"¶
- STRING = "STRING"¶
- STRUCT = "STRUCT"¶
- TIMESTAMP = "TIMESTAMP"¶
- USER_DEFINED_TYPE = "USER_DEFINED_TYPE"¶
- class databricks.sdk.service.sql.ComparisonOperator¶
- EQUAL = "EQUAL"¶
- GREATER_THAN = "GREATER_THAN"¶
- GREATER_THAN_OR_EQUAL = "GREATER_THAN_OR_EQUAL"¶
- IS_NOT_NULL = "IS_NOT_NULL"¶
- IS_NULL = "IS_NULL"¶
- LESS_THAN = "LESS_THAN"¶
- LESS_THAN_OR_EQUAL = "LESS_THAN_OR_EQUAL"¶
- NOT_EQUAL = "NOT_EQUAL"¶
- class databricks.sdk.service.sql.CreateAlertRequestAlert(condition: 'Optional[AlertCondition]' = None, custom_body: 'Optional[str]' = None, custom_subject: 'Optional[str]' = None, display_name: 'Optional[str]' = None, notify_on_ok: 'Optional[bool]' = None, parent_path: 'Optional[str]' = None, query_id: 'Optional[str]' = None, seconds_to_retrigger: 'Optional[int]' = None)¶
- condition: AlertCondition | None = None¶
Trigger conditions of the alert.
- custom_body: str | None = None¶
Custom body of alert notification, if it exists. See [here] for custom templating instructions.
- custom_subject: str | None = None¶
Custom subject of alert notification, if it exists. This can include email subject entries and Slack notification headers, for example. See [here] for custom templating instructions.
- display_name: str | None = None¶
The display name of the alert.
- notify_on_ok: bool | None = None¶
Whether to notify alert subscribers when alert returns back to normal.
- parent_path: str | None = None¶
The workspace path of the folder containing the alert.
- query_id: str | None = None¶
UUID of the query attached to the alert.
- seconds_to_retrigger: int | None = None¶
Number of seconds an alert must wait after being triggered to rearm itself. After rearming, it can be triggered again. If 0 or not specified, the alert will not be triggered again.
- as_dict() dict¶
Serializes the CreateAlertRequestAlert into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the CreateAlertRequestAlert into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CreateAlertRequestAlert¶
Deserializes the CreateAlertRequestAlert from a dictionary.
- class databricks.sdk.service.sql.CreateQueryRequestQuery(apply_auto_limit: 'Optional[bool]' = None, catalog: 'Optional[str]' = None, description: 'Optional[str]' = None, display_name: 'Optional[str]' = None, parameters: 'Optional[List[QueryParameter]]' = None, parent_path: 'Optional[str]' = None, query_text: 'Optional[str]' = None, run_as_mode: 'Optional[RunAsMode]' = None, schema: 'Optional[str]' = None, tags: 'Optional[List[str]]' = None, warehouse_id: 'Optional[str]' = None)¶
- apply_auto_limit: bool | None = None¶
Whether to apply a 1000 row limit to the query result.
- catalog: str | None = None¶
Name of the catalog where this query will be executed.
- description: str | None = None¶
General description that conveys additional information about this query such as usage notes.
- display_name: str | None = None¶
Display name of the query that appears in list views, widget headings, and on the query page.
- parameters: List[QueryParameter] | None = None¶
List of query parameter definitions.
- parent_path: str | None = None¶
Workspace path of the workspace folder containing the object.
- query_text: str | None = None¶
Text of the query to be run.
- schema: str | None = None¶
Name of the schema where this query will be executed.
- tags: List[str] | None = None¶
- warehouse_id: str | None = None¶
ID of the SQL warehouse attached to the query.
- as_dict() dict¶
Serializes the CreateQueryRequestQuery into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the CreateQueryRequestQuery into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CreateQueryRequestQuery¶
Deserializes the CreateQueryRequestQuery from a dictionary.
- class databricks.sdk.service.sql.CreateVisualizationRequestVisualization(display_name: 'Optional[str]' = None, query_id: 'Optional[str]' = None, serialized_options: 'Optional[str]' = None, serialized_query_plan: 'Optional[str]' = None, type: 'Optional[str]' = None)¶
- display_name: str | None = None¶
The display name of the visualization.
- query_id: str | None = None¶
UUID of the query that the visualization is attached to.
- serialized_options: str | None = None¶
The visualization options vary widely from one visualization type to the next and are unsupported. Databricks does not recommend modifying visualization options directly.
- serialized_query_plan: str | None = None¶
The visualization query plan varies widely from one visualization type to the next and is unsupported. Databricks does not recommend modifying the visualization query plan directly.
- type: str | None = None¶
The type of visualization: counter, table, funnel, and so on.
- as_dict() dict¶
Serializes the CreateVisualizationRequestVisualization into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the CreateVisualizationRequestVisualization into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CreateVisualizationRequestVisualization¶
Deserializes the CreateVisualizationRequestVisualization from a dictionary.
- class databricks.sdk.service.sql.CreateWarehouseRequestWarehouseType¶
- CLASSIC = "CLASSIC"¶
- PRO = "PRO"¶
- TYPE_UNSPECIFIED = "TYPE_UNSPECIFIED"¶
- class databricks.sdk.service.sql.CreateWarehouseResponse(id: 'Optional[str]' = None)¶
- id: str | None = None¶
Id for the SQL warehouse. This value is unique across all SQL warehouses.
- as_dict() dict¶
Serializes the CreateWarehouseResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the CreateWarehouseResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CreateWarehouseResponse¶
Deserializes the CreateWarehouseResponse from a dictionary.
- class databricks.sdk.service.sql.CronSchedule(quartz_cron_schedule: 'str', timezone_id: 'str', pause_status: 'Optional[SchedulePauseStatus]' = None)¶
- quartz_cron_schedule: str¶
A cron expression using Quartz syntax that specifies the schedule for this alert. Use the Quartz format described here: http://www.quartz-scheduler.org/documentation/quartz-2.1.7/tutorials/tutorial-lesson-06.html
- timezone_id: str¶
A Java timezone id. The schedule will be resolved using this timezone. This will be combined with the quartz_cron_schedule to determine the schedule. See https://docs.databricks.com/sql/language-manual/sql-ref-syntax-aux-conf-mgmt-set-timezone.html for details.
- pause_status: SchedulePauseStatus | None = None¶
Indicate whether this schedule is paused or not.
- as_dict() dict¶
Serializes the CronSchedule into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the CronSchedule into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) CronSchedule¶
Deserializes the CronSchedule from a dictionary.
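quartz_cron_schedule uses Quartz syntax, which has six or seven space-separated fields beginning with seconds (unlike classic five-field Unix cron). A small sketch of splitting an expression into its named fields, with field names taken from the Quartz tutorial linked above:

```python
# Quartz cron fields, in order. The seventh (year) field is optional.
QUARTZ_FIELDS = ["seconds", "minutes", "hours", "day_of_month",
                 "month", "day_of_week", "year"]


def describe_quartz(expr: str) -> dict:
    """Map each field of a Quartz cron expression to its name."""
    parts = expr.split()
    if len(parts) not in (6, 7):
        raise ValueError("Quartz expressions have 6 or 7 fields")
    return dict(zip(QUARTZ_FIELDS, parts))


# "At 09:00:00 every day" — note the '?' in day_of_week, which Quartz
# requires when day_of_month is specified.
fields = describe_quartz("0 0 9 * * ?")
assert fields["hours"] == "9"
```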
- class databricks.sdk.service.sql.Dashboard(can_edit: bool | None = None, created_at: str | None = None, dashboard_filters_enabled: bool | None = None, id: str | None = None, is_archived: bool | None = None, is_draft: bool | None = None, is_favorite: bool | None = None, name: str | None = None, options: DashboardOptions | None = None, parent: str | None = None, permission_tier: PermissionLevel | None = None, slug: str | None = None, tags: List[str] | None = None, updated_at: str | None = None, user: User | None = None, user_id: int | None = None, widgets: List[Widget] | None = None)¶
A JSON representing a dashboard containing widgets of visualizations and text boxes.
- can_edit: bool | None = None¶
Whether the authenticated user can edit the query definition.
- created_at: str | None = None¶
Timestamp when this dashboard was created.
- dashboard_filters_enabled: bool | None = None¶
In the web application, query filters that share a name are coupled to a single selection box if this value is true.
- id: str | None = None¶
The ID for this dashboard.
- is_archived: bool | None = None¶
Indicates whether a dashboard is trashed. Trashed dashboards won’t appear in list views. If this boolean is true, the options property for this dashboard includes a moved_to_trash_at timestamp. Items in trash are permanently deleted after 30 days.
- is_draft: bool | None = None¶
Whether a dashboard is a draft. Draft dashboards only appear in list views for their owners.
- is_favorite: bool | None = None¶
Indicates whether this query object appears in the current user’s favorites list. This flag determines whether the star icon for favorites is selected.
- name: str | None = None¶
The title of the dashboard that appears in list views and at the top of the dashboard page.
- options: DashboardOptions | None = None¶
- parent: str | None = None¶
The identifier of the workspace folder containing the object.
- permission_tier: PermissionLevel | None = None¶
CAN_VIEW: Can view the query * CAN_RUN: Can run the query * CAN_EDIT: Can edit the query
CAN_MANAGE: Can manage the query
- slug: str | None = None¶
URL slug. Usually mirrors the query name with dashes (-) instead of spaces. Appears in the URL for this query.
- tags: List[str] | None = None¶
- updated_at: str | None = None¶
Timestamp when this dashboard was last updated.
- user_id: int | None = None¶
The ID of the user who owns the dashboard.
- as_dict() dict¶
Serializes the Dashboard into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the Dashboard into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.sql.DashboardOptions(moved_to_trash_at: 'Optional[str]' = None)¶
- moved_to_trash_at: str | None = None¶
The timestamp when this dashboard was moved to trash. Only present when the is_archived property is true. Trashed items are deleted after thirty days.
- as_dict() dict¶
Serializes the DashboardOptions into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the DashboardOptions into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) DashboardOptions¶
Deserializes the DashboardOptions from a dictionary.
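Because moved_to_trash_at is only present when is_archived is true, and trashed items are deleted after thirty days, the permanent-deletion deadline can be derived client-side. A sketch, assuming the timestamp is ISO 8601:

```python
from datetime import datetime, timedelta


def purge_deadline(moved_to_trash_at: str) -> datetime:
    """Thirty days after the dashboard was trashed, per the docs above."""
    trashed = datetime.fromisoformat(moved_to_trash_at)
    return trashed + timedelta(days=30)


deadline = purge_deadline("2024-03-01T12:00:00")
assert deadline == datetime(2024, 3, 31, 12, 0, 0)
```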
- class databricks.sdk.service.sql.DataSource(id: str | None = None, name: str | None = None, pause_reason: str | None = None, paused: int | None = None, supports_auto_limit: bool | None = None, syntax: str | None = None, type: str | None = None, view_only: bool | None = None, warehouse_id: str | None = None)¶
A JSON object representing a DBSQL data source / SQL warehouse.
- id: str | None = None¶
Data source ID maps to the ID of the data source used by the resource and is distinct from the warehouse ID. [Learn more]
[Learn more]: https://docs.databricks.com/api/workspace/datasources/list
- name: str | None = None¶
The string name of this data source / SQL warehouse as it appears in the Databricks SQL web application.
- pause_reason: str | None = None¶
Reserved for internal use.
- paused: int | None = None¶
Reserved for internal use.
- supports_auto_limit: bool | None = None¶
Reserved for internal use.
- syntax: str | None = None¶
Reserved for internal use.
- type: str | None = None¶
The type of data source. For SQL warehouses, this will be databricks_internal.
- view_only: bool | None = None¶
Reserved for internal use.
- warehouse_id: str | None = None¶
The ID of the associated SQL warehouse, if this data source is backed by a SQL warehouse.
- as_dict() dict¶
Serializes the DataSource into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the DataSource into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) DataSource¶
Deserializes the DataSource from a dictionary.
- class databricks.sdk.service.sql.DatePrecision¶
- DAY_PRECISION = "DAY_PRECISION"¶
- MINUTE_PRECISION = "MINUTE_PRECISION"¶
- SECOND_PRECISION = "SECOND_PRECISION"¶
- class databricks.sdk.service.sql.DateRange(start: 'str', end: 'str')¶
- start: str¶
- end: str¶
- as_dict() dict¶
Serializes the DateRange into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the DateRange into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.sql.DateRangeValue(date_range_value: 'Optional[DateRange]' = None, dynamic_date_range_value: 'Optional[DateRangeValueDynamicDateRange]' = None, precision: 'Optional[DatePrecision]' = None, start_day_of_week: 'Optional[int]' = None)¶
- date_range_value: DateRange | None = None¶
Manually specified date-time range value.
- dynamic_date_range_value: DateRangeValueDynamicDateRange | None = None¶
Dynamic date-time range value based on current date-time.
- precision: DatePrecision | None = None¶
Date-time precision to format the value into when the query is run. Defaults to DAY_PRECISION (YYYY-MM-DD).
- start_day_of_week: int | None = None¶
- as_dict() dict¶
Serializes the DateRangeValue into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the DateRangeValue into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) DateRangeValue¶
Deserializes the DateRangeValue from a dictionary.
- class databricks.sdk.service.sql.DateRangeValueDynamicDateRange¶
- LAST_12_MONTHS = "LAST_12_MONTHS"¶
- LAST_14_DAYS = "LAST_14_DAYS"¶
- LAST_24_HOURS = "LAST_24_HOURS"¶
- LAST_30_DAYS = "LAST_30_DAYS"¶
- LAST_60_DAYS = "LAST_60_DAYS"¶
- LAST_7_DAYS = "LAST_7_DAYS"¶
- LAST_8_HOURS = "LAST_8_HOURS"¶
- LAST_90_DAYS = "LAST_90_DAYS"¶
- LAST_HOUR = "LAST_HOUR"¶
- LAST_MONTH = "LAST_MONTH"¶
- LAST_WEEK = "LAST_WEEK"¶
- LAST_YEAR = "LAST_YEAR"¶
- THIS_MONTH = "THIS_MONTH"¶
- THIS_WEEK = "THIS_WEEK"¶
- THIS_YEAR = "THIS_YEAR"¶
- TODAY = "TODAY"¶
- YESTERDAY = "YESTERDAY"¶
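These dynamic values are resolved relative to the current date-time when the query runs. A sketch of how a few of them could be resolved client-side (the server's exact boundary semantics, such as the start of a week, may differ from this illustration):

```python
from datetime import datetime, timedelta


def resolve_dynamic_range(value: str, now: datetime) -> tuple:
    """Illustrative resolution for a few DateRangeValueDynamicDateRange values."""
    day = now.replace(hour=0, minute=0, second=0, microsecond=0)
    if value == "TODAY":
        return day, day + timedelta(days=1)
    if value == "YESTERDAY":
        return day - timedelta(days=1), day
    if value == "LAST_7_DAYS":
        return day - timedelta(days=7), now
    raise ValueError(f"unhandled dynamic range: {value}")


now = datetime(2024, 6, 15, 10, 30)
start, end = resolve_dynamic_range("YESTERDAY", now)
assert start == datetime(2024, 6, 14)
```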
- class databricks.sdk.service.sql.DateValue(date_value: 'Optional[str]' = None, dynamic_date_value: 'Optional[DateValueDynamicDate]' = None, precision: 'Optional[DatePrecision]' = None)¶
- date_value: str | None = None¶
Manually specified date-time value.
- dynamic_date_value: DateValueDynamicDate | None = None¶
Dynamic date-time value based on current date-time.
- precision: DatePrecision | None = None¶
Date-time precision to format the value into when the query is run. Defaults to DAY_PRECISION (YYYY-MM-DD).
- as_dict() dict¶
Serializes the DateValue into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the DateValue into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.sql.DefaultWarehouseOverride(type: DefaultWarehouseOverrideType, default_warehouse_override_id: str | None = None, name: str | None = None, warehouse_id: str | None = None)¶
Represents a per-user default warehouse override configuration. This resource allows users or administrators to customize how a user’s default warehouse is selected for SQL operations. If no override exists for a user, the workspace default warehouse will be used.
- type: DefaultWarehouseOverrideType¶
The type of override behavior.
- default_warehouse_override_id: str | None = None¶
The ID component of the resource name (user ID).
- name: str | None = None¶
The resource name of the default warehouse override. Format: default-warehouse-overrides/{default_warehouse_override_id}
- warehouse_id: str | None = None¶
The specific warehouse ID when type is CUSTOM. Not set for LAST_SELECTED type.
- as_dict() dict¶
Serializes the DefaultWarehouseOverride into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the DefaultWarehouseOverride into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) DefaultWarehouseOverride¶
Deserializes the DefaultWarehouseOverride from a dictionary.
- class databricks.sdk.service.sql.DefaultWarehouseOverrideType¶
Type of default warehouse override behavior.
- CUSTOM = "CUSTOM"¶
- LAST_SELECTED = "LAST_SELECTED"¶
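Per the field docs above, warehouse_id must be set when type is CUSTOM and left unset for LAST_SELECTED. A client-side consistency check sketching that pairing (the server may enforce additional rules):

```python
from typing import Optional


def check_override(override_type: str, warehouse_id: Optional[str]) -> None:
    """Validate the warehouse_id / type pairing described above."""
    if override_type == "CUSTOM" and not warehouse_id:
        raise ValueError("CUSTOM overrides require a warehouse_id")
    if override_type == "LAST_SELECTED" and warehouse_id:
        raise ValueError("LAST_SELECTED overrides must not set warehouse_id")


check_override("CUSTOM", "abc123")      # valid pairing
check_override("LAST_SELECTED", None)   # valid pairing
```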
- class databricks.sdk.service.sql.DeleteResponse¶
- as_dict() dict¶
Serializes the DeleteResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the DeleteResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) DeleteResponse¶
Deserializes the DeleteResponse from a dictionary.
- class databricks.sdk.service.sql.DeleteWarehouseResponse¶
- as_dict() dict¶
Serializes the DeleteWarehouseResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the DeleteWarehouseResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) DeleteWarehouseResponse¶
Deserializes the DeleteWarehouseResponse from a dictionary.
- class databricks.sdk.service.sql.EditWarehouseRequestWarehouseType¶
- CLASSIC = "CLASSIC"¶
- PRO = "PRO"¶
- TYPE_UNSPECIFIED = "TYPE_UNSPECIFIED"¶
- class databricks.sdk.service.sql.EditWarehouseResponse¶
- as_dict() dict¶
Serializes the EditWarehouseResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the EditWarehouseResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) EditWarehouseResponse¶
Deserializes the EditWarehouseResponse from a dictionary.
- class databricks.sdk.service.sql.Empty¶
Represents an empty message, similar to google.protobuf.Empty.
- as_dict() dict¶
Serializes the Empty into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the Empty into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.sql.EndpointConfPair(key: 'Optional[str]' = None, value: 'Optional[str]' = None)¶
- key: str | None = None¶
- value: str | None = None¶
- as_dict() dict¶
Serializes the EndpointConfPair into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the EndpointConfPair into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) EndpointConfPair¶
Deserializes the EndpointConfPair from a dictionary.
- class databricks.sdk.service.sql.EndpointHealth(details: 'Optional[str]' = None, failure_reason: 'Optional[TerminationReason]' = None, message: 'Optional[str]' = None, status: 'Optional[Status]' = None, summary: 'Optional[str]' = None)¶
- details: str | None = None¶
Details about errors that are causing current degraded/failed status.
- failure_reason: TerminationReason | None = None¶
The reason for failure to bring up clusters for this warehouse. This is available when status is ‘FAILED’ and sometimes when it is DEGRADED.
- message: str | None = None¶
Deprecated. Split into summary and details for security.
- summary: str | None = None¶
A short summary of the health status in case of degraded/failed warehouses.
- as_dict() dict¶
Serializes the EndpointHealth into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the EndpointHealth into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) EndpointHealth¶
Deserializes the EndpointHealth from a dictionary.
- class databricks.sdk.service.sql.EndpointInfo(auto_stop_mins: 'Optional[int]' = None, channel: 'Optional[Channel]' = None, cluster_size: 'Optional[str]' = None, creator_name: 'Optional[str]' = None, enable_photon: 'Optional[bool]' = None, enable_serverless_compute: 'Optional[bool]' = None, health: 'Optional[EndpointHealth]' = None, id: 'Optional[str]' = None, instance_profile_arn: 'Optional[str]' = None, jdbc_url: 'Optional[str]' = None, max_num_clusters: 'Optional[int]' = None, min_num_clusters: 'Optional[int]' = None, name: 'Optional[str]' = None, num_active_sessions: 'Optional[int]' = None, num_clusters: 'Optional[int]' = None, odbc_params: 'Optional[OdbcParams]' = None, spot_instance_policy: 'Optional[SpotInstancePolicy]' = None, state: 'Optional[State]' = None, tags: 'Optional[EndpointTags]' = None, warehouse_type: 'Optional[EndpointInfoWarehouseType]' = None)¶
- auto_stop_mins: int | None = None¶
The amount of time in minutes that a SQL warehouse must be idle (i.e., no RUNNING queries) before it is automatically stopped.
Supported values: - Must be == 0 or >= 10 mins - 0 indicates no autostop.
Defaults to 120 mins
- cluster_size: str | None = None¶
Size of the clusters allocated for this warehouse. Increasing the size of a spark cluster allows you to run larger queries on it. If you want to increase the number of concurrent queries, please tune max_num_clusters.
Supported values: - 2X-Small - X-Small - Small - Medium - Large - X-Large - 2X-Large - 3X-Large - 4X-Large - 5X-Large
- creator_name: str | None = None¶
warehouse creator name
- enable_photon: bool | None = None¶
Configures whether the warehouse should use Photon optimized clusters.
Defaults to true.
- enable_serverless_compute: bool | None = None¶
Configures whether the warehouse should use serverless compute
- health: EndpointHealth | None = None¶
Optional health status. Assume the warehouse is healthy if this field is not set.
- id: str | None = None¶
unique identifier for warehouse
- instance_profile_arn: str | None = None¶
Deprecated. Instance profile used to pass IAM role to the cluster
- jdbc_url: str | None = None¶
the jdbc connection string for this warehouse
- max_num_clusters: int | None = None¶
Maximum number of clusters that the autoscaler will create to handle concurrent queries.
Supported values: - Must be >= min_num_clusters - Must be <= 40.
Defaults to min_num_clusters if unset.
- min_num_clusters: int | None = None¶
Minimum number of available clusters that will be maintained for this SQL warehouse. Increasing this will ensure that a larger number of clusters are always running and therefore may reduce the cold start time for new queries. This is similar to reserved vs. revocable cores in a resource manager.
Supported values: - Must be > 0 - Must be <= min(max_num_clusters, 30)
Defaults to 1
- name: str | None = None¶
Logical name for the cluster.
Supported values: - Must be unique within an org. - Must be less than 100 characters.
- num_active_sessions: int | None = None¶
Deprecated. current number of active sessions for the warehouse
- num_clusters: int | None = None¶
current number of clusters running for the service
- odbc_params: OdbcParams | None = None¶
ODBC parameters for the SQL warehouse
- spot_instance_policy: SpotInstancePolicy | None = None¶
Configures whether the endpoint should use spot instances.
- tags: EndpointTags | None = None¶
A set of key-value pairs that will be tagged on all resources (e.g., AWS instances and EBS volumes) associated with this SQL warehouse.
Supported values: - Number of tags < 45.
- warehouse_type: EndpointInfoWarehouseType | None = None¶
Warehouse type: PRO or CLASSIC. If you want to use serverless compute, you must set to PRO and also set the field enable_serverless_compute to true.
- as_dict() dict¶
Serializes the EndpointInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the EndpointInfo into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) EndpointInfo¶
Deserializes the EndpointInfo from a dictionary.
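The supported-value notes above (auto_stop_mins equal to 0 or at least 10, min_num_clusters between 1 and min(max_num_clusters, 30), max_num_clusters at most 40) can be checked client-side before calling the API. A sketch of those documented constraints:

```python
def validate_warehouse_sizing(auto_stop_mins: int,
                              min_num_clusters: int,
                              max_num_clusters: int) -> list:
    """Return the documented constraint violations, if any."""
    errors = []
    if auto_stop_mins != 0 and auto_stop_mins < 10:
        errors.append("auto_stop_mins must be 0 (no autostop) or >= 10")
    if not (0 < min_num_clusters <= min(max_num_clusters, 30)):
        errors.append("min_num_clusters must be > 0 and <= min(max_num_clusters, 30)")
    if not (min_num_clusters <= max_num_clusters <= 40):
        errors.append("max_num_clusters must be >= min_num_clusters and <= 40")
    return errors


assert validate_warehouse_sizing(120, 1, 5) == []   # the documented defaults pass
assert validate_warehouse_sizing(5, 1, 5) != []     # autostop below 10 minutes fails
```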
- class databricks.sdk.service.sql.EndpointInfoWarehouseType¶
- CLASSIC = "CLASSIC"¶
- PRO = "PRO"¶
- TYPE_UNSPECIFIED = "TYPE_UNSPECIFIED"¶
- class databricks.sdk.service.sql.EndpointTagPair(key: 'Optional[str]' = None, value: 'Optional[str]' = None)¶
- key: str | None = None¶
- value: str | None = None¶
- as_dict() dict¶
Serializes the EndpointTagPair into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the EndpointTagPair into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) EndpointTagPair¶
Deserializes the EndpointTagPair from a dictionary.
- class databricks.sdk.service.sql.EndpointTags(custom_tags: 'Optional[List[EndpointTagPair]]' = None)¶
- custom_tags: List[EndpointTagPair] | None = None¶
- as_dict() dict¶
Serializes the EndpointTags into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the EndpointTags into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) EndpointTags¶
Deserializes the EndpointTags from a dictionary.
- class databricks.sdk.service.sql.EnumValue(enum_options: 'Optional[str]' = None, multi_values_options: 'Optional[MultiValuesOptions]' = None, values: 'Optional[List[str]]' = None)¶
- enum_options: str | None = None¶
List of valid query parameter values, newline delimited.
- multi_values_options: MultiValuesOptions | None = None¶
If specified, allows multiple values to be selected for this parameter.
- values: List[str] | None = None¶
List of selected query parameter values.
- as_dict() dict¶
Serializes the EnumValue into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the EnumValue into a shallow dictionary of its immediate attributes.
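enum_options is a newline-delimited list of allowed values and values holds the current selection, so the two can be validated against each other. A sketch (whether the backend trims surrounding whitespace is an assumption here):

```python
def valid_selection(enum_options: str, values: list) -> bool:
    """True when every selected value appears in the newline-delimited options."""
    allowed = {line.strip() for line in enum_options.splitlines() if line.strip()}
    return all(v in allowed for v in values)


options = "small\nmedium\nlarge"
assert valid_selection(options, ["medium"])
assert not valid_selection(options, ["x-large"])
```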
- class databricks.sdk.service.sql.ExecuteStatementRequestOnWaitTimeout¶
When wait_timeout > 0s, the call will block up to the specified time. If the statement execution doesn’t finish within this time, on_wait_timeout determines whether the execution should continue or be canceled. When set to CONTINUE, the statement execution continues asynchronously and the call returns a statement ID which can be used for polling with :method:statementexecution/getStatement. When set to CANCEL, the statement execution is canceled and the call returns with a CANCELED state.
- CANCEL = "CANCEL"¶
- CONTINUE = "CONTINUE"¶
- class databricks.sdk.service.sql.ExternalLink(byte_count: 'Optional[int]' = None, chunk_index: 'Optional[int]' = None, expiration: 'Optional[str]' = None, external_link: 'Optional[str]' = None, http_headers: 'Optional[Dict[str, str]]' = None, next_chunk_index: 'Optional[int]' = None, next_chunk_internal_link: 'Optional[str]' = None, row_count: 'Optional[int]' = None, row_offset: 'Optional[int]' = None)¶
- byte_count: int | None = None¶
The number of bytes in the result chunk. This field is not available when using INLINE disposition.
- chunk_index: int | None = None¶
The position within the sequence of result set chunks.
- expiration: str | None = None¶
Indicates the date-time at which the given external link will expire and become invalid, after which point a new external_link must be requested.
- external_link: str | None = None¶
A URL pointing to a chunk of result data, hosted by an external service, with a short expiration time (<= 15 minutes). As this URL contains a temporary credential, it should be considered sensitive and the client should not expose this URL in a log.
- http_headers: Dict[str, str] | None = None¶
HTTP headers that must be included with a GET request to the external_link. Each header is provided as a key-value pair. Headers are typically used to pass a decryption key to the external service. The values of these headers should be considered sensitive and the client should not expose these values in a log.
- next_chunk_index: int | None = None¶
When fetching, provides the chunk_index for the _next_ chunk. If absent, indicates there are no more chunks. The next chunk can be fetched with a :method:statementexecution/getstatementresultchunkn request.
- next_chunk_internal_link: str | None = None¶
When fetching, provides a link to fetch the _next_ chunk. If absent, indicates there are no more chunks. This link is an absolute path to be joined with your $DATABRICKS_HOST, and should be treated as an opaque link. This is an alternative to using next_chunk_index.
- row_count: int | None = None¶
The number of rows within the result chunk.
- row_offset: int | None = None¶
The starting row offset within the result set.
- as_dict() dict¶
Serializes the ExternalLink into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ExternalLink into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ExternalLink¶
Deserializes the ExternalLink from a dictionary.
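next_chunk_index and next_chunk_internal_link turn the result set into a linked sequence of chunks. A sketch of walking that sequence, using in-memory dicts in place of real API responses (a real client would issue one getStatementResultChunkN request per index):

```python
# Fake chunk responses keyed by chunk_index; in a real client each would be
# fetched from the statement execution API.
CHUNKS = {
    0: {"row_offset": 0, "row_count": 2, "data": [[1], [2]], "next_chunk_index": 1},
    1: {"row_offset": 2, "row_count": 1, "data": [[3]], "next_chunk_index": None},
}


def collect_rows(fetch, first_index: int = 0) -> list:
    """Follow next_chunk_index until it is absent, accumulating rows."""
    rows, index = [], first_index
    while index is not None:
        chunk = fetch(index)
        rows.extend(chunk["data"])
        index = chunk.get("next_chunk_index")
    return rows


rows = collect_rows(CHUNKS.__getitem__)
assert rows == [[1], [2], [3]]
```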
- class databricks.sdk.service.sql.ExternalQuerySource(alert_id: 'Optional[str]' = None, dashboard_id: 'Optional[str]' = None, genie_space_id: 'Optional[str]' = None, job_info: 'Optional[ExternalQuerySourceJobInfo]' = None, legacy_dashboard_id: 'Optional[str]' = None, notebook_id: 'Optional[str]' = None, sql_query_id: 'Optional[str]' = None)¶
- alert_id: str | None = None¶
The canonical identifier for this SQL alert
- dashboard_id: str | None = None¶
The canonical identifier for this Lakeview dashboard
- genie_space_id: str | None = None¶
The canonical identifier for this Genie space
- job_info: ExternalQuerySourceJobInfo | None = None¶
- legacy_dashboard_id: str | None = None¶
The canonical identifier for this legacy dashboard
- notebook_id: str | None = None¶
The canonical identifier for this notebook
- sql_query_id: str | None = None¶
The canonical identifier for this SQL query
- as_dict() dict¶
Serializes the ExternalQuerySource into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ExternalQuerySource into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ExternalQuerySource¶
Deserializes the ExternalQuerySource from a dictionary.
- class databricks.sdk.service.sql.ExternalQuerySourceJobInfo(job_id: 'Optional[str]' = None, job_run_id: 'Optional[str]' = None, job_task_run_id: 'Optional[str]' = None)¶
- job_id: str | None = None¶
The canonical identifier for this job.
- job_run_id: str | None = None¶
The canonical identifier of the run. This ID is unique across all runs of all jobs.
- job_task_run_id: str | None = None¶
The canonical identifier of the task run.
- as_dict() dict¶
Serializes the ExternalQuerySourceJobInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ExternalQuerySourceJobInfo into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ExternalQuerySourceJobInfo¶
Deserializes the ExternalQuerySourceJobInfo from a dictionary.
- class databricks.sdk.service.sql.Format¶
- ARROW_STREAM = "ARROW_STREAM"¶
- CSV = "CSV"¶
- JSON_ARRAY = "JSON_ARRAY"¶
- class databricks.sdk.service.sql.GetResponse(access_control_list: 'Optional[List[AccessControl]]' = None, object_id: 'Optional[str]' = None, object_type: 'Optional[ObjectType]' = None)¶
- access_control_list: List[AccessControl] | None = None¶
- object_id: str | None = None¶
An object’s type and UUID, separated by a forward slash (/) character.
- object_type: ObjectType | None = None¶
A singular noun object type.
- as_dict() dict¶
Serializes the GetResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the GetResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) GetResponse¶
Deserializes the GetResponse from a dictionary.
- class databricks.sdk.service.sql.GetWarehousePermissionLevelsResponse(permission_levels: 'Optional[List[WarehousePermissionsDescription]]' = None)¶
- permission_levels: List[WarehousePermissionsDescription] | None = None¶
Specific permission levels
- as_dict() dict¶
Serializes the GetWarehousePermissionLevelsResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the GetWarehousePermissionLevelsResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) GetWarehousePermissionLevelsResponse¶
Deserializes the GetWarehousePermissionLevelsResponse from a dictionary.
- class databricks.sdk.service.sql.GetWarehouseResponse(auto_stop_mins: 'Optional[int]' = None, channel: 'Optional[Channel]' = None, cluster_size: 'Optional[str]' = None, creator_name: 'Optional[str]' = None, enable_photon: 'Optional[bool]' = None, enable_serverless_compute: 'Optional[bool]' = None, health: 'Optional[EndpointHealth]' = None, id: 'Optional[str]' = None, instance_profile_arn: 'Optional[str]' = None, jdbc_url: 'Optional[str]' = None, max_num_clusters: 'Optional[int]' = None, min_num_clusters: 'Optional[int]' = None, name: 'Optional[str]' = None, num_active_sessions: 'Optional[int]' = None, num_clusters: 'Optional[int]' = None, odbc_params: 'Optional[OdbcParams]' = None, spot_instance_policy: 'Optional[SpotInstancePolicy]' = None, state: 'Optional[State]' = None, tags: 'Optional[EndpointTags]' = None, warehouse_type: 'Optional[GetWarehouseResponseWarehouseType]' = None)¶
- auto_stop_mins: int | None = None¶
The amount of time in minutes that a SQL warehouse must be idle (i.e., no RUNNING queries) before it is automatically stopped.
Supported values: - Must be == 0 or >= 10 mins - 0 indicates no autostop.
Defaults to 120 mins
- cluster_size: str | None = None¶
Size of the clusters allocated for this warehouse. Increasing the size of a spark cluster allows you to run larger queries on it. If you want to increase the number of concurrent queries, please tune max_num_clusters.
Supported values: - 2X-Small - X-Small - Small - Medium - Large - X-Large - 2X-Large - 3X-Large - 4X-Large - 5X-Large
- creator_name: str | None = None¶
warehouse creator name
- enable_photon: bool | None = None¶
Configures whether the warehouse should use Photon optimized clusters.
Defaults to true.
- enable_serverless_compute: bool | None = None¶
Configures whether the warehouse should use serverless compute
- health: EndpointHealth | None = None¶
Optional health status. Assume the warehouse is healthy if this field is not set.
- id: str | None = None¶
unique identifier for warehouse
- instance_profile_arn: str | None = None¶
Deprecated. Instance profile used to pass IAM role to the cluster
- jdbc_url: str | None = None¶
the jdbc connection string for this warehouse
- max_num_clusters: int | None = None¶
Maximum number of clusters that the autoscaler will create to handle concurrent queries.
Supported values: - Must be >= min_num_clusters - Must be <= 40.
Defaults to min_num_clusters if unset.
- min_num_clusters: int | None = None¶
Minimum number of available clusters that will be maintained for this SQL warehouse. Increasing this will ensure that a larger number of clusters are always running and therefore may reduce the cold start time for new queries. This is similar to reserved vs. revocable cores in a resource manager.
Supported values: - Must be > 0 - Must be <= min(max_num_clusters, 30)
Defaults to 1
- name: str | None = None¶
Logical name for the cluster.
Supported values: - Must be unique within an org. - Must be less than 100 characters.
- num_active_sessions: int | None = None¶
Deprecated. current number of active sessions for the warehouse
- num_clusters: int | None = None¶
current number of clusters running for the service
- odbc_params: OdbcParams | None = None¶
ODBC parameters for the SQL warehouse
- spot_instance_policy: SpotInstancePolicy | None = None¶
Configures whether the endpoint should use spot instances.
- tags: EndpointTags | None = None¶
A set of key-value pairs that will be tagged on all resources (e.g., AWS instances and EBS volumes) associated with this SQL warehouse.
Supported values: - Number of tags < 45.
- warehouse_type: GetWarehouseResponseWarehouseType | None = None¶
Warehouse type: PRO or CLASSIC. If you want to use serverless compute, you must set to PRO and also set the field enable_serverless_compute to true.
- as_dict() dict¶
Serializes the GetWarehouseResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the GetWarehouseResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) GetWarehouseResponse¶
Deserializes the GetWarehouseResponse from a dictionary.
- class databricks.sdk.service.sql.GetWarehouseResponseWarehouseType¶
- CLASSIC = "CLASSIC"¶
- PRO = "PRO"¶
- TYPE_UNSPECIFIED = "TYPE_UNSPECIFIED"¶
- class databricks.sdk.service.sql.GetWorkspaceWarehouseConfigResponse(channel: 'Optional[Channel]' = None, config_param: 'Optional[RepeatedEndpointConfPairs]' = None, data_access_config: 'Optional[List[EndpointConfPair]]' = None, enable_serverless_compute: 'Optional[bool]' = None, enabled_warehouse_types: 'Optional[List[WarehouseTypePair]]' = None, global_param: 'Optional[RepeatedEndpointConfPairs]' = None, google_service_account: 'Optional[str]' = None, instance_profile_arn: 'Optional[str]' = None, security_policy: 'Optional[GetWorkspaceWarehouseConfigResponseSecurityPolicy]' = None, sql_configuration_parameters: 'Optional[RepeatedEndpointConfPairs]' = None)¶
- channel: Channel | None = None¶
- config_param: RepeatedEndpointConfPairs | None = None¶
Deprecated: Use sql_configuration_parameters
- data_access_config: List[EndpointConfPair] | None = None¶
Spark confs for external hive metastore configuration. The JSON-serialized size must be <= 512K.
- enable_serverless_compute: bool | None = None¶
Deprecated: only setting this to true is allowed.
- enabled_warehouse_types: List[WarehouseTypePair] | None = None¶
List of warehouse types allowed in this workspace (limits the allowed value of the type field in CreateWarehouse and EditWarehouse). Note: Some types cannot be disabled; they don’t need to be specified in SetWorkspaceWarehouseConfig. Note: Disabling a type may cause existing warehouses to be converted to another type. Used by the frontend to save specific type availability in the warehouse create and edit form UI.
- global_param: RepeatedEndpointConfPairs | None = None¶
Deprecated: Use sql_configuration_parameters
- google_service_account: str | None = None¶
GCP only: Google Service Account used to pass to cluster to access Google Cloud Storage
- instance_profile_arn: str | None = None¶
AWS Only: The instance profile used to pass an IAM role to the SQL warehouses. This configuration is also applied to the workspace’s serverless compute for notebooks and jobs.
- security_policy: GetWorkspaceWarehouseConfigResponseSecurityPolicy | None = None¶
Security policy for warehouses
- sql_configuration_parameters: RepeatedEndpointConfPairs | None = None¶
SQL configuration parameters
- as_dict() dict¶
Serializes the GetWorkspaceWarehouseConfigResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the GetWorkspaceWarehouseConfigResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) GetWorkspaceWarehouseConfigResponse¶
Deserializes the GetWorkspaceWarehouseConfigResponse from a dictionary.
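The as_dict / from_dict pair follows the same pattern across all of these dataclasses: None-valued fields are omitted from the serialized body, and from_dict tolerates missing keys. A minimal pure-Python sketch of that round-trip behavior (WarehouseConfigSketch is a hypothetical stand-in carrying only three of the real response's fields, not the SDK class itself):

```python
from dataclasses import dataclass
from typing import Any, Dict, Optional

@dataclass
class WarehouseConfigSketch:
    """Hypothetical stand-in mirroring the SDK's as_dict/from_dict pattern."""
    google_service_account: Optional[str] = None
    instance_profile_arn: Optional[str] = None
    security_policy: Optional[str] = None

    def as_dict(self) -> Dict[str, Any]:
        # None-valued fields are omitted, matching the SDK's JSON request bodies.
        body: Dict[str, Any] = {}
        if self.google_service_account is not None:
            body["google_service_account"] = self.google_service_account
        if self.instance_profile_arn is not None:
            body["instance_profile_arn"] = self.instance_profile_arn
        if self.security_policy is not None:
            body["security_policy"] = self.security_policy
        return body

    @classmethod
    def from_dict(cls, d: Dict[str, Any]) -> "WarehouseConfigSketch":
        # Missing keys simply become None, so partial payloads deserialize cleanly.
        return cls(
            google_service_account=d.get("google_service_account"),
            instance_profile_arn=d.get("instance_profile_arn"),
            security_policy=d.get("security_policy"),
        )

cfg = WarehouseConfigSketch.from_dict({"security_policy": "DATA_ACCESS_CONTROL"})
round_tripped = WarehouseConfigSketch.from_dict(cfg.as_dict())
```

The same shape applies to every dataclass on this page; only the field names change.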
- class databricks.sdk.service.sql.GetWorkspaceWarehouseConfigResponseSecurityPolicy¶
Security policy to be used for warehouses
- DATA_ACCESS_CONTROL = "DATA_ACCESS_CONTROL"¶
- NONE = "NONE"¶
- PASSTHROUGH = "PASSTHROUGH"¶
- class databricks.sdk.service.sql.LegacyAlert(created_at: 'Optional[str]' = None, id: 'Optional[str]' = None, last_triggered_at: 'Optional[str]' = None, name: 'Optional[str]' = None, options: 'Optional[AlertOptions]' = None, parent: 'Optional[str]' = None, query: 'Optional[AlertQuery]' = None, rearm: 'Optional[int]' = None, state: 'Optional[LegacyAlertState]' = None, updated_at: 'Optional[str]' = None, user: 'Optional[User]' = None)¶
- created_at: str | None = None¶
Timestamp when the alert was created.
- id: str | None = None¶
Alert ID.
- last_triggered_at: str | None = None¶
Timestamp when the alert was last triggered.
- name: str | None = None¶
Name of the alert.
- options: AlertOptions | None = None¶
Alert configuration options.
- parent: str | None = None¶
The identifier of the workspace folder containing the object.
- query: AlertQuery | None = None¶
- rearm: int | None = None¶
Number of seconds after being triggered before the alert rearms itself and can be triggered again. If null, the alert will never be triggered again.
- state: LegacyAlertState | None = None¶
State of the alert. Possible values are: unknown (yet to be evaluated), triggered (evaluated and fulfilled trigger conditions), or ok (evaluated and did not fulfill trigger conditions).
- updated_at: str | None = None¶
Timestamp when the alert was last updated.
- as_dict() dict¶
Serializes the LegacyAlert into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the LegacyAlert into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) LegacyAlert¶
Deserializes the LegacyAlert from a dictionary.
- class databricks.sdk.service.sql.LegacyAlertState¶
- OK = "OK"¶
- TRIGGERED = "TRIGGERED"¶
- UNKNOWN = "UNKNOWN"¶
- class databricks.sdk.service.sql.LegacyQuery(can_edit: 'Optional[bool]' = None, created_at: 'Optional[str]' = None, data_source_id: 'Optional[str]' = None, description: 'Optional[str]' = None, id: 'Optional[str]' = None, is_archived: 'Optional[bool]' = None, is_draft: 'Optional[bool]' = None, is_favorite: 'Optional[bool]' = None, is_safe: 'Optional[bool]' = None, last_modified_by: 'Optional[User]' = None, last_modified_by_id: 'Optional[int]' = None, latest_query_data_id: 'Optional[str]' = None, name: 'Optional[str]' = None, options: 'Optional[QueryOptions]' = None, parent: 'Optional[str]' = None, permission_tier: 'Optional[PermissionLevel]' = None, query: 'Optional[str]' = None, query_hash: 'Optional[str]' = None, run_as_role: 'Optional[RunAsRole]' = None, tags: 'Optional[List[str]]' = None, updated_at: 'Optional[str]' = None, user: 'Optional[User]' = None, user_id: 'Optional[int]' = None, visualizations: 'Optional[List[LegacyVisualization]]' = None)¶
- can_edit: bool | None = None¶
Describes whether the authenticated user is allowed to edit the definition of this query.
- created_at: str | None = None¶
The timestamp when this query was created.
- data_source_id: str | None = None¶
Data source ID maps to the ID of the data source used by the resource and is distinct from the warehouse ID. [Learn more]
[Learn more]: https://docs.databricks.com/api/workspace/datasources/list
- description: str | None = None¶
General description that conveys additional information about this query such as usage notes.
- id: str | None = None¶
Query ID.
- is_archived: bool | None = None¶
Indicates whether the query is trashed. Trashed queries can’t be used in dashboards, or appear in search results. If this boolean is true, the options property for this query includes a moved_to_trash_at timestamp. Trashed queries are permanently deleted after 30 days.
- is_draft: bool | None = None¶
Whether the query is a draft. Draft queries only appear in list views for their owners. Visualizations from draft queries cannot appear on dashboards.
- is_favorite: bool | None = None¶
Whether this query object appears in the current user’s favorites list. This flag determines whether the star icon for favorites is selected.
- is_safe: bool | None = None¶
Text parameter types are not safe from SQL injection for all types of data source. Set this Boolean parameter to true if a query either does not use any text type parameters or uses a data source type where text type parameters are handled safely.
- last_modified_by_id: int | None = None¶
The ID of the user who last saved changes to this query.
- latest_query_data_id: str | None = None¶
If there is a cached result for this query and user, this field includes the query result ID. If this query uses parameters, this field is always null.
- name: str | None = None¶
The title of this query that appears in list views, widget headings, and on the query page.
- options: QueryOptions | None = None¶
- parent: str | None = None¶
The identifier of the workspace folder containing the object.
- permission_tier: PermissionLevel | None = None¶
CAN_VIEW: Can view the query * CAN_RUN: Can run the query * CAN_EDIT: Can edit the query * CAN_MANAGE: Can manage the query
- query: str | None = None¶
The text of the query to be run.
- query_hash: str | None = None¶
A SHA-256 hash of the query text along with the authenticated user ID.
- run_as_role: RunAsRole | None = None¶
Sets the Run as role for the object. Must be set to one of “viewer” (signifying “run as viewer” behavior) or “owner” (signifying “run as owner” behavior)
- tags: List[str] | None = None¶
- updated_at: str | None = None¶
The timestamp at which this query was last updated.
- user_id: int | None = None¶
The ID of the user who owns the query.
- visualizations: List[LegacyVisualization] | None = None¶
- as_dict() dict¶
Serializes the LegacyQuery into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the LegacyQuery into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) LegacyQuery¶
Deserializes the LegacyQuery from a dictionary.
- class databricks.sdk.service.sql.LegacyVisualization(created_at: str | None = None, description: str | None = None, id: str | None = None, name: str | None = None, options: Any | None = None, query: LegacyQuery | None = None, type: str | None = None, updated_at: str | None = None)¶
The visualization description API changes frequently and is unsupported. You can duplicate a visualization by copying description objects received _from the API_ and then using them to create a new one with a POST request to the same endpoint. Databricks does not recommend constructing ad-hoc visualizations entirely in JSON.
- created_at: str | None = None¶
- description: str | None = None¶
A short description of this visualization. This is not displayed in the UI.
- id: str | None = None¶
The UUID for this visualization.
- name: str | None = None¶
The name of the visualization that appears on dashboards and the query screen.
- options: Any | None = None¶
The options object varies widely from one visualization type to the next and is unsupported. Databricks does not recommend modifying visualization settings in JSON.
- query: LegacyQuery | None = None¶
- type: str | None = None¶
The type of visualization: chart, table, pivot table, and so on.
- updated_at: str | None = None¶
- as_dict() dict¶
Serializes the LegacyVisualization into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the LegacyVisualization into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) LegacyVisualization¶
Deserializes the LegacyVisualization from a dictionary.
- class databricks.sdk.service.sql.ListAlertsResponse(next_page_token: 'Optional[str]' = None, results: 'Optional[List[ListAlertsResponseAlert]]' = None)¶
- next_page_token: str | None = None¶
- results: List[ListAlertsResponseAlert] | None = None¶
- as_dict() dict¶
Serializes the ListAlertsResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ListAlertsResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ListAlertsResponse¶
Deserializes the ListAlertsResponse from a dictionary.
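Responses like this one paginate with next_page_token: pass the token back on the next request until no token is returned. A pure-Python sketch of the loop, where fetch_page is a hypothetical stand-in for the actual list call (here backed by canned pages rather than the API):

```python
from typing import Dict, List, Optional, Tuple

# Canned pages standing in for the service; each entry maps a page token to
# (results, next_page_token). A missing next_page_token ends the listing.
_PAGES: Dict[Optional[str], Tuple[List[str], Optional[str]]] = {
    None: (["alert-1", "alert-2"], "tok-2"),
    "tok-2": (["alert-3"], None),
}

def fetch_page(page_token: Optional[str]) -> Tuple[List[str], Optional[str]]:
    """Hypothetical stand-in for one list request."""
    return _PAGES[page_token]

def list_all_alerts() -> List[str]:
    # Follow next_page_token until the service stops returning one.
    results: List[str] = []
    token: Optional[str] = None
    while True:
        page, token = fetch_page(token)
        results.extend(page)
        if not token:
            return results

all_alerts = list_all_alerts()
```

The identical loop works for ListWarehousesResponse, ListQueryObjectsResponse, and the other token-paginated responses on this page.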
- class databricks.sdk.service.sql.ListAlertsResponseAlert(condition: 'Optional[AlertCondition]' = None, create_time: 'Optional[str]' = None, custom_body: 'Optional[str]' = None, custom_subject: 'Optional[str]' = None, display_name: 'Optional[str]' = None, id: 'Optional[str]' = None, lifecycle_state: 'Optional[LifecycleState]' = None, notify_on_ok: 'Optional[bool]' = None, owner_user_name: 'Optional[str]' = None, query_id: 'Optional[str]' = None, seconds_to_retrigger: 'Optional[int]' = None, state: 'Optional[AlertState]' = None, trigger_time: 'Optional[str]' = None, update_time: 'Optional[str]' = None)¶
- condition: AlertCondition | None = None¶
Trigger conditions of the alert.
- create_time: str | None = None¶
The timestamp indicating when the alert was created.
- custom_body: str | None = None¶
Custom body of alert notification, if it exists. See [here] for custom templating instructions.
- custom_subject: str | None = None¶
Custom subject of alert notification, if it exists. This can include email subject entries and Slack notification headers, for example. See [here] for custom templating instructions.
- display_name: str | None = None¶
The display name of the alert.
- id: str | None = None¶
UUID identifying the alert.
- lifecycle_state: LifecycleState | None = None¶
The workspace state of the alert. Used for tracking trashed status.
- notify_on_ok: bool | None = None¶
Whether to notify alert subscribers when alert returns back to normal.
- owner_user_name: str | None = None¶
The owner’s username. This field is set to “Unavailable” if the user has been deleted.
- query_id: str | None = None¶
UUID of the query attached to the alert.
- seconds_to_retrigger: int | None = None¶
Number of seconds an alert must wait after being triggered to rearm itself. After rearming, it can be triggered again. If 0 or not specified, the alert will not be triggered again.
- state: AlertState | None = None¶
Current state of the alert’s trigger status. This field is set to UNKNOWN if the alert has not yet been evaluated or ran into an error during the last evaluation.
- trigger_time: str | None = None¶
Timestamp when the alert was last triggered, if the alert has been triggered before.
- update_time: str | None = None¶
The timestamp indicating when the alert was updated.
- as_dict() dict¶
Serializes the ListAlertsResponseAlert into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ListAlertsResponseAlert into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ListAlertsResponseAlert¶
Deserializes the ListAlertsResponseAlert from a dictionary.
- class databricks.sdk.service.sql.ListAlertsV2Response(alerts: 'Optional[List[AlertV2]]' = None, next_page_token: 'Optional[str]' = None)¶
- alerts: List[AlertV2] | None = None¶
- next_page_token: str | None = None¶
- as_dict() dict¶
Serializes the ListAlertsV2Response into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ListAlertsV2Response into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ListAlertsV2Response¶
Deserializes the ListAlertsV2Response from a dictionary.
- class databricks.sdk.service.sql.ListDefaultWarehouseOverridesResponse(default_warehouse_overrides: List[DefaultWarehouseOverride] | None = None, next_page_token: str | None = None)¶
Response message for ListDefaultWarehouseOverrides.
- default_warehouse_overrides: List[DefaultWarehouseOverride] | None = None¶
The default warehouse overrides in the workspace.
- next_page_token: str | None = None¶
A token, which can be sent as page_token to retrieve the next page. If this field is omitted, there are no subsequent pages.
- as_dict() dict¶
Serializes the ListDefaultWarehouseOverridesResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ListDefaultWarehouseOverridesResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ListDefaultWarehouseOverridesResponse¶
Deserializes the ListDefaultWarehouseOverridesResponse from a dictionary.
- class databricks.sdk.service.sql.ListQueriesResponse(has_next_page: 'Optional[bool]' = None, next_page_token: 'Optional[str]' = None, res: 'Optional[List[QueryInfo]]' = None)¶
- has_next_page: bool | None = None¶
Whether there is another page of results.
- next_page_token: str | None = None¶
A token that can be used to get the next page of results.
- as_dict() dict¶
Serializes the ListQueriesResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ListQueriesResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ListQueriesResponse¶
Deserializes the ListQueriesResponse from a dictionary.
- class databricks.sdk.service.sql.ListQueryObjectsResponse(next_page_token: 'Optional[str]' = None, results: 'Optional[List[ListQueryObjectsResponseQuery]]' = None)¶
- next_page_token: str | None = None¶
- results: List[ListQueryObjectsResponseQuery] | None = None¶
- as_dict() dict¶
Serializes the ListQueryObjectsResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ListQueryObjectsResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ListQueryObjectsResponse¶
Deserializes the ListQueryObjectsResponse from a dictionary.
- class databricks.sdk.service.sql.ListQueryObjectsResponseQuery(apply_auto_limit: 'Optional[bool]' = None, catalog: 'Optional[str]' = None, create_time: 'Optional[str]' = None, description: 'Optional[str]' = None, display_name: 'Optional[str]' = None, id: 'Optional[str]' = None, last_modifier_user_name: 'Optional[str]' = None, lifecycle_state: 'Optional[LifecycleState]' = None, owner_user_name: 'Optional[str]' = None, parameters: 'Optional[List[QueryParameter]]' = None, query_text: 'Optional[str]' = None, run_as_mode: 'Optional[RunAsMode]' = None, schema: 'Optional[str]' = None, tags: 'Optional[List[str]]' = None, update_time: 'Optional[str]' = None, warehouse_id: 'Optional[str]' = None)¶
- apply_auto_limit: bool | None = None¶
Whether to apply a 1000 row limit to the query result.
- catalog: str | None = None¶
Name of the catalog where this query will be executed.
- create_time: str | None = None¶
Timestamp when this query was created.
- description: str | None = None¶
General description that conveys additional information about this query such as usage notes.
- display_name: str | None = None¶
Display name of the query that appears in list views, widget headings, and on the query page.
- id: str | None = None¶
UUID identifying the query.
- last_modifier_user_name: str | None = None¶
Username of the user who last saved changes to this query.
- lifecycle_state: LifecycleState | None = None¶
Indicates whether the query is trashed.
- owner_user_name: str | None = None¶
Username of the user that owns the query.
- parameters: List[QueryParameter] | None = None¶
List of query parameter definitions.
- query_text: str | None = None¶
Text of the query to be run.
- schema: str | None = None¶
Name of the schema where this query will be executed.
- tags: List[str] | None = None¶
- update_time: str | None = None¶
Timestamp when this query was last updated.
- warehouse_id: str | None = None¶
ID of the SQL warehouse attached to the query.
- as_dict() dict¶
Serializes the ListQueryObjectsResponseQuery into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ListQueryObjectsResponseQuery into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ListQueryObjectsResponseQuery¶
Deserializes the ListQueryObjectsResponseQuery from a dictionary.
- class databricks.sdk.service.sql.ListResponse(count: 'Optional[int]' = None, page: 'Optional[int]' = None, page_size: 'Optional[int]' = None, results: 'Optional[List[Dashboard]]' = None)¶
- count: int | None = None¶
The total number of dashboards.
- page: int | None = None¶
The current page being displayed.
- page_size: int | None = None¶
The number of dashboards per page.
- as_dict() dict¶
Serializes the ListResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ListResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ListResponse¶
Deserializes the ListResponse from a dictionary.
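Unlike the token-paginated responses, ListResponse uses legacy offset pagination: count, page, and page_size together determine how many pages exist. A small sketch of that arithmetic (the helper name is illustrative, not part of the SDK):

```python
import math

def total_pages(count: int, page_size: int) -> int:
    # Number of pages needed to show `count` dashboards at `page_size` per page.
    # Guard against a zero page_size rather than dividing by zero.
    return math.ceil(count / page_size) if page_size else 0

pages = total_pages(105, 25)
```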
- class databricks.sdk.service.sql.ListVisualizationsForQueryResponse(next_page_token: 'Optional[str]' = None, results: 'Optional[List[Visualization]]' = None)¶
- next_page_token: str | None = None¶
- results: List[Visualization] | None = None¶
- as_dict() dict¶
Serializes the ListVisualizationsForQueryResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ListVisualizationsForQueryResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ListVisualizationsForQueryResponse¶
Deserializes the ListVisualizationsForQueryResponse from a dictionary.
- class databricks.sdk.service.sql.ListWarehousesResponse(next_page_token: 'Optional[str]' = None, warehouses: 'Optional[List[EndpointInfo]]' = None)¶
- next_page_token: str | None = None¶
A token, which can be sent as page_token to retrieve the next page. If this field is omitted, there are no subsequent pages.
- warehouses: List[EndpointInfo] | None = None¶
A list of warehouses and their configurations.
- as_dict() dict¶
Serializes the ListWarehousesResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ListWarehousesResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ListWarehousesResponse¶
Deserializes the ListWarehousesResponse from a dictionary.
- class databricks.sdk.service.sql.MultiValuesOptions(prefix: 'Optional[str]' = None, separator: 'Optional[str]' = None, suffix: 'Optional[str]' = None)¶
- prefix: str | None = None¶
Character that prefixes each selected parameter value.
- separator: str | None = None¶
Character that separates each selected parameter value. Defaults to a comma.
- suffix: str | None = None¶
Character that suffixes each selected parameter value.
- as_dict() dict¶
Serializes the MultiValuesOptions into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the MultiValuesOptions into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) MultiValuesOptions¶
Deserializes the MultiValuesOptions from a dictionary.
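Reading the three field descriptions together, a multi-value parameter is rendered by wrapping each selected value in the prefix and suffix and joining with the separator (comma by default). The sketch below assumes exactly that composition; the actual substitution happens server-side, and the dataclass here is a hypothetical stand-in, not the SDK class:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class MultiValuesOptionsSketch:
    """Hypothetical stand-in for MultiValuesOptions."""
    prefix: Optional[str] = None
    separator: Optional[str] = None  # defaults to a comma
    suffix: Optional[str] = None

def render_values(values: List[str], opts: MultiValuesOptionsSketch) -> str:
    # Wrap each selected value in prefix/suffix, then join with the separator.
    prefix = opts.prefix or ""
    suffix = opts.suffix or ""
    separator = opts.separator if opts.separator is not None else ","
    return separator.join(f"{prefix}{v}{suffix}" for v in values)

# Typical SQL IN-list quoting: prefix and suffix of a single quote.
rendered = render_values(["red", "green"], MultiValuesOptionsSketch(prefix="'", suffix="'"))
```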
- class databricks.sdk.service.sql.NumericValue(value: 'Optional[float]' = None)¶
- value: float | None = None¶
- as_dict() dict¶
Serializes the NumericValue into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the NumericValue into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) NumericValue¶
Deserializes the NumericValue from a dictionary.
- class databricks.sdk.service.sql.ObjectType¶
A singular noun object type.
- ALERT = "ALERT"¶
- DASHBOARD = "DASHBOARD"¶
- DATA_SOURCE = "DATA_SOURCE"¶
- QUERY = "QUERY"¶
- class databricks.sdk.service.sql.ObjectTypePlural¶
Always a plural of the object type.
- ALERTS = "ALERTS"¶
- DASHBOARDS = "DASHBOARDS"¶
- DATA_SOURCES = "DATA_SOURCES"¶
- QUERIES = "QUERIES"¶
- class databricks.sdk.service.sql.OdbcParams(hostname: 'Optional[str]' = None, path: 'Optional[str]' = None, port: 'Optional[int]' = None, protocol: 'Optional[str]' = None)¶
- hostname: str | None = None¶
- path: str | None = None¶
- port: int | None = None¶
- protocol: str | None = None¶
- as_dict() dict¶
Serializes the OdbcParams into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the OdbcParams into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) OdbcParams¶
Deserializes the OdbcParams from a dictionary.
- class databricks.sdk.service.sql.OwnableObjectType¶
- ALERT = "ALERT"¶
- DASHBOARD = "DASHBOARD"¶
- QUERY = "QUERY"¶
- class databricks.sdk.service.sql.Parameter(enum_options: 'Optional[str]' = None, multi_values_options: 'Optional[MultiValuesOptions]' = None, name: 'Optional[str]' = None, query_id: 'Optional[str]' = None, title: 'Optional[str]' = None, type: 'Optional[ParameterType]' = None, value: 'Optional[Any]' = None)¶
- enum_options: str | None = None¶
List of valid parameter values, newline delimited. Only applies for dropdown list parameters.
- multi_values_options: MultiValuesOptions | None = None¶
If specified, allows multiple values to be selected for this parameter. Only applies to dropdown list and query-based dropdown list parameters.
- name: str | None = None¶
The literal parameter marker that appears between double curly braces in the query text.
- query_id: str | None = None¶
The UUID of the query that provides the parameter values. Only applies for query-based dropdown list parameters.
- title: str | None = None¶
The text displayed in a parameter picking widget.
- type: ParameterType | None = None¶
Parameters can have several different types.
- value: Any | None = None¶
The default value for this parameter.
- as_dict() dict¶
Serializes the Parameter into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the Parameter into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.sql.ParameterType¶
- DATETIME = "DATETIME"¶
- ENUM = "ENUM"¶
- NUMBER = "NUMBER"¶
- QUERY = "QUERY"¶
- TEXT = "TEXT"¶
- class databricks.sdk.service.sql.PermissionLevel¶
CAN_VIEW: Can view the query * CAN_RUN: Can run the query * CAN_EDIT: Can edit the query * CAN_MANAGE: Can manage the query
- CAN_EDIT = "CAN_EDIT"¶
- CAN_MANAGE = "CAN_MANAGE"¶
- CAN_RUN = "CAN_RUN"¶
- CAN_VIEW = "CAN_VIEW"¶
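The enum's description lists the levels from weakest (CAN_VIEW) to strongest (CAN_MANAGE). The API does not define a numeric ranking, so the ordering below is an assumption based on that description; a sketch of a "has at least this level" check over the string values:

```python
# Assumed ordering from weakest to strongest, per the enum's description.
_PERMISSION_ORDER = ["CAN_VIEW", "CAN_RUN", "CAN_EDIT", "CAN_MANAGE"]

def at_least(granted: str, required: str) -> bool:
    # True when the granted level is the required level or stronger.
    return _PERMISSION_ORDER.index(granted) >= _PERMISSION_ORDER.index(required)

can_edit_runs = at_least("CAN_EDIT", "CAN_RUN")
viewer_manages = at_least("CAN_VIEW", "CAN_MANAGE")
```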
- class databricks.sdk.service.sql.PlansState¶
Possible reasons why plans were not saved in the database.
- EMPTY = "EMPTY"¶
- EXISTS = "EXISTS"¶
- IGNORED_LARGE_PLANS_SIZE = "IGNORED_LARGE_PLANS_SIZE"¶
- IGNORED_SMALL_DURATION = "IGNORED_SMALL_DURATION"¶
- IGNORED_SPARK_PLAN_TYPE = "IGNORED_SPARK_PLAN_TYPE"¶
- UNKNOWN = "UNKNOWN"¶
- class databricks.sdk.service.sql.Query(apply_auto_limit: 'Optional[bool]' = None, catalog: 'Optional[str]' = None, create_time: 'Optional[str]' = None, description: 'Optional[str]' = None, display_name: 'Optional[str]' = None, id: 'Optional[str]' = None, last_modifier_user_name: 'Optional[str]' = None, lifecycle_state: 'Optional[LifecycleState]' = None, owner_user_name: 'Optional[str]' = None, parameters: 'Optional[List[QueryParameter]]' = None, parent_path: 'Optional[str]' = None, query_text: 'Optional[str]' = None, run_as_mode: 'Optional[RunAsMode]' = None, schema: 'Optional[str]' = None, tags: 'Optional[List[str]]' = None, update_time: 'Optional[str]' = None, warehouse_id: 'Optional[str]' = None)¶
- apply_auto_limit: bool | None = None¶
Whether to apply a 1000 row limit to the query result.
- catalog: str | None = None¶
Name of the catalog where this query will be executed.
- create_time: str | None = None¶
Timestamp when this query was created.
- description: str | None = None¶
General description that conveys additional information about this query such as usage notes.
- display_name: str | None = None¶
Display name of the query that appears in list views, widget headings, and on the query page.
- id: str | None = None¶
UUID identifying the query.
- last_modifier_user_name: str | None = None¶
Username of the user who last saved changes to this query.
- lifecycle_state: LifecycleState | None = None¶
Indicates whether the query is trashed.
- owner_user_name: str | None = None¶
Username of the user that owns the query.
- parameters: List[QueryParameter] | None = None¶
List of query parameter definitions.
- parent_path: str | None = None¶
Workspace path of the workspace folder containing the object.
- query_text: str | None = None¶
Text of the query to be run.
- schema: str | None = None¶
Name of the schema where this query will be executed.
- tags: List[str] | None = None¶
- update_time: str | None = None¶
Timestamp when this query was last updated.
- warehouse_id: str | None = None¶
ID of the SQL warehouse attached to the query.
- as_dict() dict¶
Serializes the Query into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the Query into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.sql.QueryBackedValue(multi_values_options: 'Optional[MultiValuesOptions]' = None, query_id: 'Optional[str]' = None, values: 'Optional[List[str]]' = None)¶
- multi_values_options: MultiValuesOptions | None = None¶
If specified, allows multiple values to be selected for this parameter.
- query_id: str | None = None¶
UUID of the query that provides the parameter values.
- values: List[str] | None = None¶
List of selected query parameter values.
- as_dict() dict¶
Serializes the QueryBackedValue into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the QueryBackedValue into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) QueryBackedValue¶
Deserializes the QueryBackedValue from a dictionary.
- class databricks.sdk.service.sql.QueryFilter(query_start_time_range: 'Optional[TimeRange]' = None, statement_ids: 'Optional[List[str]]' = None, statuses: 'Optional[List[QueryStatus]]' = None, user_ids: 'Optional[List[int]]' = None, warehouse_ids: 'Optional[List[str]]' = None)¶
- query_start_time_range: TimeRange | None = None¶
A range filter for query submitted time. The time range must be less than or equal to 30 days.
- statement_ids: List[str] | None = None¶
A list of statement IDs.
- statuses: List[QueryStatus] | None = None¶
A list of statuses (QUEUED, RUNNING, CANCELED, FAILED, FINISHED) to match query results. Corresponds to the status field in the response. Filtering for multiple statuses is not recommended. Instead, opt to filter by a single status multiple times and then combine the results.
- user_ids: List[int] | None = None¶
A list of user IDs who ran the queries.
- warehouse_ids: List[str] | None = None¶
A list of warehouse IDs.
- as_dict() dict¶
Serializes the QueryFilter into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the QueryFilter into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) QueryFilter¶
Deserializes the QueryFilter from a dictionary.
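query_start_time_range must cover at most 30 days. A sketch of building such a range in epoch milliseconds; the start_time_ms/end_time_ms key names mirror the TimeRange convention but are an assumption here, as is the helper itself:

```python
import time

DAY_MS = 24 * 60 * 60 * 1000

def last_n_days_range_ms(n_days: int, now_ms: int) -> dict:
    # Build a query_start_time_range-shaped dict covering the last n_days,
    # rejecting spans over the documented 30-day maximum up front.
    if n_days > 30:
        raise ValueError("query_start_time_range must cover at most 30 days")
    return {"start_time_ms": now_ms - n_days * DAY_MS, "end_time_ms": now_ms}

now_ms = int(time.time() * 1000)
time_range = last_n_days_range_ms(7, now_ms)
```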
- class databricks.sdk.service.sql.QueryInfo(cache_query_id: 'Optional[str]' = None, channel_used: 'Optional[ChannelInfo]' = None, client_application: 'Optional[str]' = None, duration: 'Optional[int]' = None, endpoint_id: 'Optional[str]' = None, error_message: 'Optional[str]' = None, executed_as_user_id: 'Optional[int]' = None, executed_as_user_name: 'Optional[str]' = None, execution_end_time_ms: 'Optional[int]' = None, is_final: 'Optional[bool]' = None, lookup_key: 'Optional[str]' = None, metrics: 'Optional[QueryMetrics]' = None, plans_state: 'Optional[PlansState]' = None, query_end_time_ms: 'Optional[int]' = None, query_id: 'Optional[str]' = None, query_source: 'Optional[ExternalQuerySource]' = None, query_start_time_ms: 'Optional[int]' = None, query_tags: 'Optional[List[QueryTag]]' = None, query_text: 'Optional[str]' = None, rows_produced: 'Optional[int]' = None, session_id: 'Optional[str]' = None, spark_ui_url: 'Optional[str]' = None, statement_type: 'Optional[QueryStatementType]' = None, status: 'Optional[QueryStatus]' = None, user_id: 'Optional[int]' = None, user_name: 'Optional[str]' = None, warehouse_id: 'Optional[str]' = None)¶
- cache_query_id: str | None = None¶
The ID of the cached query if this result was retrieved from the cache.
- channel_used: ChannelInfo | None = None¶
SQL Warehouse channel information at the time of query execution
- client_application: str | None = None¶
Client application that ran the statement. For example: Databricks SQL Editor, Tableau, and Power BI. This field is derived from information provided by client applications. While values are expected to remain static over time, this cannot be guaranteed.
- duration: int | None = None¶
Total time of the statement execution. This value does not include the time taken to retrieve the results, which can result in a discrepancy between this value and the start-to-finish wall-clock time.
- endpoint_id: str | None = None¶
Alias for warehouse_id.
- error_message: str | None = None¶
Message describing why the query could not complete.
- executed_as_user_id: int | None = None¶
The ID of the user whose credentials were used to run the query.
- executed_as_user_name: str | None = None¶
The email address or username of the user whose credentials were used to run the query.
- execution_end_time_ms: int | None = None¶
The time execution of the query ended.
- is_final: bool | None = None¶
Whether more updates for the query are expected.
- lookup_key: str | None = None¶
A key that can be used to look up query details.
- metrics: QueryMetrics | None = None¶
Metrics about query execution.
- plans_state: PlansState | None = None¶
Whether plans exist for the execution, or the reason why they are missing
- query_end_time_ms: int | None = None¶
The time the query ended.
- query_id: str | None = None¶
The query ID.
- query_source: ExternalQuerySource | None = None¶
A struct that contains key-value pairs representing Databricks entities that were involved in the execution of this statement, such as jobs, notebooks, or dashboards. This field only records Databricks entities.
- query_start_time_ms: int | None = None¶
The time the query started.
- query_tags: List[QueryTag] | None = None¶
A query execution can be optionally annotated with query tags
- query_text: str | None = None¶
The text of the query.
- rows_produced: int | None = None¶
The number of results returned by the query.
- session_id: str | None = None¶
The Spark session UUID that the query ran on. This is either the Spark Connect, DBSQL, or SDP session ID.
- spark_ui_url: str | None = None¶
URL to the Spark UI query plan.
- statement_type: QueryStatementType | None = None¶
Type of statement for this query
- status: QueryStatus | None = None¶
Query status with one of the following values:
QUEUED: Query has been received and queued. - RUNNING: Query has started. - CANCELED: Query has been cancelled by the user. - FAILED: Query has failed. - FINISHED: Query has completed.
- user_id: int | None = None¶
The ID of the user who ran the query.
- user_name: str | None = None¶
The email address or username of the user who ran the query.
- warehouse_id: str | None = None¶
Warehouse ID.
- as_dict() dict¶
Serializes the QueryInfo into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the QueryInfo into a shallow dictionary of its immediate attributes.
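Because duration excludes the time spent retrieving results, the gap between duration and the query_start_time_ms-to-query_end_time_ms wall-clock span is the unaccounted overhead. A sketch computing that gap from a QueryInfo-shaped dict (the helper name is illustrative, not part of the SDK):

```python
from typing import Optional

def fetch_overhead_ms(info: dict) -> Optional[int]:
    """Wall-clock time not accounted for by `duration` (e.g. result fetch).

    `info` is a QueryInfo-shaped dict; returns None when timings are missing.
    """
    start = info.get("query_start_time_ms")
    end = info.get("query_end_time_ms")
    duration = info.get("duration")
    if start is None or end is None or duration is None:
        return None
    return (end - start) - duration

overhead = fetch_overhead_ms(
    {"query_start_time_ms": 1_000, "query_end_time_ms": 6_000, "duration": 4_200}
)
```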
- class databricks.sdk.service.sql.QueryList(count: 'Optional[int]' = None, page: 'Optional[int]' = None, page_size: 'Optional[int]' = None, results: 'Optional[List[LegacyQuery]]' = None)¶
- count: int | None = None¶
The total number of queries.
- page: int | None = None¶
The page number that is currently displayed.
- page_size: int | None = None¶
The number of queries per page.
- results: List[LegacyQuery] | None = None¶
List of queries returned.
- as_dict() dict¶
Serializes the QueryList into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the QueryList into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.sql.QueryMetrics(compilation_time_ms: int | None = None, execution_time_ms: int | None = None, network_sent_bytes: int | None = None, overloading_queue_start_timestamp: int | None = None, photon_total_time_ms: int | None = None, projected_remaining_task_total_time_ms: int | None = None, projected_remaining_wallclock_time_ms: int | None = None, provisioning_queue_start_timestamp: int | None = None, pruned_bytes: int | None = None, pruned_files_count: int | None = None, query_compilation_start_timestamp: int | None = None, read_bytes: int | None = None, read_cache_bytes: int | None = None, read_files_bytes: int | None = None, read_files_count: int | None = None, read_partitions_count: int | None = None, read_remote_bytes: int | None = None, remaining_task_count: int | None = None, result_fetch_time_ms: int | None = None, result_from_cache: bool | None = None, rows_produced_count: int | None = None, rows_read_count: int | None = None, runnable_tasks: int | None = None, spill_to_disk_bytes: int | None = None, task_time_over_time_range: TaskTimeOverRange | None = None, task_total_time_ms: int | None = None, total_time_ms: int | None = None, work_to_be_done: int | None = None, write_remote_bytes: int | None = None)¶
A query metric that encapsulates a set of measurements for a single query. Metrics come from the driver and are stored in the history service database.
- compilation_time_ms: int | None = None¶
Time spent loading metadata and optimizing the query, in milliseconds.
- execution_time_ms: int | None = None¶
Time spent executing the query, in milliseconds.
- network_sent_bytes: int | None = None¶
Total amount of data sent over the network between executor nodes during shuffle, in bytes.
- overloading_queue_start_timestamp: int | None = None¶
Timestamp of when the query was enqueued waiting while the warehouse was at max load. This field is optional and will not appear if the query skipped the overloading queue.
- photon_total_time_ms: int | None = None¶
Total execution time for all individual Photon query engine tasks in the query, in milliseconds.
- projected_remaining_task_total_time_ms: int | None = None¶
Projected remaining work to be done, aggregated across all stages in the query, in milliseconds.
- projected_remaining_wallclock_time_ms: int | None = None¶
Projected lower bound on the remaining wall-clock time, computed as projected_remaining_task_total_time_ms divided by the maximum concurrency.
- provisioning_queue_start_timestamp: int | None = None¶
Timestamp of when the query was enqueued waiting for a cluster to be provisioned for the warehouse. This field is optional and will not appear if the query skipped the provisioning queue.
- pruned_bytes: int | None = None¶
Total number of file bytes in all tables not read due to pruning.
- pruned_files_count: int | None = None¶
Total number of files from all tables not read due to pruning.
- query_compilation_start_timestamp: int | None = None¶
Timestamp of when the underlying compute started compilation of the query.
- read_bytes: int | None = None¶
Total size of data read by the query, in bytes.
- read_cache_bytes: int | None = None¶
Size of persistent data read from the cache, in bytes.
- read_files_bytes: int | None = None¶
Total number of file bytes in all tables read.
- read_files_count: int | None = None¶
Number of files read after pruning
- read_partitions_count: int | None = None¶
Number of partitions read after pruning.
- read_remote_bytes: int | None = None¶
Size of persistent data read from cloud object storage on your cloud tenant, in bytes.
- remaining_task_count: int | None = None¶
Number of remaining tasks to complete. This is based on the current status and may become bigger or smaller with future updates.
- result_fetch_time_ms: int | None = None¶
Time spent fetching the query results after the execution finished, in milliseconds.
- result_from_cache: bool | None = None¶
true if the query result was fetched from cache, false otherwise.
- rows_produced_count: int | None = None¶
Total number of rows returned by the query.
- rows_read_count: int | None = None¶
Total number of rows read by the query.
- runnable_tasks: int | None = None¶
Number of remaining tasks to complete, calculated by the autoscaler (StatementAnalysis.scala). Deprecated: use remaining_task_count instead.
- spill_to_disk_bytes: int | None = None¶
Size of data temporarily written to disk while executing the query, in bytes.
- task_time_over_time_range: TaskTimeOverRange | None = None¶
Sum of task times completed within a range of wall-clock time, approximated to a configurable number of points and aggregated over all stages and jobs in the query (based on task_total_time_ms).
- task_total_time_ms: int | None = None¶
Sum of execution time for all of the query’s tasks, in milliseconds.
- total_time_ms: int | None = None¶
Total execution time of the query from the client’s point of view, in milliseconds.
- work_to_be_done: int | None = None¶
Remaining work to be done across all stages in the query, calculated by the autoscaler (StatementAnalysis.scala), in milliseconds. Deprecated: use projected_remaining_task_total_time_ms instead.
- write_remote_bytes: int | None = None¶
Size of persistent data written to cloud object storage in your cloud tenant, in bytes.
- as_dict() dict¶
Serializes the QueryMetrics into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the QueryMetrics into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) QueryMetrics¶
Deserializes the QueryMetrics from a dictionary.
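The counters above can be combined into derived ratios. A minimal sketch, assuming a QueryMetrics-style dict such as the one produced by as_dict() (all keys optional, so every field is treated as possibly absent):

```python
def summarize_metrics(m: dict) -> dict:
    """Derive a few ratios from a QueryMetrics-style dict.

    All fields are optional, so missing or None values are treated as zero.
    """
    read = m.get("read_bytes") or 0
    cached = m.get("read_cache_bytes") or 0
    files_read = m.get("read_files_count") or 0
    files_pruned = m.get("pruned_files_count") or 0
    total_files = files_read + files_pruned
    return {
        # Fraction of bytes served from cache; None when nothing was read.
        "cache_read_fraction": cached / read if read else None,
        # Fraction of candidate files skipped by pruning.
        "files_pruned_fraction": files_pruned / total_files if total_files else None,
        # Client-observed time not spent in execution (compilation, fetch, queueing).
        "overhead_ms": (m.get("total_time_ms") or 0) - (m.get("execution_time_ms") or 0),
    }
```

The helper name and the dict-shaped input are illustrative, not part of the SDK.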
- class databricks.sdk.service.sql.QueryOptions(catalog: 'Optional[str]' = None, moved_to_trash_at: 'Optional[str]' = None, parameters: 'Optional[List[Parameter]]' = None, schema: 'Optional[str]' = None)¶
- catalog: str | None = None¶
The name of the catalog to execute this query in.
- moved_to_trash_at: str | None = None¶
The timestamp when this query was moved to trash. Only present when the is_archived property is true. Trashed items are deleted after thirty days.
- schema: str | None = None¶
The name of the schema to execute this query in.
- as_dict() dict¶
Serializes the QueryOptions into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the QueryOptions into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) QueryOptions¶
Deserializes the QueryOptions from a dictionary.
- class databricks.sdk.service.sql.QueryParameter(date_range_value: 'Optional[DateRangeValue]' = None, date_value: 'Optional[DateValue]' = None, enum_value: 'Optional[EnumValue]' = None, name: 'Optional[str]' = None, numeric_value: 'Optional[NumericValue]' = None, query_backed_value: 'Optional[QueryBackedValue]' = None, text_value: 'Optional[TextValue]' = None, title: 'Optional[str]' = None)¶
- date_range_value: DateRangeValue | None = None¶
Date-range query parameter value. Can only specify one of dynamic_date_range_value or date_range_value.
- date_value: DateValue | None = None¶
Date query parameter value. Can only specify one of dynamic_date_value or date_value.
- name: str | None = None¶
Literal parameter marker that appears between double curly braces in the query text.
- numeric_value: NumericValue | None = None¶
Numeric query parameter value.
- query_backed_value: QueryBackedValue | None = None¶
Query-based dropdown query parameter value.
- title: str | None = None¶
Text displayed in the user-facing parameter widget in the UI.
- as_dict() dict¶
Serializes the QueryParameter into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the QueryParameter into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) QueryParameter¶
Deserializes the QueryParameter from a dictionary.
- class databricks.sdk.service.sql.QueryStatementType¶
- ALTER = "ALTER"¶
- ANALYZE = "ANALYZE"¶
- COPY = "COPY"¶
- CREATE = "CREATE"¶
- DELETE = "DELETE"¶
- DESCRIBE = "DESCRIBE"¶
- DROP = "DROP"¶
- EXPLAIN = "EXPLAIN"¶
- GRANT = "GRANT"¶
- INSERT = "INSERT"¶
- MERGE = "MERGE"¶
- OPTIMIZE = "OPTIMIZE"¶
- OTHER = "OTHER"¶
- REFRESH = "REFRESH"¶
- REPLACE = "REPLACE"¶
- REVOKE = "REVOKE"¶
- SELECT = "SELECT"¶
- SET = "SET"¶
- SHOW = "SHOW"¶
- TRUNCATE = "TRUNCATE"¶
- UPDATE = "UPDATE"¶
- USE = "USE"¶
- class databricks.sdk.service.sql.QueryStatus¶
Statuses that are also used by OperationStatus in the runtime. When adding a new QueryStatus, make sure to also update com.databricks.sqlgateway.history.QueryStatusOrdering.
- CANCELED = "CANCELED"¶
- COMPILED = "COMPILED"¶
- COMPILING = "COMPILING"¶
- FAILED = "FAILED"¶
- FINISHED = "FINISHED"¶
- QUEUED = "QUEUED"¶
- RUNNING = "RUNNING"¶
- STARTED = "STARTED"¶
- class databricks.sdk.service.sql.QueryTag(key: str, value: str | None = None)¶
A query execution can be annotated with optional key-value pairs, allowing users to attribute executions by key and filter by the optional value. QueryTag is the user-facing representation.
- key: str¶
- value: str | None = None¶
- as_dict() dict¶
Serializes the QueryTag into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the QueryTag into a shallow dictionary of its immediate attributes.
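The as_dict()/from_dict() convention used throughout these dataclasses can be illustrated with a minimal stand-in mirroring QueryTag's shape (a required key and an optional value); this is a sketch, not the SDK's own implementation:

```python
from dataclasses import dataclass
from typing import Any, Dict, Optional

@dataclass
class MiniQueryTag:
    """Minimal stand-in mirroring QueryTag: required key, optional value."""
    key: str
    value: Optional[str] = None

    def as_dict(self) -> dict:
        # Like the SDK's as_dict(): omit unset fields so the JSON body stays sparse.
        body: Dict[str, Any] = {"key": self.key}
        if self.value is not None:
            body["value"] = self.value
        return body

    @classmethod
    def from_dict(cls, d: Dict[str, Any]) -> "MiniQueryTag":
        # Like the SDK's from_dict(): tolerate missing optional fields.
        return cls(key=d["key"], value=d.get("value"))
```

A round trip through as_dict() and from_dict() reproduces the original object, which is what makes these dicts usable as JSON request bodies.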
- class databricks.sdk.service.sql.RepeatedEndpointConfPairs(config_pair: 'Optional[List[EndpointConfPair]]' = None, configuration_pairs: 'Optional[List[EndpointConfPair]]' = None)¶
- config_pair: List[EndpointConfPair] | None = None¶
Deprecated: use configuration_pairs instead.
- configuration_pairs: List[EndpointConfPair] | None = None¶
- as_dict() dict¶
Serializes the RepeatedEndpointConfPairs into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the RepeatedEndpointConfPairs into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) RepeatedEndpointConfPairs¶
Deserializes the RepeatedEndpointConfPairs from a dictionary.
- class databricks.sdk.service.sql.RestoreResponse¶
- as_dict() dict¶
Serializes the RestoreResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the RestoreResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) RestoreResponse¶
Deserializes the RestoreResponse from a dictionary.
- class databricks.sdk.service.sql.ResultData(byte_count: int | None = None, chunk_index: int | None = None, data_array: List[List[str]] | None = None, external_links: List[ExternalLink] | None = None, next_chunk_index: int | None = None, next_chunk_internal_link: str | None = None, row_count: int | None = None, row_offset: int | None = None)¶
Contains the result data of a single chunk when using INLINE disposition. When using EXTERNAL_LINKS disposition, the external_links array is used instead to provide URLs to the result data in cloud storage. Exactly one of these alternatives is used. (While the external_links array prepares the API to return multiple links in a single response, currently only a single link is returned.)
- byte_count: int | None = None¶
The number of bytes in the result chunk. This field is not available when using INLINE disposition.
- chunk_index: int | None = None¶
The position within the sequence of result set chunks.
- data_array: List[List[str]] | None = None¶
The JSON_ARRAY format is an array of arrays of values, where each non-null value is formatted as a string. Null values are encoded as JSON null.
- external_links: List[ExternalLink] | None = None¶
- next_chunk_index: int | None = None¶
When fetching, provides the chunk_index for the _next_ chunk. If absent, indicates there are no more chunks. The next chunk can be fetched with a :method:statementexecution/getstatementresultchunkn request.
- next_chunk_internal_link: str | None = None¶
When fetching, provides a link to fetch the _next_ chunk. If absent, indicates there are no more chunks. This link is an absolute path to be joined with your $DATABRICKS_HOST, and should be treated as an opaque link. This is an alternative to using next_chunk_index.
- row_count: int | None = None¶
The number of rows within the result chunk.
- row_offset: int | None = None¶
The starting row offset within the result set.
- as_dict() dict¶
Serializes the ResultData into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ResultData into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ResultData¶
Deserializes the ResultData from a dictionary.
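The next_chunk_index field drives pagination through a multi-chunk result set: each chunk points at the next until the field is absent. A sketch of that loop over ResultData-style dicts, where fetch_chunk is a hypothetical callable standing in for the statement-execution "get result chunk n" request:

```python
from typing import Callable, Iterator, List, Optional

def iter_rows(first_chunk: dict,
              fetch_chunk: Callable[[int], dict]) -> Iterator[List[str]]:
    """Yield rows across all chunks, following next_chunk_index until absent.

    `fetch_chunk(i)` is a stand-in for fetching chunk i from the API and must
    return a ResultData-style dict (keys: data_array, next_chunk_index, ...).
    """
    chunk: Optional[dict] = first_chunk
    while chunk is not None:
        # data_array is the INLINE JSON_ARRAY format: a list of rows of strings.
        for row in chunk.get("data_array") or []:
            yield row
        nxt = chunk.get("next_chunk_index")
        # Absent next_chunk_index means this was the last chunk.
        chunk = fetch_chunk(nxt) if nxt is not None else None
```

With EXTERNAL_LINKS disposition the same loop shape applies, but you would follow external_links (or next_chunk_internal_link) instead of reading data_array inline.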
- class databricks.sdk.service.sql.ResultManifest(chunks: List[BaseChunkInfo] | None = None, format: Format | None = None, schema: ResultSchema | None = None, total_byte_count: int | None = None, total_chunk_count: int | None = None, total_row_count: int | None = None, truncated: bool | None = None)¶
The result manifest provides schema and metadata for the result set.
- chunks: List[BaseChunkInfo] | None = None¶
Array of result set chunk metadata.
- schema: ResultSchema | None = None¶
- total_byte_count: int | None = None¶
The total number of bytes in the result set. This field is not available when using INLINE disposition.
- total_chunk_count: int | None = None¶
The total number of chunks that the result set has been divided into.
- total_row_count: int | None = None¶
The total number of rows in the result set.
- truncated: bool | None = None¶
Indicates whether the result is truncated due to row_limit or byte_limit.
- as_dict() dict¶
Serializes the ResultManifest into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ResultManifest into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ResultManifest¶
Deserializes the ResultManifest from a dictionary.
- class databricks.sdk.service.sql.ResultSchema(column_count: int | None = None, columns: List[ColumnInfo] | None = None)¶
The schema is an ordered list of column descriptions.
- column_count: int | None = None¶
- columns: List[ColumnInfo] | None = None¶
- as_dict() dict¶
Serializes the ResultSchema into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ResultSchema into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ResultSchema¶
Deserializes the ResultSchema from a dictionary.
- class databricks.sdk.service.sql.ServiceError(error_code: 'Optional[ServiceErrorCode]' = None, message: 'Optional[str]' = None)¶
- error_code: ServiceErrorCode | None = None¶
- message: str | None = None¶
A brief summary of the error condition.
- as_dict() dict¶
Serializes the ServiceError into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the ServiceError into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) ServiceError¶
Deserializes the ServiceError from a dictionary.
- class databricks.sdk.service.sql.ServiceErrorCode¶
- ABORTED = "ABORTED"¶
- ALREADY_EXISTS = "ALREADY_EXISTS"¶
- BAD_REQUEST = "BAD_REQUEST"¶
- CANCELLED = "CANCELLED"¶
- DEADLINE_EXCEEDED = "DEADLINE_EXCEEDED"¶
- INTERNAL_ERROR = "INTERNAL_ERROR"¶
- IO_ERROR = "IO_ERROR"¶
- NOT_FOUND = "NOT_FOUND"¶
- RESOURCE_EXHAUSTED = "RESOURCE_EXHAUSTED"¶
- SERVICE_UNDER_MAINTENANCE = "SERVICE_UNDER_MAINTENANCE"¶
- TEMPORARILY_UNAVAILABLE = "TEMPORARILY_UNAVAILABLE"¶
- UNAUTHENTICATED = "UNAUTHENTICATED"¶
- UNKNOWN = "UNKNOWN"¶
- WORKSPACE_TEMPORARILY_UNAVAILABLE = "WORKSPACE_TEMPORARILY_UNAVAILABLE"¶
- class databricks.sdk.service.sql.SetResponse(access_control_list: 'Optional[List[AccessControl]]' = None, object_id: 'Optional[str]' = None, object_type: 'Optional[ObjectType]' = None)¶
- access_control_list: List[AccessControl] | None = None¶
- object_id: str | None = None¶
An object’s type and UUID, separated by a forward slash (/) character.
- object_type: ObjectType | None = None¶
A singular noun object type.
- as_dict() dict¶
Serializes the SetResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the SetResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) SetResponse¶
Deserializes the SetResponse from a dictionary.
- class databricks.sdk.service.sql.SetWorkspaceWarehouseConfigRequestSecurityPolicy¶
Security policy to be used for warehouses.
- DATA_ACCESS_CONTROL = "DATA_ACCESS_CONTROL"¶
- NONE = "NONE"¶
- PASSTHROUGH = "PASSTHROUGH"¶
- class databricks.sdk.service.sql.SetWorkspaceWarehouseConfigResponse¶
- as_dict() dict¶
Serializes the SetWorkspaceWarehouseConfigResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the SetWorkspaceWarehouseConfigResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) SetWorkspaceWarehouseConfigResponse¶
Deserializes the SetWorkspaceWarehouseConfigResponse from a dictionary.
- class databricks.sdk.service.sql.SpotInstancePolicy¶
EndpointSpotInstancePolicy configures whether the endpoint should use spot instances. The policy maps to per-cloud configurations as follows:
- AWS: COST_OPTIMIZED = On-demand driver with spot executors; RELIABILITY_OPTIMIZED = On-demand driver and executors.
- AZURE: COST_OPTIMIZED = On-demand driver and executors; RELIABILITY_OPTIMIZED = On-demand driver and executors.
While including “spot” in the enum name may limit the future extensibility of this field (it restricts this enum to denoting “spot or not”), this is the field that PM recommends after discussion with customers per SC-48783.
- COST_OPTIMIZED = "COST_OPTIMIZED"¶
- POLICY_UNSPECIFIED = "POLICY_UNSPECIFIED"¶
- RELIABILITY_OPTIMIZED = "RELIABILITY_OPTIMIZED"¶
- class databricks.sdk.service.sql.StartWarehouseResponse¶
- as_dict() dict¶
Serializes the StartWarehouseResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the StartWarehouseResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) StartWarehouseResponse¶
Deserializes the StartWarehouseResponse from a dictionary.
- class databricks.sdk.service.sql.State¶
State of a warehouse.
- DELETED = "DELETED"¶
- DELETING = "DELETING"¶
- RUNNING = "RUNNING"¶
- STARTING = "STARTING"¶
- STOPPED = "STOPPED"¶
- STOPPING = "STOPPING"¶
- class databricks.sdk.service.sql.StatementParameterListItem(name: 'str', type: 'Optional[str]' = None, value: 'Optional[str]' = None)¶
- name: str¶
The name of a parameter marker to be substituted in the statement.
- type: str | None = None¶
The data type, given as a string. For example: INT, STRING, DECIMAL(10,2). If no type is given the type is assumed to be STRING. Complex types, such as ARRAY, MAP, and STRUCT are not supported. For valid types, refer to the section [Data types] of the SQL language reference.
- value: str | None = None¶
The value to substitute, represented as a string. If omitted, the value is interpreted as NULL.
- as_dict() dict¶
Serializes the StatementParameterListItem into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the StatementParameterListItem into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) StatementParameterListItem¶
Deserializes the StatementParameterListItem from a dictionary.
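The fields above map directly onto the JSON request body: a value is always sent as a string, an omitted value means NULL, and an omitted type defaults to STRING on the server. A small helper sketch for building such items as plain dicts (the helper name is illustrative, not part of the SDK):

```python
from typing import Any, Optional

def statement_parameter(name: str, value: Any = None,
                        type: Optional[str] = None) -> dict:
    """Build one StatementParameterListItem-style dict for a JSON request body."""
    item = {"name": name}
    if value is not None:
        # Values are always transmitted as strings, regardless of SQL type.
        item["value"] = str(value)
    # If omitted, the value is interpreted as NULL.
    if type is not None:
        # If omitted, the type is assumed to be STRING.
        item["type"] = type
    return item
```

For a statement like `SELECT * FROM sales WHERE price >= :min_price AND category = :category`, the parameter list could be `[statement_parameter("min_price", 19.99, "DECIMAL(10,2)"), statement_parameter("category", "books")]`.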
- class databricks.sdk.service.sql.StatementResponse(manifest: 'Optional[ResultManifest]' = None, result: 'Optional[ResultData]' = None, statement_id: 'Optional[str]' = None, status: 'Optional[StatementStatus]' = None)¶
- manifest: ResultManifest | None = None¶
- result: ResultData | None = None¶
- statement_id: str | None = None¶
The statement ID is returned upon successfully submitting a SQL statement, and is a required reference for all subsequent calls.
- status: StatementStatus | None = None¶
- as_dict() dict¶
Serializes the StatementResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the StatementResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) StatementResponse¶
Deserializes the StatementResponse from a dictionary.
- class databricks.sdk.service.sql.StatementState¶
- CANCELED = "CANCELED"¶
- CLOSED = "CLOSED"¶
- FAILED = "FAILED"¶
- PENDING = "PENDING"¶
- RUNNING = "RUNNING"¶
- SUCCEEDED = "SUCCEEDED"¶
- class databricks.sdk.service.sql.StatementStatus(error: ServiceError | None = None, sql_state: str | None = None, state: StatementState | None = None)¶
The status response includes execution state and if relevant, error information.
- error: ServiceError | None = None¶
- sql_state: str | None = None¶
SQLSTATE error code returned when the statement execution fails. Only populated when the statement status is FAILED.
- state: StatementState | None = None¶
Statement execution state: - PENDING: waiting for warehouse - RUNNING: running - SUCCEEDED: execution was successful, result data available for fetch - FAILED: execution failed; reason for failure described in accompanying error message - CANCELED: user canceled; can come from explicit cancel call, or timeout with on_wait_timeout=CANCEL - CLOSED: execution successful, and statement closed; result no longer available for fetch
- as_dict() dict¶
Serializes the StatementStatus into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the StatementStatus into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) StatementStatus¶
Deserializes the StatementStatus from a dictionary.
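Because PENDING and RUNNING are transient while SUCCEEDED, FAILED, CANCELED, and CLOSED are terminal, a client typically polls the statement until it reaches a terminal StatementState. A minimal sketch, where get_status is a hypothetical callable standing in for re-fetching the statement and returning a StatementStatus-style dict:

```python
import time
from typing import Callable

# Terminal StatementState values; PENDING and RUNNING are transient.
TERMINAL_STATES = {"SUCCEEDED", "FAILED", "CANCELED", "CLOSED"}

def wait_for_statement(get_status: Callable[[], dict],
                       poll_interval_s: float = 1.0,
                       timeout_s: float = 600.0,
                       sleep: Callable[[float], None] = time.sleep) -> dict:
    """Poll until the statement reaches a terminal state or the timeout expires.

    `get_status()` stands in for re-fetching the statement status from the API;
    `sleep` is injectable so the loop can be exercised without real waiting.
    """
    deadline = time.monotonic() + timeout_s
    while True:
        status = get_status()
        if status.get("state") in TERMINAL_STATES:
            return status
        if time.monotonic() >= deadline:
            raise TimeoutError("statement did not reach a terminal state")
        sleep(poll_interval_s)
```

On a FAILED result, the accompanying error and sql_state fields describe the failure.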
- class databricks.sdk.service.sql.Status¶
- DEGRADED = "DEGRADED"¶
- FAILED = "FAILED"¶
- HEALTHY = "HEALTHY"¶
- class databricks.sdk.service.sql.StopWarehouseResponse¶
- as_dict() dict¶
Serializes the StopWarehouseResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the StopWarehouseResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) StopWarehouseResponse¶
Deserializes the StopWarehouseResponse from a dictionary.
- class databricks.sdk.service.sql.Success(message: 'Optional[SuccessMessage]' = None)¶
- message: SuccessMessage | None = None¶
- as_dict() dict¶
Serializes the Success into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the Success into a shallow dictionary of its immediate attributes.
- class databricks.sdk.service.sql.TaskTimeOverRange(entries: 'Optional[List[TaskTimeOverRangeEntry]]' = None, interval: 'Optional[int]' = None)¶
- entries: List[TaskTimeOverRangeEntry] | None = None¶
- interval: int | None = None¶
Interval length for all entries (the difference between the start and end time of an entry’s range); it is the same for all entries. The start time of the first interval is query_start_time_ms.
- as_dict() dict¶
Serializes the TaskTimeOverRange into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the TaskTimeOverRange into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) TaskTimeOverRange¶
Deserializes the TaskTimeOverRange from a dictionary.
- class databricks.sdk.service.sql.TaskTimeOverRangeEntry(task_completed_time_ms: 'Optional[int]' = None)¶
- task_completed_time_ms: int | None = None¶
Total task completion time within this time range, aggregated over all stages and jobs in the query.
- as_dict() dict¶
Serializes the TaskTimeOverRangeEntry into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the TaskTimeOverRangeEntry into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) TaskTimeOverRangeEntry¶
Deserializes the TaskTimeOverRangeEntry from a dictionary.
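Since every entry covers one fixed-length interval and the first interval starts at query_start_time_ms, the wall-clock bucket boundaries can be reconstructed from the interval length alone. A sketch over dict-shaped entries (the helper name is illustrative):

```python
from typing import List, Tuple

def task_time_buckets(query_start_time_ms: int, interval_ms: int,
                      entries: List[dict]) -> List[Tuple[int, int, int]]:
    """Expand TaskTimeOverRange into (bucket_start_ms, bucket_end_ms, task_ms).

    Entry i covers [start + i * interval, start + (i + 1) * interval), where
    start is query_start_time_ms and interval is the shared interval length.
    """
    buckets = []
    for i, entry in enumerate(entries):
        start = query_start_time_ms + i * interval_ms
        buckets.append((start, start + interval_ms,
                        entry.get("task_completed_time_ms") or 0))
    return buckets
```

This makes it straightforward to plot task time against wall-clock time for a single query.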
- class databricks.sdk.service.sql.TerminationReason(code: 'Optional[TerminationReasonCode]' = None, parameters: 'Optional[Dict[str, str]]' = None, type: 'Optional[TerminationReasonType]' = None)¶
- code: TerminationReasonCode | None = None¶
Status code indicating why the cluster was terminated.
- parameters: Dict[str, str] | None = None¶
A list of parameters that provide additional information about why the cluster was terminated.
- type: TerminationReasonType | None = None¶
Type of the termination.
- as_dict() dict¶
Serializes the TerminationReason into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the TerminationReason into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) TerminationReason¶
Deserializes the TerminationReason from a dictionary.
- class databricks.sdk.service.sql.TerminationReasonCode¶
The status code indicating why the cluster was terminated.
- ABUSE_DETECTED = "ABUSE_DETECTED"¶
- ACCESS_TOKEN_FAILURE = "ACCESS_TOKEN_FAILURE"¶
- ALLOCATION_TIMEOUT = "ALLOCATION_TIMEOUT"¶
- ALLOCATION_TIMEOUT_NODE_DAEMON_NOT_READY = "ALLOCATION_TIMEOUT_NODE_DAEMON_NOT_READY"¶
- ALLOCATION_TIMEOUT_NO_HEALTHY_AND_WARMED_UP_CLUSTERS = "ALLOCATION_TIMEOUT_NO_HEALTHY_AND_WARMED_UP_CLUSTERS"¶
- ALLOCATION_TIMEOUT_NO_HEALTHY_CLUSTERS = "ALLOCATION_TIMEOUT_NO_HEALTHY_CLUSTERS"¶
- ALLOCATION_TIMEOUT_NO_MATCHED_CLUSTERS = "ALLOCATION_TIMEOUT_NO_MATCHED_CLUSTERS"¶
- ALLOCATION_TIMEOUT_NO_READY_CLUSTERS = "ALLOCATION_TIMEOUT_NO_READY_CLUSTERS"¶
- ALLOCATION_TIMEOUT_NO_UNALLOCATED_CLUSTERS = "ALLOCATION_TIMEOUT_NO_UNALLOCATED_CLUSTERS"¶
- ALLOCATION_TIMEOUT_NO_WARMED_UP_CLUSTERS = "ALLOCATION_TIMEOUT_NO_WARMED_UP_CLUSTERS"¶
- ATTACH_PROJECT_FAILURE = "ATTACH_PROJECT_FAILURE"¶
- AWS_AUTHORIZATION_FAILURE = "AWS_AUTHORIZATION_FAILURE"¶
- AWS_INACCESSIBLE_KMS_KEY_FAILURE = "AWS_INACCESSIBLE_KMS_KEY_FAILURE"¶
- AWS_INSTANCE_PROFILE_UPDATE_FAILURE = "AWS_INSTANCE_PROFILE_UPDATE_FAILURE"¶
- AWS_INSUFFICIENT_FREE_ADDRESSES_IN_SUBNET_FAILURE = "AWS_INSUFFICIENT_FREE_ADDRESSES_IN_SUBNET_FAILURE"¶
- AWS_INSUFFICIENT_INSTANCE_CAPACITY_FAILURE = "AWS_INSUFFICIENT_INSTANCE_CAPACITY_FAILURE"¶
- AWS_INVALID_KEY_PAIR = "AWS_INVALID_KEY_PAIR"¶
- AWS_INVALID_KMS_KEY_STATE = "AWS_INVALID_KMS_KEY_STATE"¶
- AWS_MAX_SPOT_INSTANCE_COUNT_EXCEEDED_FAILURE = "AWS_MAX_SPOT_INSTANCE_COUNT_EXCEEDED_FAILURE"¶
- AWS_REQUEST_LIMIT_EXCEEDED = "AWS_REQUEST_LIMIT_EXCEEDED"¶
- AWS_RESOURCE_QUOTA_EXCEEDED = "AWS_RESOURCE_QUOTA_EXCEEDED"¶
- AWS_UNSUPPORTED_FAILURE = "AWS_UNSUPPORTED_FAILURE"¶
- AZURE_BYOK_KEY_PERMISSION_FAILURE = "AZURE_BYOK_KEY_PERMISSION_FAILURE"¶
- AZURE_EPHEMERAL_DISK_FAILURE = "AZURE_EPHEMERAL_DISK_FAILURE"¶
- AZURE_INVALID_DEPLOYMENT_TEMPLATE = "AZURE_INVALID_DEPLOYMENT_TEMPLATE"¶
- AZURE_OPERATION_NOT_ALLOWED_EXCEPTION = "AZURE_OPERATION_NOT_ALLOWED_EXCEPTION"¶
- AZURE_PACKED_DEPLOYMENT_PARTIAL_FAILURE = "AZURE_PACKED_DEPLOYMENT_PARTIAL_FAILURE"¶
- AZURE_QUOTA_EXCEEDED_EXCEPTION = "AZURE_QUOTA_EXCEEDED_EXCEPTION"¶
- AZURE_RESOURCE_MANAGER_THROTTLING = "AZURE_RESOURCE_MANAGER_THROTTLING"¶
- AZURE_RESOURCE_PROVIDER_THROTTLING = "AZURE_RESOURCE_PROVIDER_THROTTLING"¶
- AZURE_UNEXPECTED_DEPLOYMENT_TEMPLATE_FAILURE = "AZURE_UNEXPECTED_DEPLOYMENT_TEMPLATE_FAILURE"¶
- AZURE_VM_EXTENSION_FAILURE = "AZURE_VM_EXTENSION_FAILURE"¶
- AZURE_VNET_CONFIGURATION_FAILURE = "AZURE_VNET_CONFIGURATION_FAILURE"¶
- BOOTSTRAP_TIMEOUT = "BOOTSTRAP_TIMEOUT"¶
- BOOTSTRAP_TIMEOUT_CLOUD_PROVIDER_EXCEPTION = "BOOTSTRAP_TIMEOUT_CLOUD_PROVIDER_EXCEPTION"¶
- BOOTSTRAP_TIMEOUT_DUE_TO_MISCONFIG = "BOOTSTRAP_TIMEOUT_DUE_TO_MISCONFIG"¶
- BUDGET_POLICY_LIMIT_ENFORCEMENT_ACTIVATED = "BUDGET_POLICY_LIMIT_ENFORCEMENT_ACTIVATED"¶
- BUDGET_POLICY_RESOLUTION_FAILURE = "BUDGET_POLICY_RESOLUTION_FAILURE"¶
- CLOUD_ACCOUNT_POD_QUOTA_EXCEEDED = "CLOUD_ACCOUNT_POD_QUOTA_EXCEEDED"¶
- CLOUD_ACCOUNT_SETUP_FAILURE = "CLOUD_ACCOUNT_SETUP_FAILURE"¶
- CLOUD_OPERATION_CANCELLED = "CLOUD_OPERATION_CANCELLED"¶
- CLOUD_PROVIDER_DISK_SETUP_FAILURE = "CLOUD_PROVIDER_DISK_SETUP_FAILURE"¶
- CLOUD_PROVIDER_INSTANCE_NOT_LAUNCHED = "CLOUD_PROVIDER_INSTANCE_NOT_LAUNCHED"¶
- CLOUD_PROVIDER_LAUNCH_FAILURE = "CLOUD_PROVIDER_LAUNCH_FAILURE"¶
- CLOUD_PROVIDER_LAUNCH_FAILURE_DUE_TO_MISCONFIG = "CLOUD_PROVIDER_LAUNCH_FAILURE_DUE_TO_MISCONFIG"¶
- CLOUD_PROVIDER_RESOURCE_STOCKOUT = "CLOUD_PROVIDER_RESOURCE_STOCKOUT"¶
- CLOUD_PROVIDER_RESOURCE_STOCKOUT_DUE_TO_MISCONFIG = "CLOUD_PROVIDER_RESOURCE_STOCKOUT_DUE_TO_MISCONFIG"¶
- CLOUD_PROVIDER_SHUTDOWN = "CLOUD_PROVIDER_SHUTDOWN"¶
- CLUSTER_OPERATION_THROTTLED = "CLUSTER_OPERATION_THROTTLED"¶
- CLUSTER_OPERATION_TIMEOUT = "CLUSTER_OPERATION_TIMEOUT"¶
- COMMUNICATION_LOST = "COMMUNICATION_LOST"¶
- CONTAINER_LAUNCH_FAILURE = "CONTAINER_LAUNCH_FAILURE"¶
- CONTROL_PLANE_CONNECTION_FAILURE = "CONTROL_PLANE_CONNECTION_FAILURE"¶
- CONTROL_PLANE_CONNECTION_FAILURE_DUE_TO_MISCONFIG = "CONTROL_PLANE_CONNECTION_FAILURE_DUE_TO_MISCONFIG"¶
- CONTROL_PLANE_REQUEST_FAILURE = "CONTROL_PLANE_REQUEST_FAILURE"¶
- CONTROL_PLANE_REQUEST_FAILURE_DUE_TO_MISCONFIG = "CONTROL_PLANE_REQUEST_FAILURE_DUE_TO_MISCONFIG"¶
- DATABASE_CONNECTION_FAILURE = "DATABASE_CONNECTION_FAILURE"¶
- DATA_ACCESS_CONFIG_CHANGED = "DATA_ACCESS_CONFIG_CHANGED"¶
- DBFS_COMPONENT_UNHEALTHY = "DBFS_COMPONENT_UNHEALTHY"¶
- DBR_IMAGE_RESOLUTION_FAILURE = "DBR_IMAGE_RESOLUTION_FAILURE"¶
- DISASTER_RECOVERY_REPLICATION = "DISASTER_RECOVERY_REPLICATION"¶
- DNS_RESOLUTION_ERROR = "DNS_RESOLUTION_ERROR"¶
- DOCKER_CONTAINER_CREATION_EXCEPTION = "DOCKER_CONTAINER_CREATION_EXCEPTION"¶
- DOCKER_IMAGE_PULL_FAILURE = "DOCKER_IMAGE_PULL_FAILURE"¶
- DOCKER_IMAGE_TOO_LARGE_FOR_INSTANCE_EXCEPTION = "DOCKER_IMAGE_TOO_LARGE_FOR_INSTANCE_EXCEPTION"¶
- DOCKER_INVALID_OS_EXCEPTION = "DOCKER_INVALID_OS_EXCEPTION"¶
- DRIVER_EVICTION = "DRIVER_EVICTION"¶
- DRIVER_LAUNCH_TIMEOUT = "DRIVER_LAUNCH_TIMEOUT"¶
- DRIVER_NODE_UNREACHABLE = "DRIVER_NODE_UNREACHABLE"¶
- DRIVER_OUT_OF_DISK = "DRIVER_OUT_OF_DISK"¶
- DRIVER_OUT_OF_MEMORY = "DRIVER_OUT_OF_MEMORY"¶
- DRIVER_POD_CREATION_FAILURE = "DRIVER_POD_CREATION_FAILURE"¶
- DRIVER_UNEXPECTED_FAILURE = "DRIVER_UNEXPECTED_FAILURE"¶
- DRIVER_UNHEALTHY = "DRIVER_UNHEALTHY"¶
- DRIVER_UNREACHABLE = "DRIVER_UNREACHABLE"¶
- DRIVER_UNRESPONSIVE = "DRIVER_UNRESPONSIVE"¶
- DYNAMIC_SPARK_CONF_SIZE_EXCEEDED = "DYNAMIC_SPARK_CONF_SIZE_EXCEEDED"¶
- EOS_SPARK_IMAGE = "EOS_SPARK_IMAGE"¶
- EXECUTION_COMPONENT_UNHEALTHY = "EXECUTION_COMPONENT_UNHEALTHY"¶
- EXECUTOR_POD_UNSCHEDULED = "EXECUTOR_POD_UNSCHEDULED"¶
- GCP_API_RATE_QUOTA_EXCEEDED = "GCP_API_RATE_QUOTA_EXCEEDED"¶
- GCP_DENIED_BY_ORG_POLICY = "GCP_DENIED_BY_ORG_POLICY"¶
- GCP_FORBIDDEN = "GCP_FORBIDDEN"¶
- GCP_IAM_TIMEOUT = "GCP_IAM_TIMEOUT"¶
- GCP_INACCESSIBLE_KMS_KEY_FAILURE = "GCP_INACCESSIBLE_KMS_KEY_FAILURE"¶
- GCP_INSUFFICIENT_CAPACITY = "GCP_INSUFFICIENT_CAPACITY"¶
- GCP_IP_SPACE_EXHAUSTED = "GCP_IP_SPACE_EXHAUSTED"¶
- GCP_KMS_KEY_PERMISSION_DENIED = "GCP_KMS_KEY_PERMISSION_DENIED"¶
- GCP_NOT_FOUND = "GCP_NOT_FOUND"¶
- GCP_QUOTA_EXCEEDED = "GCP_QUOTA_EXCEEDED"¶
- GCP_RESOURCE_QUOTA_EXCEEDED = "GCP_RESOURCE_QUOTA_EXCEEDED"¶
- GCP_SERVICE_ACCOUNT_ACCESS_DENIED = "GCP_SERVICE_ACCOUNT_ACCESS_DENIED"¶
- GCP_SERVICE_ACCOUNT_DELETED = "GCP_SERVICE_ACCOUNT_DELETED"¶
- GCP_SERVICE_ACCOUNT_NOT_FOUND = "GCP_SERVICE_ACCOUNT_NOT_FOUND"¶
- GCP_SUBNET_NOT_READY = "GCP_SUBNET_NOT_READY"¶
- GCP_TRUSTED_IMAGE_PROJECTS_VIOLATED = "GCP_TRUSTED_IMAGE_PROJECTS_VIOLATED"¶
- GKE_BASED_CLUSTER_TERMINATION = "GKE_BASED_CLUSTER_TERMINATION"¶
- GLOBAL_INIT_SCRIPT_FAILURE = "GLOBAL_INIT_SCRIPT_FAILURE"¶
- HIVEMETASTORE_CONNECTIVITY_FAILURE = "HIVEMETASTORE_CONNECTIVITY_FAILURE"¶
- HIVE_METASTORE_PROVISIONING_FAILURE = "HIVE_METASTORE_PROVISIONING_FAILURE"¶
- IMAGE_PULL_PERMISSION_DENIED = "IMAGE_PULL_PERMISSION_DENIED"¶
- INACTIVITY = "INACTIVITY"¶
- INIT_CONTAINER_NOT_FINISHED = "INIT_CONTAINER_NOT_FINISHED"¶
- INIT_SCRIPT_FAILURE = "INIT_SCRIPT_FAILURE"¶
- INSTANCE_POOL_CLUSTER_FAILURE = "INSTANCE_POOL_CLUSTER_FAILURE"¶
- INSTANCE_POOL_MAX_CAPACITY_REACHED = "INSTANCE_POOL_MAX_CAPACITY_REACHED"¶
- INSTANCE_POOL_NOT_FOUND = "INSTANCE_POOL_NOT_FOUND"¶
- INSTANCE_UNREACHABLE = "INSTANCE_UNREACHABLE"¶
- INSTANCE_UNREACHABLE_DUE_TO_MISCONFIG = "INSTANCE_UNREACHABLE_DUE_TO_MISCONFIG"¶
- INTERNAL_CAPACITY_FAILURE = "INTERNAL_CAPACITY_FAILURE"¶
- INTERNAL_ERROR = "INTERNAL_ERROR"¶
- INVALID_ARGUMENT = "INVALID_ARGUMENT"¶
- INVALID_AWS_PARAMETER = "INVALID_AWS_PARAMETER"¶
- INVALID_INSTANCE_PLACEMENT_PROTOCOL = "INVALID_INSTANCE_PLACEMENT_PROTOCOL"¶
- INVALID_SPARK_IMAGE = "INVALID_SPARK_IMAGE"¶
- INVALID_WORKER_IMAGE_FAILURE = "INVALID_WORKER_IMAGE_FAILURE"¶
- IN_PENALTY_BOX = "IN_PENALTY_BOX"¶
- IP_EXHAUSTION_FAILURE = "IP_EXHAUSTION_FAILURE"¶
- JOB_FINISHED = "JOB_FINISHED"¶
- K8S_ACTIVE_POD_QUOTA_EXCEEDED = "K8S_ACTIVE_POD_QUOTA_EXCEEDED"¶
- K8S_AUTOSCALING_FAILURE = "K8S_AUTOSCALING_FAILURE"¶
- K8S_DBR_CLUSTER_LAUNCH_TIMEOUT = "K8S_DBR_CLUSTER_LAUNCH_TIMEOUT"¶
- LAZY_ALLOCATION_TIMEOUT = "LAZY_ALLOCATION_TIMEOUT"¶
- MAINTENANCE_MODE = "MAINTENANCE_MODE"¶
- METASTORE_COMPONENT_UNHEALTHY = "METASTORE_COMPONENT_UNHEALTHY"¶
- MTLS_PORT_CONNECTIVITY_FAILURE = "MTLS_PORT_CONNECTIVITY_FAILURE"¶
- NEPHOS_RESOURCE_MANAGEMENT = "NEPHOS_RESOURCE_MANAGEMENT"¶
- NETVISOR_SETUP_TIMEOUT = "NETVISOR_SETUP_TIMEOUT"¶
- NETWORK_CHECK_CONTROL_PLANE_FAILURE = "NETWORK_CHECK_CONTROL_PLANE_FAILURE"¶
- NETWORK_CHECK_CONTROL_PLANE_FAILURE_DUE_TO_MISCONFIG = "NETWORK_CHECK_CONTROL_PLANE_FAILURE_DUE_TO_MISCONFIG"¶
- NETWORK_CHECK_DNS_SERVER_FAILURE = "NETWORK_CHECK_DNS_SERVER_FAILURE"¶
- NETWORK_CHECK_DNS_SERVER_FAILURE_DUE_TO_MISCONFIG = "NETWORK_CHECK_DNS_SERVER_FAILURE_DUE_TO_MISCONFIG"¶
- NETWORK_CHECK_METADATA_ENDPOINT_FAILURE = "NETWORK_CHECK_METADATA_ENDPOINT_FAILURE"¶
- NETWORK_CHECK_METADATA_ENDPOINT_FAILURE_DUE_TO_MISCONFIG = "NETWORK_CHECK_METADATA_ENDPOINT_FAILURE_DUE_TO_MISCONFIG"¶
- NETWORK_CHECK_MULTIPLE_COMPONENTS_FAILURE = "NETWORK_CHECK_MULTIPLE_COMPONENTS_FAILURE"¶
- NETWORK_CHECK_MULTIPLE_COMPONENTS_FAILURE_DUE_TO_MISCONFIG = "NETWORK_CHECK_MULTIPLE_COMPONENTS_FAILURE_DUE_TO_MISCONFIG"¶
- NETWORK_CHECK_NIC_FAILURE = "NETWORK_CHECK_NIC_FAILURE"¶
- NETWORK_CHECK_NIC_FAILURE_DUE_TO_MISCONFIG = "NETWORK_CHECK_NIC_FAILURE_DUE_TO_MISCONFIG"¶
- NETWORK_CHECK_STORAGE_FAILURE = "NETWORK_CHECK_STORAGE_FAILURE"¶
- NETWORK_CHECK_STORAGE_FAILURE_DUE_TO_MISCONFIG = "NETWORK_CHECK_STORAGE_FAILURE_DUE_TO_MISCONFIG"¶
- NETWORK_CONFIGURATION_FAILURE = "NETWORK_CONFIGURATION_FAILURE"¶
- NFS_MOUNT_FAILURE = "NFS_MOUNT_FAILURE"¶
- NO_MATCHED_K8S = "NO_MATCHED_K8S"¶
- NO_MATCHED_K8S_TESTING_TAG = "NO_MATCHED_K8S_TESTING_TAG"¶
- NPIP_TUNNEL_SETUP_FAILURE = "NPIP_TUNNEL_SETUP_FAILURE"¶
- NPIP_TUNNEL_TOKEN_FAILURE = "NPIP_TUNNEL_TOKEN_FAILURE"¶
- POD_ASSIGNMENT_FAILURE = "POD_ASSIGNMENT_FAILURE"¶
- POD_SCHEDULING_FAILURE = "POD_SCHEDULING_FAILURE"¶
- RATE_LIMITED = "RATE_LIMITED"¶
- REQUEST_REJECTED = "REQUEST_REJECTED"¶
- REQUEST_THROTTLED = "REQUEST_THROTTLED"¶
- RESOURCE_USAGE_BLOCKED = "RESOURCE_USAGE_BLOCKED"¶
- SECRET_CREATION_FAILURE = "SECRET_CREATION_FAILURE"¶
- SECRET_PERMISSION_DENIED = "SECRET_PERMISSION_DENIED"¶
- SECRET_RESOLUTION_ERROR = "SECRET_RESOLUTION_ERROR"¶
- SECURITY_DAEMON_REGISTRATION_EXCEPTION = "SECURITY_DAEMON_REGISTRATION_EXCEPTION"¶
- SELF_BOOTSTRAP_FAILURE = "SELF_BOOTSTRAP_FAILURE"¶
- SERVERLESS_LONG_RUNNING_TERMINATED = "SERVERLESS_LONG_RUNNING_TERMINATED"¶
- SKIPPED_SLOW_NODES = "SKIPPED_SLOW_NODES"¶
- SLOW_IMAGE_DOWNLOAD = "SLOW_IMAGE_DOWNLOAD"¶
- SPARK_ERROR = "SPARK_ERROR"¶
- SPARK_IMAGE_DOWNLOAD_FAILURE = "SPARK_IMAGE_DOWNLOAD_FAILURE"¶
- SPARK_IMAGE_DOWNLOAD_THROTTLED = "SPARK_IMAGE_DOWNLOAD_THROTTLED"¶
- SPARK_IMAGE_NOT_FOUND = "SPARK_IMAGE_NOT_FOUND"¶
- SPARK_STARTUP_FAILURE = "SPARK_STARTUP_FAILURE"¶
- SPOT_INSTANCE_TERMINATION = "SPOT_INSTANCE_TERMINATION"¶
- SSH_BOOTSTRAP_FAILURE = "SSH_BOOTSTRAP_FAILURE"¶
- STORAGE_DOWNLOAD_FAILURE = "STORAGE_DOWNLOAD_FAILURE"¶
- STORAGE_DOWNLOAD_FAILURE_DUE_TO_MISCONFIG = "STORAGE_DOWNLOAD_FAILURE_DUE_TO_MISCONFIG"¶
- STORAGE_DOWNLOAD_FAILURE_SLOW = "STORAGE_DOWNLOAD_FAILURE_SLOW"¶
- STORAGE_DOWNLOAD_FAILURE_THROTTLED = "STORAGE_DOWNLOAD_FAILURE_THROTTLED"¶
- STS_CLIENT_SETUP_FAILURE = "STS_CLIENT_SETUP_FAILURE"¶
- SUBNET_EXHAUSTED_FAILURE = "SUBNET_EXHAUSTED_FAILURE"¶
- TEMPORARILY_UNAVAILABLE = "TEMPORARILY_UNAVAILABLE"¶
- TRIAL_EXPIRED = "TRIAL_EXPIRED"¶
- UNEXPECTED_LAUNCH_FAILURE = "UNEXPECTED_LAUNCH_FAILURE"¶
- UNEXPECTED_POD_RECREATION = "UNEXPECTED_POD_RECREATION"¶
- UNKNOWN = "UNKNOWN"¶
- UNSUPPORTED_INSTANCE_TYPE = "UNSUPPORTED_INSTANCE_TYPE"¶
- UPDATE_INSTANCE_PROFILE_FAILURE = "UPDATE_INSTANCE_PROFILE_FAILURE"¶
- USAGE_POLICY_ENTITLEMENT_DENIED = "USAGE_POLICY_ENTITLEMENT_DENIED"¶
- USER_INITIATED_VM_TERMINATION = "USER_INITIATED_VM_TERMINATION"¶
- USER_REQUEST = "USER_REQUEST"¶
- WORKER_SETUP_FAILURE = "WORKER_SETUP_FAILURE"¶
- WORKSPACE_CANCELLED_ERROR = "WORKSPACE_CANCELLED_ERROR"¶
- WORKSPACE_CONFIGURATION_ERROR = "WORKSPACE_CONFIGURATION_ERROR"¶
- WORKSPACE_UPDATE = "WORKSPACE_UPDATE"¶
- class databricks.sdk.service.sql.TerminationReasonType¶
Type of the termination.
- CLIENT_ERROR = "CLIENT_ERROR"¶
- CLOUD_FAILURE = "CLOUD_FAILURE"¶
- SERVICE_FAULT = "SERVICE_FAULT"¶
- SUCCESS = "SUCCESS"¶
- class databricks.sdk.service.sql.TextValue(value: 'Optional[str]' = None)¶
- value: str | None = None¶
- as_dict() dict¶
Serializes the TextValue into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the TextValue into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) TextValue¶
Deserializes the TextValue from a dictionary.
- class databricks.sdk.service.sql.TimeRange(end_time_ms: 'Optional[int]' = None, start_time_ms: 'Optional[int]' = None)¶
- end_time_ms: int | None = None¶
The end time in milliseconds.
- start_time_ms: int | None = None¶
The start time in milliseconds.
- as_dict() dict¶
Serializes the TimeRange into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the TimeRange into a shallow dictionary of its immediate attributes.
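A minimal sketch of how a `TimeRange` is typically built and serialized. It uses a local stand-in dataclass mirroring the documented shape rather than the real SDK class, and the None-filtering behavior of `as_dict()` shown here is an assumption about how unset fields are omitted from request bodies; the epoch values are illustrative.

```python
from dataclasses import dataclass
from typing import Optional

# Local stand-in mirroring the documented TimeRange shape (illustrative only;
# real code would import databricks.sdk.service.sql.TimeRange instead).
@dataclass
class TimeRange:
    end_time_ms: Optional[int] = None
    start_time_ms: Optional[int] = None

    def as_dict(self) -> dict:
        # Assumed behavior: attributes left as None are omitted from the
        # JSON request body, so 0 is still a valid, serialized timestamp.
        body = {}
        if self.start_time_ms is not None:
            body["start_time_ms"] = self.start_time_ms
        if self.end_time_ms is not None:
            body["end_time_ms"] = self.end_time_ms
        return body

# One hour starting at the Unix epoch, expressed in milliseconds.
rng = TimeRange(start_time_ms=0, end_time_ms=3_600_000)
print(rng.as_dict())
```

Note that `start_time_ms=0` survives serialization: only fields that are `None` are dropped, which matters when a range legitimately starts at the epoch.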
- class databricks.sdk.service.sql.TransferOwnershipObjectId(new_owner: 'Optional[str]' = None)¶
- new_owner: str | None = None¶
Email address for the new owner, who must exist in the workspace.
- as_dict() dict¶
Serializes the TransferOwnershipObjectId into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the TransferOwnershipObjectId into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) TransferOwnershipObjectId¶
Deserializes the TransferOwnershipObjectId from a dictionary.
- class databricks.sdk.service.sql.UpdateAlertRequestAlert(condition: 'Optional[AlertCondition]' = None, custom_body: 'Optional[str]' = None, custom_subject: 'Optional[str]' = None, display_name: 'Optional[str]' = None, notify_on_ok: 'Optional[bool]' = None, owner_user_name: 'Optional[str]' = None, query_id: 'Optional[str]' = None, seconds_to_retrigger: 'Optional[int]' = None)¶
- condition: AlertCondition | None = None¶
Trigger conditions of the alert.
- custom_body: str | None = None¶
Custom body of alert notification, if it exists. See [here] for custom templating instructions.
- custom_subject: str | None = None¶
Custom subject of alert notification, if it exists. This can include email subject entries and Slack notification headers, for example. See [here] for custom templating instructions.
- display_name: str | None = None¶
The display name of the alert.
- notify_on_ok: bool | None = None¶
Whether to notify alert subscribers when the alert returns to normal.
- owner_user_name: str | None = None¶
The owner’s username. This field is set to “Unavailable” if the user has been deleted.
- query_id: str | None = None¶
UUID of the query attached to the alert.
- seconds_to_retrigger: int | None = None¶
Number of seconds an alert must wait after being triggered to rearm itself. After rearming, it can be triggered again. If 0 or not specified, the alert will not be triggered again.
- as_dict() dict¶
Serializes the UpdateAlertRequestAlert into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the UpdateAlertRequestAlert into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) UpdateAlertRequestAlert¶
Deserializes the UpdateAlertRequestAlert from a dictionary.
- class databricks.sdk.service.sql.UpdateQueryRequestQuery(apply_auto_limit: 'Optional[bool]' = None, catalog: 'Optional[str]' = None, description: 'Optional[str]' = None, display_name: 'Optional[str]' = None, owner_user_name: 'Optional[str]' = None, parameters: 'Optional[List[QueryParameter]]' = None, query_text: 'Optional[str]' = None, run_as_mode: 'Optional[RunAsMode]' = None, schema: 'Optional[str]' = None, tags: 'Optional[List[str]]' = None, warehouse_id: 'Optional[str]' = None)¶
- apply_auto_limit: bool | None = None¶
Whether to apply a 1000-row limit to the query result.
- catalog: str | None = None¶
Name of the catalog where this query will be executed.
- description: str | None = None¶
General description that conveys additional information about this query, such as usage notes.
- display_name: str | None = None¶
Display name of the query that appears in list views, widget headings, and on the query page.
- owner_user_name: str | None = None¶
Username of the user that owns the query.
- parameters: List[QueryParameter] | None = None¶
List of query parameter definitions.
- query_text: str | None = None¶
Text of the query to be run.
- schema: str | None = None¶
Name of the schema where this query will be executed.
- tags: List[str] | None = None¶
- warehouse_id: str | None = None¶
ID of the SQL warehouse attached to the query.
- as_dict() dict¶
Serializes the UpdateQueryRequestQuery into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the UpdateQueryRequestQuery into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) UpdateQueryRequestQuery¶
Deserializes the UpdateQueryRequestQuery from a dictionary.
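The `as_dict`/`from_dict` pair on these request dataclasses supports a round trip between Python objects and JSON bodies. The sketch below uses a local stand-in covering only a few of the documented fields, with assumed (not SDK-verified) semantics: `as_dict()` drops unset fields so a PATCH body carries only the intended changes, and `from_dict()` ignores unknown keys; `"abc"` is a hypothetical warehouse ID.

```python
from dataclasses import dataclass, asdict, fields
from typing import Any, Dict, List, Optional

# Local stand-in for a few UpdateQueryRequestQuery fields (illustrative; the
# real class lives in databricks.sdk.service.sql).
@dataclass
class UpdateQueryRequestQuery:
    display_name: Optional[str] = None
    query_text: Optional[str] = None
    tags: Optional[List[str]] = None
    warehouse_id: Optional[str] = None

    def as_dict(self) -> dict:
        # Serialize only the fields that were actually set.
        return {k: v for k, v in asdict(self).items() if v is not None}

    @classmethod
    def from_dict(cls, d: Dict[str, Any]) -> "UpdateQueryRequestQuery":
        # Keep only known field names; missing keys fall back to None.
        names = {f.name for f in fields(cls)}
        return cls(**{k: v for k, v in d.items() if k in names})

# Rename a query and point it at another warehouse; nothing else is touched.
patch = UpdateQueryRequestQuery(display_name="Daily revenue", warehouse_id="abc")
body = patch.as_dict()
assert UpdateQueryRequestQuery.from_dict(body) == patch  # round trip
print(body)
```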
- class databricks.sdk.service.sql.UpdateResponse¶
- as_dict() dict¶
Serializes the UpdateResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the UpdateResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) UpdateResponse¶
Deserializes the UpdateResponse from a dictionary.
- class databricks.sdk.service.sql.UpdateVisualizationRequestVisualization(display_name: 'Optional[str]' = None, serialized_options: 'Optional[str]' = None, serialized_query_plan: 'Optional[str]' = None, type: 'Optional[str]' = None)¶
- display_name: str | None = None¶
The display name of the visualization.
- serialized_options: str | None = None¶
The visualization options vary widely from one visualization type to the next and are unsupported. Databricks does not recommend modifying visualization options directly.
- serialized_query_plan: str | None = None¶
The visualization query plan varies widely from one visualization type to the next and is unsupported. Databricks does not recommend modifying the visualization query plan directly.
- type: str | None = None¶
The type of visualization: counter, table, funnel, and so on.
- as_dict() dict¶
Serializes the UpdateVisualizationRequestVisualization into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the UpdateVisualizationRequestVisualization into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) UpdateVisualizationRequestVisualization¶
Deserializes the UpdateVisualizationRequestVisualization from a dictionary.
- class databricks.sdk.service.sql.User(email: 'Optional[str]' = None, id: 'Optional[int]' = None, name: 'Optional[str]' = None)¶
- email: str | None = None¶
- id: int | None = None¶
- name: str | None = None¶
- as_dict() dict¶
Serializes the User into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the User into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) User¶
Deserializes the User from a dictionary.
- class databricks.sdk.service.sql.Visualization(create_time: 'Optional[str]' = None, display_name: 'Optional[str]' = None, id: 'Optional[str]' = None, query_id: 'Optional[str]' = None, serialized_options: 'Optional[str]' = None, serialized_query_plan: 'Optional[str]' = None, type: 'Optional[str]' = None, update_time: 'Optional[str]' = None)¶
- create_time: str | None = None¶
The timestamp indicating when the visualization was created.
- display_name: str | None = None¶
The display name of the visualization.
- id: str | None = None¶
UUID identifying the visualization.
- query_id: str | None = None¶
UUID of the query that the visualization is attached to.
- serialized_options: str | None = None¶
The visualization options vary widely from one visualization type to the next and are unsupported. Databricks does not recommend modifying visualization options directly.
- serialized_query_plan: str | None = None¶
The visualization query plan varies widely from one visualization type to the next and is unsupported. Databricks does not recommend modifying the visualization query plan directly.
- type: str | None = None¶
The type of visualization: counter, table, funnel, and so on.
- update_time: str | None = None¶
The timestamp indicating when the visualization was updated.
- as_dict() dict¶
Serializes the Visualization into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the Visualization into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) Visualization¶
Deserializes the Visualization from a dictionary.
- class databricks.sdk.service.sql.WarehouseAccessControlRequest(group_name: 'Optional[str]' = None, permission_level: 'Optional[WarehousePermissionLevel]' = None, service_principal_name: 'Optional[str]' = None, user_name: 'Optional[str]' = None)¶
- group_name: str | None = None¶
Name of the group.
- permission_level: WarehousePermissionLevel | None = None¶
- service_principal_name: str | None = None¶
Application ID of a service principal.
- user_name: str | None = None¶
Name of the user.
- as_dict() dict¶
Serializes the WarehouseAccessControlRequest into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the WarehouseAccessControlRequest into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) WarehouseAccessControlRequest¶
Deserializes the WarehouseAccessControlRequest from a dictionary.
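A sketch of building warehouse access-control entries. It uses a local stand-in rather than the SDK class, with the permission level as a plain string standing in for the `WarehousePermissionLevel` enum; the convention that each entry names exactly one principal (group, user, or service principal) is how such ACL lists are typically shaped, and the group name here is hypothetical.

```python
from dataclasses import dataclass, asdict
from typing import Optional

# Local stand-in for WarehouseAccessControlRequest (illustrative; the real
# class and the WarehousePermissionLevel enum live in
# databricks.sdk.service.sql).
@dataclass
class WarehouseAccessControlRequest:
    group_name: Optional[str] = None
    permission_level: Optional[str] = None  # e.g. "CAN_USE", "CAN_MANAGE"
    service_principal_name: Optional[str] = None
    user_name: Optional[str] = None

    def as_dict(self) -> dict:
        # Unset principal fields are dropped, so the serialized entry
        # identifies a single principal plus its permission level.
        return {k: v for k, v in asdict(self).items() if v is not None}

# Grant a (hypothetical) group CAN_USE and a user CAN_MANAGE.
acl = [
    WarehouseAccessControlRequest(group_name="data-analysts",
                                  permission_level="CAN_USE"),
    WarehouseAccessControlRequest(user_name="jane@example.com",
                                  permission_level="CAN_MANAGE"),
]
print([entry.as_dict() for entry in acl])
```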
- class databricks.sdk.service.sql.WarehouseAccessControlResponse(all_permissions: 'Optional[List[WarehousePermission]]' = None, display_name: 'Optional[str]' = None, group_name: 'Optional[str]' = None, service_principal_name: 'Optional[str]' = None, user_name: 'Optional[str]' = None)¶
- all_permissions: List[WarehousePermission] | None = None¶
All permissions.
- display_name: str | None = None¶
Display name of the user or service principal.
- group_name: str | None = None¶
Name of the group.
- service_principal_name: str | None = None¶
Name of the service principal.
- user_name: str | None = None¶
Name of the user.
- as_dict() dict¶
Serializes the WarehouseAccessControlResponse into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the WarehouseAccessControlResponse into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) WarehouseAccessControlResponse¶
Deserializes the WarehouseAccessControlResponse from a dictionary.
- class databricks.sdk.service.sql.WarehousePermission(inherited: 'Optional[bool]' = None, inherited_from_object: 'Optional[List[str]]' = None, permission_level: 'Optional[WarehousePermissionLevel]' = None)¶
- inherited: bool | None = None¶
- inherited_from_object: List[str] | None = None¶
- permission_level: WarehousePermissionLevel | None = None¶
- as_dict() dict¶
Serializes the WarehousePermission into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the WarehousePermission into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) WarehousePermission¶
Deserializes the WarehousePermission from a dictionary.
- class databricks.sdk.service.sql.WarehousePermissionLevel¶
Permission level
- CAN_MANAGE = "CAN_MANAGE"¶
- CAN_MONITOR = "CAN_MONITOR"¶
- CAN_USE = "CAN_USE"¶
- CAN_VIEW = "CAN_VIEW"¶
- IS_OWNER = "IS_OWNER"¶
- class databricks.sdk.service.sql.WarehousePermissions(access_control_list: 'Optional[List[WarehouseAccessControlResponse]]' = None, object_id: 'Optional[str]' = None, object_type: 'Optional[str]' = None)¶
- access_control_list: List[WarehouseAccessControlResponse] | None = None¶
- object_id: str | None = None¶
- object_type: str | None = None¶
- as_dict() dict¶
Serializes the WarehousePermissions into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the WarehousePermissions into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) WarehousePermissions¶
Deserializes the WarehousePermissions from a dictionary.
- class databricks.sdk.service.sql.WarehousePermissionsDescription(description: 'Optional[str]' = None, permission_level: 'Optional[WarehousePermissionLevel]' = None)¶
- description: str | None = None¶
- permission_level: WarehousePermissionLevel | None = None¶
- as_dict() dict¶
Serializes the WarehousePermissionsDescription into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the WarehousePermissionsDescription into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) WarehousePermissionsDescription¶
Deserializes the WarehousePermissionsDescription from a dictionary.
- class databricks.sdk.service.sql.WarehouseTypePair(enabled: bool | None = None, warehouse_type: WarehouseTypePairWarehouseType | None = None)¶
Configuration values to enable or disable access to specific warehouse types in the workspace.
- enabled: bool | None = None¶
If set to false, the specific warehouse type will not be allowed as a value for warehouse_type in CreateWarehouse and EditWarehouse.
- warehouse_type: WarehouseTypePairWarehouseType | None = None¶
- as_dict() dict¶
Serializes the WarehouseTypePair into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the WarehouseTypePair into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) WarehouseTypePair¶
Deserializes the WarehouseTypePair from a dictionary.
- class databricks.sdk.service.sql.WarehouseTypePairWarehouseType¶
- CLASSIC = "CLASSIC"¶
- PRO = "PRO"¶
- TYPE_UNSPECIFIED = "TYPE_UNSPECIFIED"¶
- class databricks.sdk.service.sql.Widget(id: 'Optional[str]' = None, options: 'Optional[WidgetOptions]' = None, visualization: 'Optional[LegacyVisualization]' = None, width: 'Optional[int]' = None)¶
- id: str | None = None¶
The unique ID for this widget.
- options: WidgetOptions | None = None¶
- visualization: LegacyVisualization | None = None¶
The visualization description API changes frequently and is unsupported. You can duplicate a visualization by copying description objects received _from the API_ and then using them to create a new one with a POST request to the same endpoint. Databricks does not recommend constructing ad-hoc visualizations entirely in JSON.
- width: int | None = None¶
Unused field.
- as_dict() dict¶
Serializes the Widget into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the Widget into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) Widget¶
Deserializes the Widget from a dictionary.
- class databricks.sdk.service.sql.WidgetOptions(created_at: 'Optional[str]' = None, description: 'Optional[str]' = None, is_hidden: 'Optional[bool]' = None, parameter_mappings: 'Optional[Any]' = None, position: 'Optional[WidgetPosition]' = None, title: 'Optional[str]' = None, updated_at: 'Optional[str]' = None)¶
- created_at: str | None = None¶
Timestamp when this object was created
- description: str | None = None¶
Custom description of the widget.
- is_hidden: bool | None = None¶
Whether this widget is hidden on the dashboard.
- parameter_mappings: Any | None = None¶
How parameters used by the visualization in this widget relate to other widgets on the dashboard. Databricks does not recommend modifying this definition in JSON.
- position: WidgetPosition | None = None¶
Coordinates of this widget on a dashboard. This portion of the API changes frequently and is unsupported.
- title: str | None = None¶
Custom title of the widget
- updated_at: str | None = None¶
Timestamp of the last time this object was updated.
- as_dict() dict¶
Serializes the WidgetOptions into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the WidgetOptions into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) WidgetOptions¶
Deserializes the WidgetOptions from a dictionary.
- class databricks.sdk.service.sql.WidgetPosition(auto_height: bool | None = None, col: int | None = None, row: int | None = None, size_x: int | None = None, size_y: int | None = None)¶
Coordinates of this widget on a dashboard. This portion of the API changes frequently and is unsupported.
- auto_height: bool | None = None¶
Reserved for internal use.
- col: int | None = None¶
Column in the dashboard grid; values start with 0.
- row: int | None = None¶
Row in the dashboard grid; values start with 0.
- size_x: int | None = None¶
Width of the widget, measured in dashboard grid cells.
- size_y: int | None = None¶
Height of the widget, measured in dashboard grid cells.
- as_dict() dict¶
Serializes the WidgetPosition into a dictionary suitable for use as a JSON request body.
- as_shallow_dict() dict¶
Serializes the WidgetPosition into a shallow dictionary of its immediate attributes.
- classmethod from_dict(d: Dict[str, Any]) WidgetPosition¶
Deserializes the WidgetPosition from a dictionary.
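A sketch of the grid-coordinate scheme `WidgetPosition` describes: 0-based `col`/`row` origin with sizes in grid cells. It uses a local stand-in dataclass rather than the SDK class, and the None-filtering `as_dict()` is an assumption about how unset fields are serialized.

```python
from dataclasses import dataclass, asdict
from typing import Optional

# Local stand-in for WidgetPosition (illustrative; the real class lives in
# databricks.sdk.service.sql, and this part of the API is unsupported).
@dataclass
class WidgetPosition:
    auto_height: Optional[bool] = None
    col: Optional[int] = None     # 0-based column in the dashboard grid
    row: Optional[int] = None     # 0-based row in the dashboard grid
    size_x: Optional[int] = None  # width in grid cells
    size_y: Optional[int] = None  # height in grid cells

    def as_dict(self) -> dict:
        # Assumed behavior: None fields are omitted, so col=0 / row=0
        # (the top-left cell) still serialize.
        return {k: v for k, v in asdict(self).items() if v is not None}

# A widget three columns wide and two rows tall, anchored at the
# top-left cell of the dashboard grid.
pos = WidgetPosition(col=0, row=0, size_x=3, size_y=2)
print(pos.as_dict())
```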