Check UCX and LSQL for backwards compatibility #78

Merged
nfx merged 1 commit into main from nfx-patch-1 on Mar 30, 2024
Conversation

@nfx (Collaborator) commented Mar 30, 2024

Automate downstream unit testing. See databrickslabs/sandbox#141 for more details.

@codecov codecov bot commented Mar 30, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 84.58%. Comparing base (073c922) to head (b90cbdb).

Additional details and impacted files
@@           Coverage Diff           @@
##             main      #78   +/-   ##
=======================================
  Coverage   84.58%   84.58%           
=======================================
  Files           7        7           
  Lines         532      532           
  Branches      105      105           
=======================================
  Hits          450      450           
  Misses         50       50           
  Partials       32       32           


@nfx nfx merged commit a079b6e into main Mar 30, 2024
@nfx nfx deleted the nfx-patch-1 branch March 30, 2024 14:35
@github-actions

❌ 17/18 passed, 1 failed, 2 skipped, 24m23s total

❌ test_overwrite: databricks.sdk.errors.platform.BadRequest: [INSUFFICIENT_PERMISSIONS] Insufficient privileges: (838ms)
databricks.sdk.errors.platform.BadRequest: [INSUFFICIENT_PERMISSIONS] Insufficient privileges:
User does not have permission CREATE,USAGE on database `default`.
14:34 DEBUG [databricks.sdk] Loaded from environment
14:34 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
14:34 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
14:34 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
14:34 INFO [databricks.sdk] Using Databricks Metadata Service authentication
[gw8] linux -- Python 3.10.14 /home/runner/work/lsql/lsql/.venv/bin/python
14:34 DEBUG [databricks.sdk] Loaded from environment
14:34 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
14:34 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
14:34 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
14:34 INFO [databricks.sdk] Using Databricks Metadata Service authentication
14:34 DEBUG [databricks.labs.lsql.backends] [api][execute] CREATE TABLE IF NOT EXISTS default.foo (first STRING NOT NULL, second BOOLEAN NOT NULL) USING DE... (3 more bytes)
14:34 DEBUG [databricks.labs.lsql.core] Executing SQL statement: CREATE TABLE IF NOT EXISTS default.foo (first STRING NOT NULL, second BOOLEAN NOT NULL) USING DELTA
14:34 DEBUG [databricks.sdk] POST /api/2.0/sql/statements/
> {
>   "format": "JSON_ARRAY",
>   "statement": "CREATE TABLE IF NOT EXISTS default.foo (first STRING NOT NULL, second BOOLEAN NOT NULL) USING DE... (3 more bytes)",
>   "warehouse_id": "TEST_DEFAULT_WAREHOUSE_ID"
> }
< 200 OK
< {
<   "statement_id": "01eeeea2-9feb-1f81-bd52-faeccbc60638",
<   "status": {
<     "error": {
<       "error_code": "BAD_REQUEST",
<       "message": "[INSUFFICIENT_PERMISSIONS] Insufficient privileges:\nUser does not have permission CREATE,USAGE o... (21 more bytes)"
<     },
<     "state": "FAILED"
<   }
< }

Running from acceptance #49
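The `test_overwrite` failure above is a schema-permissions issue, not a bug in the change: the test principal lacks `CREATE` and `USAGE` on the `default` database. A typical remediation in Databricks SQL would look like the following sketch (the principal name is a placeholder, not taken from the logs):

```sql
-- Grant the test principal the privileges the failing test needs
-- on the `default` database. `test-service-principal` is a
-- hypothetical name; substitute the actual CI identity.
GRANT CREATE, USAGE ON DATABASE default TO `test-service-principal`;
```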

nfx added a commit that referenced this pull request Apr 2, 2024
* Check UCX and LSQL for backwards compatibility ([#78](#78)). In this release, we introduce a new GitHub Actions workflow, `downstreams.yml`, which automates unit testing for downstream projects upon changes made to the upstream project. The workflow runs on pull requests, merge groups, and pushes to the main branch and sets permissions for id-token, contents, and pull-requests. It includes a compatibility job that runs on Ubuntu, checks out the code, sets up Python, installs the toolchain, and accepts downstream projects using the databrickslabs/sandbox/downstreams action. The job matrix includes two downstream projects, `ucx` and `remorph`, and uses the build cache to speed up the pip install step. This feature ensures that changes to the upstream project do not break compatibility with downstream projects, maintaining a stable and reliable library for software engineers.
* Fixed `Builder` object has no attribute `sdk_config` error ([#86](#86)). In this release, we've resolved a `Builder` object has no attribute `sdk_config` error that occurred when initializing a Spark session using the `DatabricksSession.builder` method. The issue was caused by using dot notation to access the `sdk_config` attribute, which is incorrect. This has been updated to the correct syntax of `sdkConfig`. This change enables successful creation of the Spark session, preventing the error from recurring. The `DatabricksSession` class and its methods, such as `getOrCreate`, continue to be used for interacting with Databricks clusters and workspaces, while the `WorkspaceClient` class manages Databricks resources within a workspace.

Dependency updates:

 * Bump codecov/codecov-action from 1 to 4 ([#84](#84)).
 * Bump actions/setup-python from 4 to 5 ([#83](#83)).
 * Bump actions/checkout from 2.5.0 to 4.1.2 ([#81](#81)).
 * Bump softprops/action-gh-release from 1 to 2 ([#80](#80)).
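The changelog entry above describes the new `downstreams.yml` workflow in prose; a minimal sketch of what such a workflow could look like follows. This is an illustration assembled from the description, not the actual file — the action inputs, versions, and toolchain install step are assumptions:

```yaml
# Hypothetical downstreams.yml, reconstructed from the changelog
# description above; inputs to the downstreams action are assumed.
name: downstreams

on:
  pull_request:
  merge_group:
  push:
    branches: [main]

permissions:
  id-token: write
  contents: read
  pull-requests: read

jobs:
  compatibility:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # the two downstream projects named in the changelog
        downstream: [ucx, remorph]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.10'
          cache: 'pip'  # build cache to speed up the pip install step
      - name: Acceptance
        uses: databrickslabs/sandbox/downstreams@main
        with:
          repo: ${{ matrix.downstream }}
```

The matrix fans the same compatibility check out over each downstream repository, so an upstream change that breaks `ucx` or `remorph` fails the pull request before merge.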
@nfx nfx mentioned this pull request Apr 2, 2024
nfx added a commit that referenced this pull request Apr 2, 2024