Comparing changes
base repository: databrickslabs/lsql
base: v0.3.0
head repository: databrickslabs/lsql
compare: v0.4.1
- 13 commits
- 11 files changed
- 5 contributors
Commits on Mar 30, 2024
- Check UCX and LSQL for backwards compatibility (#78). Automate downstream unit testing. See databrickslabs/sandbox#141 for more details. (commit a079b6e)
- Commit d7698cf
Commits on Mar 31, 2024
- Commit 5a2bafe
- Commit f318e71
- Commit b1c3016
- Commit f64ba13
- Commit 830cc65
Commits on Apr 1, 2024
- Fix `'Builder' object has no attribute 'sdk_config'` error (#86). (commit 12f6d34)
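The renaming behind this fix is easy to illustrate. The sketch below is not the databricks-connect source; `FakeSessionBuilder` is a hypothetical stand-in that, like the real builder, exposes a camelCase `sdkConfig` method, so attribute access via the snake_case name `sdk_config` raises `AttributeError`.

```python
class FakeSessionBuilder:
    """Illustrative stand-in for a session builder with a camelCase config method."""

    def __init__(self):
        self._config = None

    def sdkConfig(self, config):
        # Store the SDK config and return self so calls can be chained.
        self._config = config
        return self

    def getOrCreate(self):
        # Return the stored config in place of a real Spark session.
        return {"config": self._config}


builder = FakeSessionBuilder()

# Broken: `sdk_config` is not an attribute of the builder.
try:
    builder.sdk_config
except AttributeError as err:
    print(type(err).__name__)  # AttributeError

# Fixed: call the camelCase method instead.
session = builder.sdkConfig({"host": "https://example"}).getOrCreate()
print(session["config"]["host"])
```

The same pattern applies to any builder API: a misspelled method name fails only at attribute lookup time, which is why the error surfaced as `AttributeError` rather than at import.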
Commits on Apr 2, 2024
- Check UCX and LSQL for backwards compatibility ([#78](#78)). In this release, we introduce a new GitHub Actions workflow, downstreams.yml, which automates unit testing for downstream projects upon changes made to the upstream project. The workflow runs on pull requests, merge groups, and pushes to the main branch and sets permissions for id-token, contents, and pull-requests. It includes a compatibility job that runs on Ubuntu, checks out the code, sets up Python, installs the toolchain, and accepts downstream projects using the databrickslabs/sandbox/downstreams action. The job matrix includes two downstream projects, ucx and remorph, and uses the build cache to speed up the pip install step. This feature ensures that changes to the upstream project do not break compatibility with downstream projects, maintaining a stable and reliable library for software engineers.
- Fixed `Builder` object has no attribute `sdk_config` error ([#86](#86)). In this release, we've resolved a `Builder` object has no attribute `sdk_config` error that occurred when initializing a Spark session using the `DatabricksSession.builder` method. The issue was caused by using dot notation to access the `sdk_config` attribute, which is incorrect. This has been updated to the correct syntax of `sdkConfig`. This change enables successful creation of the Spark session, preventing the error from recurring. The `DatabricksSession` class and its methods, such as `getOrCreate`, continue to be used for interacting with Databricks clusters and workspaces, while the `WorkspaceClient` class manages Databricks resources within a workspace.

Dependency updates:

- Bump codecov/codecov-action from 1 to 4 ([#84](#84))
- Bump actions/setup-python from 4 to 5 ([#83](#83))
- Bump actions/checkout from 2.5.0 to 4.1.2 ([#81](#81))
- Bump softprops/action-gh-release from 1 to 2 ([#80](#80))

(commit 155bea0)
Commits on Apr 11, 2024
- Added catalog and schema parameters to execute and fetch (#90). Added catalog and schema parameters to the fetch and execute methods. Created unit and integration tests. (commit 91978c0)
- Added catalog and schema parameters to execute and fetch ([#90](#90)). In this release, we have added optional `catalog` and `schema` parameters to the `execute` and `fetch` methods in the `SqlBackend` abstract base class, allowing for more flexibility when executing SQL statements in specific catalogs and schemas. These updates include new method signatures and their respective implementations in the `SparkSqlBackend` and `DatabricksSqlBackend` classes. The new parameters control the catalog and schema used by the `SparkSession` instance in the `SparkSqlBackend` class and the `SqlClient` instance in the `DatabricksSqlBackend` class. This enhancement enables better functionality in multi-catalog and multi-schema environments. Additionally, this change comes with unit tests and integration tests to ensure proper functionality. The new parameters can be used when calling the `execute` and `fetch` methods. For example, with a `SparkSqlBackend` instance `spark_backend`, you can execute a SQL statement in a specific catalog and schema with the following code: `spark_backend.execute("SELECT * FROM my_table", catalog="my_catalog", schema="my_schema")`. Similarly, the `fetch` method can also be used with the new parameters. (commit 8f3d164)
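The shape of that interface can be sketched in a few lines. This is not lsql's actual implementation — how the real backends apply the catalog and schema may differ — and `RecordingBackend` is a hypothetical fake that just records the statements it would run, so the effect of the new keyword arguments is visible.

```python
from abc import ABC, abstractmethod
from typing import Optional


class SqlBackend(ABC):
    """Abstract interface with optional catalog/schema parameters, as in #90."""

    @abstractmethod
    def execute(self, sql: str, *,
                catalog: Optional[str] = None,
                schema: Optional[str] = None) -> None: ...


class RecordingBackend(SqlBackend):
    """Fake backend that records USE statements before each query."""

    def __init__(self):
        self.statements: list = []

    def execute(self, sql, *, catalog=None, schema=None):
        # One plausible strategy: switch context before running the query.
        if catalog:
            self.statements.append(f"USE CATALOG {catalog}")
        if schema:
            self.statements.append(f"USE SCHEMA {schema}")
        self.statements.append(sql)


backend = RecordingBackend()
backend.execute("SELECT * FROM my_table", catalog="my_catalog", schema="my_schema")
print(backend.statements)
```

Keeping the parameters keyword-only, as sketched here, preserves backward compatibility for existing positional callers of `execute` and `fetch`.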
Commits on Apr 12, 2024
- Commit 9a3517b
- Fixed overwrite integration tests ([#92](#92)). A new enhancement has been implemented for the `overwrite` feature's integration tests, addressing a concern with write operations. Two new variables, `catalog` and `schema`, have been incorporated using the `env_or_skip` function. These variables are utilized in the `save_table` method, which is now invoked twice with the same table, once with the `append` and once with the `overwrite` option. The data in the table is retrieved and checked for accuracy after each call, employing the updated `Row` class with revised field names `first` and `second`, formerly `name` and `id`. This modification ensures the proper operation of the `overwrite` feature during integration tests. (commit 5782b23)
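The append-then-overwrite sequence that this test exercises can be sketched with an in-memory fake. This is not lsql's `save_table` implementation; `InMemoryBackend` and its dict-of-lists storage are illustrative assumptions, and `Row` here is a plain dataclass standing in for lsql's row type with the renamed fields.

```python
from dataclasses import dataclass


@dataclass
class Row:
    first: str   # formerly `name`
    second: int  # formerly `id`


class InMemoryBackend:
    """Fake backend storing each table as a list of rows."""

    def __init__(self):
        self.tables: dict = {}

    def save_table(self, name: str, rows: list, mode: str = "append"):
        if mode == "overwrite":
            # Replace any existing contents of the table.
            self.tables[name] = list(rows)
        else:
            # Append to the table, creating it if needed.
            self.tables.setdefault(name, []).extend(rows)


backend = InMemoryBackend()
backend.save_table("t", [Row("a", 1)], mode="append")
backend.save_table("t", [Row("b", 2)], mode="overwrite")
print(backend.tables["t"])  # only the overwritten rows remain
```

Checking the table contents after each of the two calls, as the integration test does, distinguishes a true overwrite from an accidental second append.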
You can try running this command locally to see the comparison on your machine:
git diff v0.3.0...v0.4.1