Comparing changes
Choose two branches to see what’s changed or to start a new pull request.
base repository: databrickslabs/blueprint, base: v0.2.4
head repository: databrickslabs/blueprint, compare: v0.3.0
- 11 commits
- 39 files changed
- 2 contributors
Commits on Feb 13, 2024
- e6a08fb
- c7e47ab
Commits on Feb 20, 2024
- Use `databrickslabs/sandbox/acceptance` action (#45): onboards this repo for a new experimental GitHub Action (22fc1a8)
Commits on Feb 21, 2024
- 9cbc6f8
Commits on Feb 23, 2024
- Made `test_existing_installations_are_detected` more resilient (#51): retry for 15 seconds if both installations are not detected (aa57141)
Commits on Feb 24, 2024
- 01d9467
Commits on Feb 28, 2024
- b4dd5ab
Commits on Mar 1, 2024
- 24e62ef
Commits on Mar 2, 2024
- Added automated upgrade framework (#50) (c8a74f4)
This release introduces several improvements and additions to the `databricks.labs.blueprint` module. It includes an automated upgrade framework in a new `upgrades.py` file, providing mechanisms for managing and applying upgrades to the product. The `ProductInfo` class now includes a `checkout_root` method, an `as_semver` property, a `wheels` property, and a `_infer_version_file` class method for better handling of versioning and wheel building. A new exception, `SingleSourceVersionError`, indicates that the version file is not found in the given project. Test code organization has been improved with new directories and files for fixtures and unit tests, along with new test cases and functions for the upgrades functionality. The `test_wheels.py` file has also been updated with new functions that check the version of the Databricks SDK and handle cases where the version marker is missing or does not contain the `__version__` variable.
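As a hedged sketch of the single-source-version idea described above: the file name `__about__.py`, the function name `infer_version`, and the scanning logic are assumptions for illustration, not blueprint's actual `_infer_version_file` implementation.

```python
import re
from pathlib import Path

# Regex for a `__version__ = "..."` assignment in a candidate file.
_VERSION_RE = re.compile(r'__version__\s*=\s*["\']([^"\']+)["\']')


class SingleSourceVersionError(Exception):
    """Raised when no scanned file declares __version__."""


def infer_version(project_root: Path) -> str:
    # Scan the project tree for an __about__.py declaring __version__
    # (hypothetical file name, chosen here for the sketch).
    for candidate in project_root.rglob("__about__.py"):
        match = _VERSION_RE.search(candidate.read_text())
        if match:
            return match.group(1)
    raise SingleSourceVersionError(f"no __version__ found under {project_root}")
```

The point is the failure mode: rather than silently defaulting, the lookup raises a dedicated exception when the version cannot be sourced from exactly one place.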
- Added brute-forcing `SerdeError` with `as_dict()` and `from_dict()` (#58) (a029f6b)

  In the rare circumstances when you cannot use [@dataclass](#loading-dataclass-configuration), or you get a `SerdeError` that you cannot explain, you can implement `from_dict(cls, raw: dict) -> 'T'` and `as_dict(self) -> dict` methods on the class:

  ```python
  from databricks.sdk import WorkspaceClient

  from databricks.labs.blueprint.installation import Installation


  class SomePolicy:
      def __init__(self, a, b):
          self._a = a
          self._b = b

      def as_dict(self) -> dict:
          return {"a": self._a, "b": self._b}

      @classmethod
      def from_dict(cls, raw: dict):
          return cls(raw.get("a"), raw.get("b"))

      def __eq__(self, o):
          assert isinstance(o, SomePolicy)
          return self._a == o._a and self._b == o._b


  policy = SomePolicy(1, 2)
  installation = Installation.current(WorkspaceClient(), "blueprint")
  installation.save(policy, filename="backups/policy-123.json")
  load = installation.load(SomePolicy, filename="backups/policy-123.json")
  assert load == policy
  ```
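The snippet above needs a live workspace to run end to end. The `as_dict()`/`from_dict()` contract itself can be exercised locally, with plain `json` standing in for `Installation.save`/`load` (an illustration, not part of the library):

```python
import json


class SomePolicy:
    """Same class as in the example above, minus the workspace round-trip."""

    def __init__(self, a, b):
        self._a = a
        self._b = b

    def as_dict(self) -> dict:
        return {"a": self._a, "b": self._b}

    @classmethod
    def from_dict(cls, raw: dict):
        return cls(raw.get("a"), raw.get("b"))

    def __eq__(self, o):
        return isinstance(o, SomePolicy) and self._a == o._a and self._b == o._b


# In miniature, this is what the installation hooks do: as_dict() produces
# the JSON payload, from_dict() rebuilds an equal object from it.
payload = json.dumps(SomePolicy(1, 2).as_dict())
restored = SomePolicy.from_dict(json.loads(payload))
assert restored == SomePolicy(1, 2)
```

Because both hooks are plain methods, you can unit-test the round-trip without any Databricks credentials before wiring the class into `Installation`.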
- Release notes (905e5ff):

  * Added automated upgrade framework ([#50](#50)). This update introduces an automated upgrade framework for managing and applying upgrades to the product, with a new `upgrades.py` file that includes a `ProductInfo` class having methods for version handling, wheel building, and exception handling. Test code organization has been improved, with new test cases, functions, and a directory structure for fixtures and unit tests for the upgrades functionality. The `test_wheels.py` file now checks the version of the Databricks SDK and handles cases where the version marker is missing or does not contain the `__version__` variable. A new `Application State Migrations` section in the README explains seamless upgrades from version X to version Z through version Y, addressing the need for configuration or database state migrations as the application evolves, with an idiomatic usage pattern involving several classes and functions. The `_trim_leading_whitespace` function in `commands.py` has also been improved to ensure accurate and consistent removal of leading whitespace from each line in the command string.
  * Added brute-forcing `SerdeError` with `as_dict()` and `from_dict()` ([#58](#58)). This commit introduces a brute-forcing approach for handling `SerdeError` using `as_dict()` and `from_dict()` methods. The new `SomePolicy` class demonstrates manual serialization and deserialization of custom classes: `as_dict()` returns a dictionary representation of the instance, and `from_dict()`, decorated with `@classmethod`, creates a new instance from the provided dictionary. The GitHub Actions workflow for acceptance tests now also includes the `ready_for_review` event type, so tests run not only for opened and synchronized pull requests but also when a draft is marked "ready for review." These hooks give developers more control over deserialization and help with debugging when default deserialization fails, but should be used judiciously to avoid brittle code.
  * Fixed nightly integration tests run as service principals ([#52](#52)). The `Installation` class in `databricks.labs.blueprint.installation` has been refactored: the `current` method is deprecated in favor of two new methods, `assume_global` and `assume_user_home`, which install and manage `blueprint` as either a global or a user-specific installation. The `existing` method has been updated to work with the new methods, and `test_installation.py` now correctly detects global and user-specific installations when running as a service principal.
  * Made `test_existing_installations_are_detected` more resilient ([#51](#51)). The test now retries for up to 15 seconds if existing installations are not detected, using the `retried` helper imported from `databricks.sdk.retries` to retry on `AssertionError`. The former `test_existing` has been renamed to `test_existing_installations_are_detected` and its `xfail` marker removed; `test_dataclass` has been renamed to `test_loading_dataclass_from_installation` for clarity.
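The retry behavior described in #51 can be illustrated with a minimal stand-in for `databricks.sdk.retries.retried`. The real decorator's signature may differ; this sketch only shows the retry-on-`AssertionError`-until-timeout pattern:

```python
import functools
import time


def retried(on: type, timeout_seconds: float = 15.0, interval: float = 0.1):
    """Re-run the function while it raises `on`, until the timeout expires."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            deadline = time.monotonic() + timeout_seconds
            while True:
                try:
                    return fn(*args, **kwargs)
                except on:
                    if time.monotonic() >= deadline:
                        raise  # give up: re-raise the last failure
                    time.sleep(interval)
        return wrapper
    return decorator


# Simulated flaky check: "installations" only become visible on the
# third attempt, as might happen right after a concurrent install.
attempts = {"count": 0}


@retried(on=AssertionError, timeout_seconds=5.0, interval=0.01)
def flaky_check():
    attempts["count"] += 1
    assert attempts["count"] >= 3, "installations not detected yet"
    return attempts["count"]
```

Retrying on `AssertionError` specifically lets the test keep its plain `assert` statements while tolerating eventual-consistency delays.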
You can run this command locally to see the full comparison on your machine: `git diff v0.2.4...v0.3.0`