
Comparing changes

base repository: databrickslabs/blueprint
base: v0.2.4
head repository: databrickslabs/blueprint
compare: v0.3.0
  • 11 commits
  • 39 files changed
  • 2 contributors

Commits on Feb 13, 2024

  1. e6a08fb
  2. Release v0.2.5 (#44)

    * Automatically enable workspace filesystem if the feature is disabled
    ([#42](#42)).
    nfx authored Feb 13, 2024 (c7e47ab)

Commits on Feb 20, 2024

  1. Use databrickslabs/sandbox/acceptance action (#45)

    This PR onboards the repository to a new experimental GitHub Action.
    nfx authored Feb 20, 2024 (22fc1a8)

Commits on Feb 21, 2024

  1. 9cbc6f8

Commits on Feb 23, 2024

  1. Made test_existing_installations_are_detected more resilient (#51)

    Retries for up to 15 seconds if both installations are not yet detected.
    nfx authored Feb 23, 2024 (aa57141)

Commits on Feb 24, 2024

  1. 01d9467

Commits on Mar 1, 2024

  1. 24e62ef

Commits on Mar 2, 2024

  1. Added automated upgrade framework (#50)

    This release introduces several improvements and additions to the
    databricks.labs.blueprint module. It includes an automated upgrade
    framework with a new `upgrades.py` file, providing mechanisms for
    managing and applying upgrades to the product. The `ProductInfo` class
    now includes a `checkout_root` method, an `as_semver` property, a
    `wheels` property, and a `_infer_version_file` class method for better
    handling of versioning and wheel building processes. A new exception,
    `SingleSourceVersionError`, has been added to indicate that the version
    file is not found in the given project. Additionally, test code
    organization has been improved with new directories and files for
    fixtures and unit tests, along with new test cases and functions for the
    upgrades functionality. The `test_wheels.py` file has also been updated
    with new functions to check the version of the Databricks SDK and handle
    cases where the version marker is missing or does not contain the
    `__version__` variable. These changes enhance the functionality,
    maintainability, and adaptability of the system.
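
    The upgrade flow this framework enables can be sketched with a minimal, self-contained illustration. The names below (`Upgrade`, `apply_upgrades`, the inline "scripts") are assumptions for illustration only, not the actual `databricks.labs.blueprint` upgrades API:

    ```python
    # Minimal sketch of a version-ordered upgrade runner; illustrative only,
    # not the actual databricks.labs.blueprint upgrade framework API.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass(frozen=True)
    class Upgrade:
        version: tuple   # version this script upgrades the installation to
        apply: Callable  # migration to run (config or database state changes)

    def apply_upgrades(installed: tuple, upgrades: list) -> tuple:
        """Run every upgrade newer than the installed version, oldest first."""
        for upgrade in sorted(upgrades, key=lambda u: u.version):
            if upgrade.version > installed:
                upgrade.apply()
                installed = upgrade.version  # record progress after each step
        return installed

    # Upgrading from v0.2.4 to v0.3.0 also runs the intermediate step:
    applied = []
    scripts = [
        Upgrade((0, 3, 0), lambda: applied.append("v0.3.0")),
        Upgrade((0, 2, 5), lambda: applied.append("v0.2.5")),
    ]
    assert apply_upgrades((0, 2, 4), scripts) == (0, 3, 0)
    assert applied == ["v0.2.5", "v0.3.0"]  # oldest-first order
    ```

    Running the steps oldest-first and recording the version after each one is what lets an installation move from version X to version Z through version Y without skipping migrations.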
    nfx authored Mar 2, 2024 (c8a74f4)
  2. Added brute-forcing `SerdeError` with `as_dict()` and `from_dict()` (#58)

    In the rare circumstances when you cannot use
    [@DataClass](#loading-dataclass-configuration) or you get `SerdeError`
    that you cannot explain, you can implement `from_dict(cls, raw: dict) ->
    'T'` and `as_dict(self) -> dict` methods on the class:
    
    ```python
    from databricks.sdk import WorkspaceClient
    from databricks.labs.blueprint.installation import Installation
    
    class SomePolicy:
        """Plain class (not a dataclass) that handles its own (de)serialization."""

        def __init__(self, a, b):
            self._a = a
            self._b = b

        def as_dict(self) -> dict:
            # Serialization: return a JSON-compatible dict of the state.
            return {"a": self._a, "b": self._b}

        @classmethod
        def from_dict(cls, raw: dict):
            # Deserialization: rebuild the instance from the raw dict.
            return cls(raw.get("a"), raw.get("b"))

        def __eq__(self, o):
            assert isinstance(o, SomePolicy)
            return self._a == o._a and self._b == o._b
    
    policy = SomePolicy(1, 2)
    installation = Installation.current(WorkspaceClient(), "blueprint")
    installation.save(policy, filename="backups/policy-123.json")
    load = installation.load(SomePolicy, filename="backups/policy-123.json")
    
    assert load == policy
    ```
    nfx authored Mar 2, 2024 (a029f6b)
  3. Release v0.3.0 (#59)

    * Added automated upgrade framework
    ([#50](#50)). This
    update introduces an automated upgrade framework for managing and
    applying upgrades to the product, with a new `upgrades.py` file that
    includes a `ProductInfo` class with methods for version handling,
    wheel building, and exception handling. The test code organization has
    been improved, and new test cases, functions, and a directory structure
    for fixtures and unit tests have been added for the upgrades
    functionality. The `test_wheels.py` file now checks the version of the
    Databricks SDK and handles cases where the version marker is missing or
    does not contain the `__version__` variable. Additionally, a new
    `Application State Migrations` section has been added to the README,
    explaining the process of seamless upgrades from version X to version Z
    through version Y, addressing the need for configuration or database
    state migrations as the application evolves. Users can apply these
    upgrades by following an idiomatic usage pattern involving several
    classes and functions. Furthermore, improvements have been made to the
    `_trim_leading_whitespace` function in the `commands.py` file of the
    `databricks.labs.blueprint` module, ensuring accurate and consistent
    removal of leading whitespace for each line in the command string,
    leading to better overall functionality and maintainability.
    * Added brute-forcing `SerdeError` with `as_dict()` and `from_dict()`
    ([#58](#58)). This
    commit introduces a brute-forcing approach for handling `SerdeError`
    using `as_dict()` and `from_dict()` methods in an open-source library.
    The new `SomePolicy` class demonstrates the usage of these methods for
    manual serialization and deserialization of custom classes. The
    `as_dict()` method returns a dictionary representation of the class
    instance, and the `from_dict()` method, decorated with `@classmethod`,
    creates a new instance from the provided dictionary. Additionally, the
    GitHub Actions workflow for acceptance tests has been updated to include
    the `ready_for_review` event type, ensuring that tests run not only for
    opened and synchronized pull requests but also when marked as "ready for
    review." These changes provide developers with more control over the
    deserialization process and facilitate debugging in cases where default
    deserialization fails, but should be used judiciously to avoid brittle
    code.
    * Fixed nightly integration tests run as service principals
    ([#52](#52)). In this
    release, we have enhanced the compatibility of our codebase with service
    principals, particularly in the context of nightly integration tests.
    The `Installation` class in the `databricks.labs.blueprint.installation`
    module has been refactored, deprecating the `current` method and
    introducing two new methods: `assume_global` and `assume_user_home`.
    These methods enable users to install and manage `blueprint` as either a
    global or user-specific installation. Additionally, the `existing`
    method has been updated to work with the new `Installation` methods. In
    the test suite, the `test_installation.py` file has been updated to
    correctly detect global and user-specific installations when running as
    a service principal. These changes improve the testability and
    functionality of our software, ensuring seamless operation with service
    principals during nightly integration tests.
    * Made `test_existing_installations_are_detected` more resilient
    ([#51](#51)). In this
    release, we have added a new test function
    `test_existing_installations_are_detected` that checks if existing
    installations are correctly detected and retries the test for up to 15
    seconds if they are not. This improves the reliability of the test by
    making it more resilient to potential intermittent failures. We have
    also added an import from `databricks.sdk.retries` named `retried` which
    is used to retry the test function in case of an `AssertionError`.
    Additionally, the test function `test_existing` has been renamed to
    `test_existing_installations_are_detected` and the `xfail` marker has
    been removed. We have also renamed the test function `test_dataclass` to
    `test_loading_dataclass_from_installation` for better clarity. This
    change will help ensure that the library is correctly detecting existing
    installations and improve the overall quality of the codebase.
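
    The retry pattern used by `test_existing_installations_are_detected` can be sketched with a self-contained stand-in for `databricks.sdk.retries.retried`; the decorator below only mirrors the pattern and is an assumption for illustration, not the SDK implementation:

    ```python
    # Self-contained stand-in for databricks.sdk.retries.retried (illustrative):
    # retry the wrapped function on the given exceptions until a timeout elapses.
    import functools
    import time
    from datetime import timedelta

    def retried(*, on: list, timeout: timedelta):
        def decorator(fn):
            @functools.wraps(fn)
            def wrapper(*args, **kwargs):
                deadline = time.monotonic() + timeout.total_seconds()
                while True:
                    try:
                        return fn(*args, **kwargs)
                    except tuple(on):
                        if time.monotonic() > deadline:
                            raise  # give up once the timeout has elapsed
                        time.sleep(0.1)  # brief back-off before retrying
            return wrapper
        return decorator

    attempts = []

    @retried(on=[AssertionError], timeout=timedelta(seconds=15))
    def existing_installations_are_detected():
        attempts.append(1)
        # Simulate eventual consistency: the first two checks fail.
        assert len(attempts) >= 3, "installations not detected yet"
        return len(attempts)

    assert existing_installations_are_detected() == 3
    ```

    Retrying on `AssertionError` with a bounded timeout is what makes the test resilient to installations that take a few seconds to become visible.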
    nfx authored Mar 2, 2024 (905e5ff)
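
    The `_trim_leading_whitespace` improvement called out in the notes above can be illustrated with a minimal stand-in that removes the common leading indent from each line of a command string; the real function in `commands.py` may differ in detail:

    ```python
    # Illustrative stand-in; the real _trim_leading_whitespace in the
    # databricks.labs.blueprint commands.py module may behave differently.
    def trim_leading_whitespace(command: str) -> str:
        """Remove the common leading indent from every non-empty line."""
        lines = command.splitlines()
        indents = [len(l) - len(l.lstrip()) for l in lines if l.strip()]
        margin = min(indents, default=0)
        return "\n".join(l[margin:] if l.strip() else "" for l in lines)

    # Relative indentation within the command is preserved:
    assert trim_leading_whitespace("    a\n      b\n    c") == "a\n  b\nc"
    ```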