
[Fix] Support for Spark versions with multiple Scala versions#1331

Closed
alexott wants to merge 6 commits into databricks:main from alexott:tf-issue-5218

Conversation

@alexott
Contributor

@alexott alexott commented Nov 14, 2025

What changes are proposed in this pull request?

This is a fix for databricks/terraform-provider-databricks#5218. The issue is caused by the fact that the `Scala` selector had a default value of `2.12` (for Terraform only) and, as a result, filtered out 17.x versions, which are released only with Scala 2.13.

The fix is to relax the default value while still handling the case where both 2.12 and 2.13 versions exist for the same DBR version (16.x only).
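The relaxed selection logic can be sketched as follows. This is a minimal, hypothetical Go example (the function and its signature are illustrative, not the actual SDK API): with no explicit Scala selector, versions that differ only in their `-scalaX.Y` suffix are deduplicated by preferring the newest Scala version, so 17.x releases (Scala 2.13 only) are no longer filtered out, while an explicit selector still filters as before.

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// selectSparkVersions is a hypothetical sketch of the fix described above.
// If scala is non-empty, it filters by the requested Scala version as before.
// If scala is empty (the relaxed default), versions that share the same DBR
// base (e.g. "16.4.x" shipped for both Scala 2.12 and 2.13) are deduplicated
// by keeping the lexicographically highest entry, i.e. the newest Scala build.
func selectSparkVersions(versions []string, scala string) []string {
	if scala != "" {
		var out []string
		for _, v := range versions {
			if strings.Contains(v, "scala"+scala) {
				out = append(out, v)
			}
		}
		return out
	}
	best := map[string]string{}
	for _, v := range versions {
		base, _, _ := strings.Cut(v, "-scala")
		if cur, ok := best[base]; !ok || v > cur {
			best[base] = v
		}
	}
	var out []string
	for _, v := range best {
		out = append(out, v)
	}
	sort.Strings(out)
	return out
}

func main() {
	versions := []string{
		"16.4.x-scala2.12",
		"16.4.x-scala2.13",
		"17.0.x-scala2.13",
	}
	// No selector: 16.4.x deduplicated to the 2.13 build, 17.0.x kept.
	fmt.Println(selectSparkVersions(versions, ""))
	// Explicit selector: plain filter, as before the fix.
	fmt.Println(selectSparkVersions(versions, "2.12"))
}
```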

Removed all Terraform customizations from the Go SDK struct; they will be handled separately in Terraform itself.

Also bumped staticcheck to 0.6.1 to support compilation with Go 1.25.

How is this tested?

Added unit test, tested manually

alexott added a commit to databricks/terraform-provider-databricks that referenced this pull request Nov 14, 2025
Move defaults for the `databricks_spark_version` data source from the Go SDK to Terraform. First part of the #5218 work; after the new Go SDK is merged, we'll need to change the Scala default to `2.1`.

Related to databricks/databricks-sdk-go#1331
github-merge-queue bot pushed a commit to databricks/terraform-provider-databricks that referenced this pull request Nov 26, 2025
## Changes

Move defaults for the `databricks_spark_version` data source from the Go SDK to Terraform. First part of the #5218 work; after the new Go SDK is merged, we'll need to change the Scala default to `2.1`.

Related to databricks/databricks-sdk-go#1331

## Tests

- [x] `make test` run locally
- [ ] relevant change in `docs/` folder
- [x] covered with integration tests in `internal/acceptance`
- [x] using Go SDK
- [x] using TF Plugin Framework
- [x] has entry in `NEXT_CHANGELOG.md` file
@alexott alexott requested a review from tanmay-db November 28, 2025 17:35
@alexott
Contributor Author

alexott commented Dec 1, 2025

@tanmay-db, I don't have permissions to merge in this repo. Can you please merge it? Thank you

@github-actions

github-actions bot commented Dec 1, 2025

If integration tests don't run automatically, an authorized user can run them manually by following the instructions below:

Trigger:
go/deco-tests-run/sdk-go

Inputs:

  • PR number: 1331
  • Commit SHA: 8ffa42285f236f9d2f4ab7dd8f1a461f13553b12

Checks will be approved automatically on success.

@tanmay-db tanmay-db enabled auto-merge December 1, 2025 10:37
@rauchy rauchy disabled auto-merge December 2, 2025 15:39
@rauchy
Contributor

rauchy commented Dec 3, 2025

Closed in favor of #1345 (a duplicate of this PR)

@rauchy rauchy closed this Dec 3, 2025
deco-sdk-tagging bot added a commit that referenced this pull request Dec 3, 2025
## Release v0.93.0

### Bug Fixes

* Support for Spark versions with multiple Scala versions ([#1331](#1331)).

### API Changes
* Add [w.WorkspaceEntityTagAssignments](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/tags#WorkspaceEntityTagAssignmentsAPI) workspace-level service.
* Add `DatasetCatalog` and `DatasetSchema` fields for [dashboards.CreateDashboardRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#CreateDashboardRequest).
* Add `DatasetCatalog` and `DatasetSchema` fields for [dashboards.UpdateDashboardRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#UpdateDashboardRequest).
* Add `PurgeData` field for [database.DeleteSyncedDatabaseTableRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/database#DeleteSyncedDatabaseTableRequest).
* Add `Truncation` field for [pipelines.PipelineEvent](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineEvent).
* Add `ForeignTable` and `Volume` enum values for [sharing.SharedDataObjectDataObjectType](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#SharedDataObjectDataObjectType).
hectorcast-db added a commit that referenced this pull request Mar 18, 2026
… hosts

Remove WorkspaceID check in ConfigType() to match Python SDK PR #1331,
where account hosts always return AccountConfig regardless of WorkspaceID.

Co-authored-by: Isaac
Signed-off-by: Hector Castejon Diaz <[email protected]>
