Commit e0f32d7

# Add interactive warehouse picker for templates that make use of SQL warehouses (#4170)

## Summary

This adds an interactive warehouse picker, plus default-warehouse support, for templates that make use of a warehouse option:

<img width="617" height="268" alt="image" src="https://github.com/user-attachments/assets/b82a955d-f10a-4e1d-be93-8d90069146e1" />

Note that this PR doesn't change the interface of existing templates: we keep using `http_path` as the property name for warehouses to maintain compatibility in the workspace and any client tools that depend on that name. The property just has a new `"format": "warehouse_path"` setting that describes the new behavior.

## Test plan

- [x] Unit tests pass
- [x] Acceptance tests updated and pass
- [x] `make lint` and `make fmt` pass
- [x] Manual testing of `databricks bundle init` with SQL templates
1 parent 2a3cf48 commit e0f32d7

## File tree

12 files changed: +242 −19 lines changed

### NEXT_CHANGELOG.md

Lines changed: 2 additions & 0 deletions

```diff
@@ -9,6 +9,8 @@

 ### Bundles

+* Add interactive SQL warehouse picker to `default-sql` and `dbt-sql` bundle templates ([#4170](https://github.com/databricks/cli/pull/4170))
+
 ### Dependency updates

 ### API Changes
```

### acceptance/bundle/templates/dbt-sql/output/my_dbt_sql/README.md

Lines changed: 4 additions & 4 deletions

```diff
@@ -88,12 +88,12 @@ $ dbt test

 ## Production setup

-Your production dbt profiles are defined in dbt_profiles/profiles.yml.
-These profiles define the default catalog, schema, and any other
+Your production dbt profiles are defined in `dbt_profiles/profiles.yml`.
+These profiles define the default warehouse, catalog, schema, and any other
 target-specific settings. Read more about dbt profiles on Databricks at
 https://docs.databricks.com/en/workflows/jobs/how-to/use-dbt-in-workflows.html#advanced-run-dbt-with-a-custom-profile.

-The target workspaces for staging and prod are defined in databricks.yml.
+The target workspaces for staging and prod are defined in `databricks.yml`.
 You can manually deploy based on these configurations (see below).
 Or you can use CI/CD to automate deployment. See
 https://docs.databricks.com/dev-tools/bundles/ci-cd.html for documentation
@@ -120,7 +120,7 @@ For example, the default template would deploy a job called
 You can find that job by opening your workspace and clicking on **Workflows**.

 You can also deploy to your production target directly from the command-line.
-The warehouse, catalog, and schema for that target are configured in databricks.yml.
+The warehouse, catalog, and schema for that target are configured in `dbt_profiles/profiles.yml`.
 When deploying to this target, note that the default job at resources/my_dbt_sql.job.yml
 has a schedule set that runs every day. The schedule is paused when deploying in development mode
 (see https://docs.databricks.com/dev-tools/bundles/deployment-modes.html).
```

### acceptance/bundle/templates/default-sql/output/my_default_sql/README.md

Lines changed: 5 additions & 0 deletions

```diff
@@ -39,3 +39,8 @@ The 'my_default_sql' project was generated by using the default-sql template.
 7. For documentation on the Databricks Asset Bundles format used
    for this project, and for CI/CD configuration, see
    https://docs.databricks.com/dev-tools/bundles/index.html.
+
+## Changing the warehouse, catalog, or schema
+
+The default SQL warehouse, catalog, and schema are configured in `databricks.yml`.
+To change these settings, edit the `variables` section for each target (dev/prod).
```
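The per-target `variables` overrides mentioned in the README addition might look roughly like the sketch below. This is a hypothetical excerpt, not the template's actual generated file; the variable names (`warehouse_id`, `catalog`, `schema`) and values are illustrative assumptions.

```yaml
# Hypothetical databricks.yml excerpt: one default, overridden per target.
variables:
  warehouse_id:
    description: The SQL warehouse to use
    default: /sql/1.0/warehouses/abcdef1234567890

targets:
  dev:
    variables:
      catalog: dev_catalog
      schema: dev_schema
  prod:
    variables:
      catalog: prod_catalog
      schema: prod_schema
```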

### libs/cmdio/io.go

Lines changed: 1 addition & 1 deletion

```diff
@@ -114,7 +114,7 @@ func (c *cmdIO) Select(items []Tuple, label string) (id string, err error) {
 		StartInSearchMode: true,
 		Searcher: func(input string, idx int) bool {
 			lower := strings.ToLower(items[idx].Name)
-			return strings.Contains(lower, input)
+			return strings.Contains(lower, strings.ToLower(input))
 		},
 		Templates: &promptui.SelectTemplates{
 			Active: `{{.Name | bold}} ({{.Id|faint}})`,
```
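The one-line fix above makes the picker's search case-insensitive on both sides: previously only the item name was lowercased, so typing an uppercase character could never match. A minimal standalone sketch of the corrected matching logic (the `matches` helper is illustrative, not from the codebase):

```go
package main

import (
	"fmt"
	"strings"
)

// matches reports whether query is a case-insensitive substring of name,
// mirroring the fixed Searcher: both sides are lowercased before comparing.
func matches(name, query string) bool {
	return strings.Contains(strings.ToLower(name), strings.ToLower(query))
}

func main() {
	fmt.Println(matches("Shared Warehouse", "WARE")) // true: the input is lowercased too
	fmt.Println(matches("Shared Warehouse", "ware")) // true
	fmt.Println(matches("Shared Warehouse", "dev"))  // false
}
```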

### libs/databrickscfg/cfgpickers/clusters.go

Lines changed: 1 addition & 1 deletion

```diff
@@ -194,7 +194,7 @@ func AskForCluster(ctx context.Context, w *databricks.WorkspaceClient, filters .
 		Items: compatible,
 		Searcher: func(input string, idx int) bool {
 			lower := strings.ToLower(compatible[idx].ClusterName)
-			return strings.Contains(lower, input)
+			return strings.Contains(lower, strings.ToLower(input))
 		},
 		StartInSearchMode: true,
 		Templates: &promptui.SelectTemplates{
```

### libs/databrickscfg/cfgpickers/warehouses.go

Lines changed: 174 additions & 0 deletions

```diff
@@ -4,11 +4,17 @@ import (
 	"context"
 	"errors"
 	"fmt"
+	"sort"
+	"strings"

 	"github.com/databricks/cli/libs/cmdio"
 	"github.com/databricks/databricks-sdk-go"
+	"github.com/databricks/databricks-sdk-go/apierr"
+	"github.com/databricks/databricks-sdk-go/config"
+	"github.com/databricks/databricks-sdk-go/httpclient"
 	"github.com/databricks/databricks-sdk-go/service/sql"
 	"github.com/fatih/color"
+	"github.com/manifoldco/promptui"
 )

 var ErrNoCompatibleWarehouses = errors.New("no compatible warehouses")
@@ -63,3 +69,171 @@ func AskForWarehouse(ctx context.Context, w *databricks.WorkspaceClient, filters
 	}
 	return cmdio.Select(ctx, names, "Choose SQL Warehouse")
 }
+
+// sortWarehousesByState sorts warehouses by state priority (running first), then alphabetically by name.
+// Deleted warehouses are filtered out.
+func sortWarehousesByState(all []sql.EndpointInfo) []sql.EndpointInfo {
+	var warehouses []sql.EndpointInfo
+	for _, wh := range all {
+		if wh.State != sql.StateDeleted && wh.State != sql.StateDeleting {
+			warehouses = append(warehouses, wh)
+		}
+	}
+
+	priorities := map[sql.State]int{
+		sql.StateRunning:  1,
+		sql.StateStarting: 2,
+		sql.StateStopped:  3,
+		sql.StateStopping: 4,
+	}
+	sort.Slice(warehouses, func(i, j int) bool {
+		pi, pj := priorities[warehouses[i].State], priorities[warehouses[j].State]
+		if pi != pj {
+			return pi < pj
+		}
+		return strings.ToLower(warehouses[i].Name) < strings.ToLower(warehouses[j].Name)
+	})
+
+	return warehouses
+}
+
+// GetDefaultWarehouse returns the default warehouse for the workspace.
+// It tries the following in order:
+// 1. The "default" warehouse via API (server-side convention, not yet fully rolled out)
+// 2. The first usable warehouse sorted by state (running first)
+func GetDefaultWarehouse(ctx context.Context, w *databricks.WorkspaceClient) (*sql.EndpointInfo, error) {
+	// Try the "default" warehouse convention first.
+	// This is a new server-side feature that may not be available everywhere yet.
+	warehouse, err := w.Warehouses.Get(ctx, sql.GetWarehouseRequest{Id: "default"})
+	if err == nil {
+		return &sql.EndpointInfo{
+			Id:    warehouse.Id,
+			Name:  warehouse.Name,
+			State: warehouse.State,
+		}, nil
+	}
+	var apiErr *apierr.APIError
+	if !errors.As(err, &apiErr) || apiErr.StatusCode >= 500 {
+		return nil, fmt.Errorf("get default warehouse: %w", err)
+	}
+
+	warehouses, err := listUsableWarehouses(ctx, w)
+	if err != nil {
+		return nil, err
+	}
+	warehouses = sortWarehousesByState(warehouses)
+	if len(warehouses) == 0 {
+		return nil, ErrNoCompatibleWarehouses
+	}
+	return &warehouses[0], nil
+}
+
+// listUsableWarehouses returns warehouses the user has permission to use.
+// This uses the skip_cannot_use=true parameter to filter out inaccessible warehouses.
+func listUsableWarehouses(ctx context.Context, w *databricks.WorkspaceClient) ([]sql.EndpointInfo, error) {
+	// The SDK doesn't expose the skip_cannot_use parameter, so we use the raw API.
+	clientCfg, err := config.HTTPClientConfigFromConfig(w.Config)
+	if err != nil {
+		return nil, fmt.Errorf("create HTTP client config: %w", err)
+	}
+	apiClient := httpclient.NewApiClient(clientCfg)
+
+	var response sql.ListWarehousesResponse
+	err = apiClient.Do(ctx, "GET", "/api/2.0/sql/warehouses?skip_cannot_use=true",
+		httpclient.WithResponseUnmarshal(&response))
+	if err != nil {
+		return nil, fmt.Errorf("list warehouses: %w", err)
+	}
+	return response.Warehouses, nil
+}
+
+// SelectWarehouse prompts the user to select a SQL warehouse and returns the warehouse ID.
+// Warehouses are sorted by state (running first) so the default selection is the best available.
+// In non-interactive mode, returns the first (best) warehouse automatically.
+// The description parameter is shown before the picker (if non-empty).
+func SelectWarehouse(ctx context.Context, w *databricks.WorkspaceClient, description string, filters ...warehouseFilter) (string, error) {
+	all, err := w.Warehouses.ListAll(ctx, sql.ListWarehousesRequest{})
+	if err != nil {
+		return "", fmt.Errorf("list warehouses: %w", err)
+	}
+
+	warehouses := sortWarehousesByState(all)
+
+	// Apply filters
+	var filtered []sql.EndpointInfo
+	for _, wh := range warehouses {
+		skip := false
+		for _, filter := range filters {
+			if !filter(wh) {
+				skip = true
+				break
+			}
+		}
+		if !skip {
+			filtered = append(filtered, wh)
+		}
+	}
+	warehouses = filtered
+
+	if len(warehouses) == 0 {
+		return "", ErrNoCompatibleWarehouses
+	}
+
+	if len(warehouses) == 1 || !cmdio.IsPromptSupported(ctx) {
+		return warehouses[0].Id, nil
+	}
+
+	// The first warehouse (sorted by state, then alphabetically) is the default
+	defaultId := warehouses[0].Id
+
+	// Sort by running state first, then alphabetically for display
+	sort.Slice(warehouses, func(i, j int) bool {
+		iRunning := warehouses[i].State == sql.StateRunning
+		jRunning := warehouses[j].State == sql.StateRunning
+		if iRunning != jRunning {
+			return iRunning
+		}
+		return strings.ToLower(warehouses[i].Name) < strings.ToLower(warehouses[j].Name)
+	})
+
+	// Build options for the picker (● = running, ○ = not running)
+	var items []cmdio.Tuple
+	for _, warehouse := range warehouses {
+		var icon string
+		if warehouse.State == sql.StateRunning {
+			icon = color.GreenString("●")
+		} else {
+			icon = color.HiBlackString("○")
+		}
+
+		// Show type info in gray
+		typeInfo := strings.ToLower(string(warehouse.WarehouseType))
+		if warehouse.EnableServerlessCompute {
+			typeInfo = "serverless"
+		}
+
+		name := fmt.Sprintf("%s %s %s", icon, warehouse.Name, color.HiBlackString(typeInfo))
+		if warehouse.Id == defaultId {
+			name += color.HiBlackString(" [DEFAULT]")
+		}
+		items = append(items, cmdio.Tuple{Name: name, Id: warehouse.Id})
+	}
+
+	if description != "" {
+		cmdio.LogString(ctx, description)
+	}
+	promptui.SearchPrompt = "Search: "
+	warehouseId, err := cmdio.SelectOrdered(ctx, items, "warehouse\n")
+	if err != nil {
+		return "", err
+	}
+
+	for _, wh := range warehouses {
+		if wh.Id == warehouseId {
+			cmdio.LogString(ctx, fmt.Sprintf("warehouse_id: %s (%s)", warehouseId, wh.Name))
+			break
+		}
+	}
+
+	return warehouseId, nil
+}
```
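The heart of the picker is the ordering in `sortWarehousesByState`: a priority map puts running warehouses first, then starting, stopped, and stopping, with case-insensitive name order as the tie-breaker. The sketch below isolates that logic with a local stand-in type so it runs without the Databricks SDK (the `warehouse` struct and string states are illustrative substitutes for `sql.EndpointInfo` and `sql.State`):

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// warehouse is a local stand-in for sql.EndpointInfo.
type warehouse struct {
	Name  string
	State string
}

// statePriority mirrors the priorities map in sortWarehousesByState:
// RUNNING first, then STARTING, STOPPED, STOPPING.
var statePriority = map[string]int{
	"RUNNING":  1,
	"STARTING": 2,
	"STOPPED":  3,
	"STOPPING": 4,
}

// sortByState orders warehouses by state priority, then name (case-insensitive).
func sortByState(ws []warehouse) {
	sort.Slice(ws, func(i, j int) bool {
		pi, pj := statePriority[ws[i].State], statePriority[ws[j].State]
		if pi != pj {
			return pi < pj
		}
		return strings.ToLower(ws[i].Name) < strings.ToLower(ws[j].Name)
	})
}

func main() {
	ws := []warehouse{
		{"zeta", "RUNNING"},
		{"Alpha", "STOPPED"},
		{"beta", "RUNNING"},
	}
	sortByState(ws)
	for _, w := range ws {
		fmt.Println(w.Name, w.State)
	}
	// beta RUNNING, zeta RUNNING, Alpha STOPPED
}
```

Because the first element after this sort is the best available warehouse, it doubles as the non-interactive default and as the `[DEFAULT]`-labeled entry in the picker.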

### libs/jsonschema/schema.go

Lines changed: 3 additions & 0 deletions

```diff
@@ -65,6 +65,9 @@ type Schema struct {
 	// here: https://github.com/google/re2/wiki/Syntax
 	Pattern string `json:"pattern,omitempty"`

+	// Format specifies custom input handling. Supported: "warehouse_path".
+	Format string `json:"format,omitempty"`
+
 	// Extension embeds our custom JSON schema extensions.
 	Extension
```

### libs/template/config.go

Lines changed: 37 additions & 1 deletion

```diff
@@ -6,7 +6,9 @@ import (
 	"fmt"
 	"io/fs"

+	"github.com/databricks/cli/libs/cmdctx"
 	"github.com/databricks/cli/libs/cmdio"
+	"github.com/databricks/cli/libs/databrickscfg/cfgpickers"
 	"github.com/databricks/cli/libs/jsonschema"
 	"github.com/databricks/cli/libs/log"
 	"golang.org/x/exp/maps"
@@ -112,6 +114,17 @@ func (c *config) assignDefaultValues(r *renderer) error {
 		if _, ok := c.values[name]; ok {
 			continue
 		}
+
+		// Resolve custom formats (e.g., warehouse_path auto-selects the default warehouse)
+		if property.Format != "" {
+			val, err := c.resolveFormat(property.Format, "")
+			if err != nil {
+				return err
+			}
+			c.values[name] = val
+			continue
+		}
+
 		// No default value defined for the property
 		if property.Default == nil {
 			continue
@@ -133,6 +146,22 @@ func (c *config) assignDefaultValues(r *renderer) error {
 	return nil
 }

+// resolveFormat returns a value for custom format types.
+// Returns empty string for unknown formats (to fall back to default handling).
+func (c *config) resolveFormat(format, description string) (string, error) {
+	switch format {
+	case "warehouse_path":
+		w := cmdctx.WorkspaceClient(c.ctx)
+		warehouseId, err := cfgpickers.SelectWarehouse(c.ctx, w, description)
+		if err != nil {
+			return "", err
+		}
+		return "/sql/1.0/warehouses/" + warehouseId, nil
+	default:
+		return "", nil
+	}
+}
+
 func (c *config) skipPrompt(p jsonschema.Property, r *renderer) (bool, error) {
 	// Config already has a value assigned. We don't have to prompt for a user input.
 	if _, ok := c.values[p.Name]; ok {
@@ -170,7 +199,14 @@ func (c *config) skipPrompt(p jsonschema.Property, r *renderer) (bool, error) {

 func (c *config) promptOnce(property *jsonschema.Schema, name, defaultVal, description string) error {
 	var userInput string
-	if property.Enum != nil {
+
+	if property.Format != "" {
+		var err error
+		userInput, err = c.resolveFormat(property.Format, description)
+		if err != nil {
+			return err
+		}
+	} else if property.Enum != nil {
 		// List options for the user to select from
 		options, err := property.EnumStringSlice()
 		if err != nil {
```
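The key contract here is that `resolveFormat` turns a selected warehouse ID into the same `http_path` string the old free-text prompt expected (`/sql/1.0/warehouses/<warehouse id>`), and that unknown formats fall through to default handling. A simplified standalone sketch of that dispatch, with the interactive selection replaced by a plain ID parameter (the real code prompts via `cfgpickers.SelectWarehouse`):

```go
package main

import "fmt"

// resolveFormat sketches the format dispatch in config.resolveFormat:
// "warehouse_path" maps a warehouse ID to an HTTP path; unknown formats
// return "" so the caller falls back to default handling.
func resolveFormat(format, warehouseId string) string {
	switch format {
	case "warehouse_path":
		return "/sql/1.0/warehouses/" + warehouseId
	default:
		return ""
	}
}

func main() {
	fmt.Println(resolveFormat("warehouse_path", "abcdef1234567890"))
	// /sql/1.0/warehouses/abcdef1234567890

	fmt.Println(resolveFormat("date", "x") == "") // true: unknown format falls through
}
```

This is why the PR can keep `http_path` as the property name: downstream consumers still receive a value in the familiar shape, only the way it is collected changes.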

### libs/template/templates/dbt-sql/databricks_template_schema.json

Lines changed: 3 additions & 4 deletions

```diff
@@ -11,10 +11,9 @@
     },
     "http_path": {
       "type": "string",
-      "pattern": "^/sql/.\\../warehouses/[a-z0-9]+$",
-      "pattern_match_failure_message": "Path must be of the form /sql/1.0/warehouses/<warehouse id>",
-      "description": "\nPlease provide the HTTP Path of the SQL warehouse you would like to use with dbt during development.\nYou can find this path by clicking on \"Connection details\" for your SQL warehouse.\nhttp_path [example: /sql/1.0/warehouses/abcdef1234567890]",
-      "order": 2
+      "description": "\nSelect a SQL warehouse to use during development (you can change this in databricks.yml later).",
+      "order": 2,
+      "format": "warehouse_path"
     },
     "default_catalog": {
       "type": "string",
```

### libs/template/templates/dbt-sql/template/{{.project_name}}/README.md.tmpl

Lines changed: 4 additions & 4 deletions

```diff
@@ -88,12 +88,12 @@ $ dbt test

 ## Production setup

-Your production dbt profiles are defined in dbt_profiles/profiles.yml.
-These profiles define the default catalog, schema, and any other
+Your production dbt profiles are defined in `dbt_profiles/profiles.yml`.
+These profiles define the default warehouse, catalog, schema, and any other
 target-specific settings. Read more about dbt profiles on Databricks at
 https://docs.databricks.com/en/workflows/jobs/how-to/use-dbt-in-workflows.html#advanced-run-dbt-with-a-custom-profile.

-The target workspaces for staging and prod are defined in databricks.yml.
+The target workspaces for staging and prod are defined in `databricks.yml`.
 You can manually deploy based on these configurations (see below).
 Or you can use CI/CD to automate deployment. See
 https://docs.databricks.com/dev-tools/bundles/ci-cd.html for documentation
@@ -120,7 +120,7 @@ For example, the default template would deploy a job called
 You can find that job by opening your workspace and clicking on **Workflows**.

 You can also deploy to your production target directly from the command-line.
-The warehouse, catalog, and schema for that target are configured in databricks.yml.
+The warehouse, catalog, and schema for that target are configured in `dbt_profiles/profiles.yml`.
 When deploying to this target, note that the default job at resources/{{.project_name}}.job.yml
 has a schedule set that runs every day. The schedule is paused when deploying in development mode
 (see https://docs.databricks.com/dev-tools/bundles/deployment-modes.html).
```
