Fix databricks CLI authentication on Windows #192
Merged
Conversation
tanmay-db
approved these changes
Nov 24, 2023
Contributor
tanmay-db
left a comment
Thanks for the fix. LGTM
```java
LOG.debug("Found executable named '" + name + "' in PATH: " + pathStr);
return pathStr;
}
LOG.debug("Failed to find executable named '" + name + "' in PATH");
```
Contributor
Since we mention "failed" in the message, this should probably be LOG.warn.
Contributor
Author
I don't think this is an indication of a problem, though. The user may legitimately not have installed the CLI on their system (e.g. on a webserver).
```diff
  */
 public static void main(String[] args) {
-    DatabricksConfig config = getConfig();
+    // DatabricksConfig config = getConfig();
```
Contributor
Should we remove this comment from the demo?
Contributor
Author
There was a problem hiding this comment.
Right, thanks for the reminder. I was going to undo this change.
Force-pushed from 7bec2dc to e5fd5d9
mgyucht
added a commit
that referenced
this pull request
Nov 29, 2023
Bug fixes:

* Fix databricks CLI authentication on Windows ([#192](#192)).
* Fix SCIM pagination ([#193](#193)).

Other changes:

* Add more detailed error message on default credentials not found error ([#180](#180)).
* Support custom scopes and redirectUrl for U2M OAuth flow ([#190](#190)).

API Changes:

* Removed `enableOptimization()` method for `workspaceClient.metastores()` service.
* Added `pipelineId` field for `com.databricks.sdk.service.catalog.TableInfo`.
* Added `enablePredictiveOptimization` field for `com.databricks.sdk.service.catalog.UpdateCatalog` and `com.databricks.sdk.service.catalog.UpdateSchema`.
* Removed `com.databricks.sdk.service.catalog.UpdatePredictiveOptimization` and `com.databricks.sdk.service.catalog.UpdatePredictiveOptimizationResponse` classes.
* Added `description` field for `com.databricks.sdk.service.jobs.CreateJob` and `com.databricks.sdk.service.jobs.JobSettings`.
* Added `listNetworkConnectivityConfigurations()` and `listPrivateEndpointRules()` methods for `accountClient.networkConnectivity()` service.
* Added `com.databricks.sdk.service.settings.ListNccAzurePrivateEndpointRulesResponse`, `com.databricks.sdk.service.settings.ListNetworkConnectivityConfigurationsRequest`, `com.databricks.sdk.service.settings.ListNetworkConnectivityConfigurationsResponse`, and `com.databricks.sdk.service.settings.ListPrivateEndpointRulesRequest` classes.
* Added `stringSharedAs` field for `com.databricks.sdk.service.sharing.SharedDataObject`.

OpenAPI SHA: 22f09783eb8a84d52026f856be3b2068f9498db3, Date: 2023-11-23

Dependency updates:

* Bump API spec: 23 Nov 2023 ([#191](#191)).
github-merge-queue bot
pushed a commit
that referenced
this pull request
Nov 29, 2023
This was referenced Dec 1, 2023
Changes
There are two problems with respect to the Java SDK's auth pathways:

* The `.databrickscfg` file only works on *nix, as its path is hard-coded using forward slashes. I've rewritten this to use `Paths.get`.
* On Windows, the CLI binary is `databricks.exe`, not `databricks`, so we need to search for the binary with an OS-specific filename.

To make this change unit-testable, I introduce an `Environment` class to abstract over environment variables, the search path for the databricks CLI executable, and the operating system name. This is needed because there are differences both in the environment variable name (`PATH` in Linux/Mac vs. `Path` in Windows) and in the path separator (`:` in Linux/Mac vs. `;` in Windows). The only system-independent way to get the path seems to be `System.getenv("PATH")`, which works even on Windows; note that `System.getenv().get("PATH")` doesn't work on Windows. Additionally, determining the OS depends on `System.getProperty`, so this can be initialized directly in `DatabricksConfig` and mocked in unit tests.
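A minimal sketch of such an abstraction (illustrative names and layout only — the SDK's actual `Environment` class may differ):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;

// Illustrative sketch, not the SDK's actual API: bundle the three
// OS-dependent inputs so unit tests can substitute fixed values.
public class Environment {
    private final Map<String, String> env;
    private final List<String> path;
    private final String osName;

    public Environment(Map<String, String> env, List<String> path, String osName) {
        this.env = env;
        this.path = path;
        this.osName = osName;
    }

    public boolean isWindows() {
        return osName.toLowerCase().contains("windows");
    }

    // "databricks.exe" on Windows, "databricks" elsewhere.
    public String cliExecutableName() {
        return isWindows() ? "databricks.exe" : "databricks";
    }

    public List<String> getPath() {
        return path;
    }

    // Production wiring. System.getenv("PATH") resolves the variable even on
    // Windows (where it is spelled "Path"), unlike System.getenv().get("PATH");
    // File.pathSeparator is ";" on Windows and ":" on Linux/Mac.
    public static Environment system() {
        String rawPath = System.getenv("PATH");
        List<String> entries = rawPath == null
                ? List.of()
                : Arrays.asList(rawPath.split(java.io.File.pathSeparator));
        return new Environment(System.getenv(), entries, System.getProperty("os.name"));
    }
}
```

A test can then construct `new Environment(...)` with a fake OS name and PATH instead of touching the real system.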
Tests

Manually tested on a Windows machine: `.\databricks.exe auth login --host <HOST>`.

As a side note: when we build auth permutation test harnesses, we should include a Windows variant to catch these errors.
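Such a Windows variant could look roughly like this (a hedged, self-contained sketch; the helper name is hypothetical and stands in for the injectable OS abstraction described above):

```java
import java.nio.file.Paths;

// Hedged sketch of a Windows-variant check; names are hypothetical.
public class WindowsAuthPathCheck {
    // Stand-in for the OS-name-dependent executable lookup described in the PR.
    static String cliExecutableName(String osName) {
        return osName.toLowerCase().contains("windows") ? "databricks.exe" : "databricks";
    }

    public static void main(String[] args) {
        // With a mocked Windows OS name, the search must target databricks.exe.
        System.out.println(cliExecutableName("Windows Server 2022"));
        // Paths.get joins segments with the platform separator rather than a
        // hard-coded "/", which is the portability fix for .databrickscfg.
        System.out.println(Paths.get(System.getProperty("user.home"), ".databrickscfg"));
    }
}
```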