[BUG]: Missing External HMS spark configuration in Cluster Policy #2420

@FastLee

Description

Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

The current installation fails to copy some external HMS configurations to the newly created cluster policy.
For example, "spark_conf.spark.hadoop.hive.metastore.uris" is not copied.

Expected Behavior

Any configuration whose key starts with "spark_conf.spark.hadoop.hive.metastore" should be copied to the newly created cluster policy.
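
A minimal sketch of the expected copy logic (the helper name and sample values are placeholders, not the actual UCX implementation):

```python
import json

HMS_PREFIX = "spark_conf.spark.hadoop.hive.metastore"

def copy_external_hms_settings(source_policy_definition: str) -> dict:
    """Return every setting from the source policy whose key starts with the
    external-HMS prefix, so it can be merged into the new cluster policy."""
    source = json.loads(source_policy_definition)
    return {key: value for key, value in source.items() if key.startswith(HMS_PREFIX)}

# Example: "spark_conf.spark.hadoop.hive.metastore.uris" must survive the copy.
source = json.dumps({
    "spark_conf.spark.hadoop.hive.metastore.uris": {
        "type": "fixed",
        "value": "thrift://hms.example.com:9083",
    },
    "node_type_id": {"type": "allowlist", "values": ["i3.xlarge"]},
})
assert "spark_conf.spark.hadoop.hive.metastore.uris" in copy_external_hms_settings(source)
```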

Steps To Reproduce

  1. Create a cluster policy that contains a "spark_conf.spark.hadoop.hive.metastore.uris" configuration (see the sketch after this list).
  2. Run the installation.
  3. When prompted, point to that cluster policy as the cluster policy for external HMS.
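
For step 1, a sketch of creating such a policy, assuming the Databricks SDK for Python `WorkspaceClient.cluster_policies.create` call; the policy name and metastore URI are placeholders:

```python
import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # uses the same authentication as the Databricks CLI profile

# Placeholder external HMS policy; only the hive.metastore.uris entry matters here.
definition = json.dumps({
    "spark_conf.spark.hadoop.hive.metastore.uris": {
        "type": "fixed",
        "value": "thrift://hms.example.com:9083",
    },
})

policy = w.cluster_policies.create(name="external-hms-policy", definition=definition)
print(policy.policy_id)
```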

Cloud

AWS

Operating System

macOS

Version

latest via Databricks CLI

Relevant log output

No response
