Artifacts - Jar artifact file source requires absolute path #1156

@NodeJSmith

Description

Describe the issue

When deploying an asset bundle that uses a local jar file, the artifact's files source must be an absolute path in order to be recognized.

Configuration

Please provide a minimal reproducible configuration for the issue

bundle:
  name: dbx_pipeline_events

artifacts:
  convert_events:
    path: ./convert_events
    build: "sbt package"
    type: jar
    files:
      # does not work
      #- source: ./convert_events/target/scala-2.12/convert_events-2.0.jar
      # works
      - source: /home/jessica/dbx-pipeline-events/convert_events/target/scala-2.12/convert_events-2.0.jar

run_as:
  user_name: ${workspace.current_user.userName}

targets:
  dev:
    mode: development
    resources:
      jobs:
        dbx_pipeline_events:
          name: dbx_pipeline_events
          tasks:
            - task_key: convert_events
              job_cluster_key: basic_cluster
              libraries:
                - jar: ./convert_events/target/scala-2.12/convert_events-*.jar
              spark_jar_task:
                main_class_name: com.company.convert_events.ConvertEvents

Steps to reproduce the behavior

Please list the steps required to reproduce the issue, for example:

  1. Have an asset bundle configuration that includes a jar library on the local filesystem, listed as a relative path
  2. Have an artifact configured to build that jar file
  3. Set the artifact's source value to a relative path within the current repo, matching what is in the task's libraries field
  4. Run databricks bundle deploy

Expected Behavior

I would expect the relative path to be resolved and linked to the library without any issue, and without having to list the absolute path. This is my first time doing this with a jar file, but with a Python wheel task you don't even need to list the files/sources; it just works.

I found it especially surprising that, even with the source written exactly as it appears in the libraries section of the task, it still did not work.

Building convert_events...
Uploading convert_events-2.0.jar...
Uploading bundle files to /Users/<username>/.bundle/dbx_pipeline_events/dev/files...
Deploying resources...
Updating deployment state...
Deployment complete!
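For comparison, this is roughly what the Python wheel case looks like, where no files/source section is needed at all. This is a sketch from memory, not from this repo; the artifact name, path, and build command are made up:

```yaml
artifacts:
  my_wheel:               # hypothetical artifact, for comparison only
    path: ./my_wheel
    build: "poetry build"
    type: whl
    # no files/source section; the CLI locates the built wheel on its own
```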

Actual Behavior

The jar file is built, but the file is not uploaded; this less-than-useful error message is shown instead:

Building convert_events...
artifact section is not defined for file at /home/jessica/dbx-pipeline-events/convert_events/target/scala-2.12/convert_events-2.0.jar. Skipping uploading. In order to use the define 'artifacts' section
Uploading bundle files to /Users/<username>/.bundle/dbx_pipeline_events/dev/files...
Deploying resources...
Updating deployment state...
Deployment complete!
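For now I'm working around this by hardcoding the absolute path. A slightly cleaner sketch of that workaround pulls the path into a bundle variable so my home directory only appears in one place. Note that the variable name project_root is my own, and I have not verified that variable interpolation works inside artifact file sources on this CLI version:

```yaml
variables:
  project_root:
    description: Absolute path to the repo root on the local machine
    default: /home/jessica/dbx-pipeline-events

artifacts:
  convert_events:
    path: ./convert_events
    build: "sbt package"
    type: jar
    files:
      # absolute path built from the variable; relative paths are not recognized
      - source: ${var.project_root}/convert_events/target/scala-2.12/convert_events-2.0.jar
```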

OS and CLI version

Databricks CLI v0.212.2
Ubuntu 22.04 - WSL2 via Windows 11

Is this a regression?

Not sure

Debug Logs

databricks_cli_jar_issue_redacted_logs.log

Metadata

Labels

DABs (DABs related issues), Enhancement (New feature or request)
