This repository was archived by the owner on Nov 15, 2024. It is now read-only.

Commit 6782359

Authored by Bounkong K, committed by Marcelo Vanzin
[SPARK-23941][MESOS] Mesos task failed on specific spark app name
[SPARK-23941][MESOS] Mesos task failed on specific spark app name

## What changes were proposed in this pull request?

Shell-escape the name passed to spark-submit, and change how conf attributes are shell-escaped.

## How was this patch tested?

This patch has been tested manually with Hive-on-Spark on Mesos, and with the use case described in the issue: the SparkPi application with a custom name containing illegal shell characters. With this PR, Hive-on-Spark on Mesos works like a charm with Hive 3.0.0-SNAPSHOT.

I state that this contribution is my original work and that I license the work to the project under the project's open source license.

Author: Bounkong Khamphousone <[email protected]>

Closes apache#21014 from tiboun/fix/SPARK-23941.
1 parent 7bbec0d commit 6782359

File tree

1 file changed (+2, -2 lines)


resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterScheduler.scala

Lines changed: 2 additions & 2 deletions

@@ -530,9 +530,9 @@ private[spark] class MesosClusterScheduler(
       .filter { case (key, _) => !replicatedOptionsBlacklist.contains(key) }
       .toMap
     (defaultConf ++ driverConf).foreach { case (key, value) =>
-      options ++= Seq("--conf", s""""$key=${shellEscape(value)}"""".stripMargin) }
+      options ++= Seq("--conf", s"${key}=${value}") }

-    options
+    options.map(shellEscape)
   }

   /**
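The fix moves escaping from the individual conf value (wrapped in extra literal double quotes) to a single pass over every finished option token. A minimal Python sketch of that idea, using `shlex.quote` as a stand-in for Spark's Scala `shellEscape` (an assumption, not the actual implementation) and a made-up app name containing shell metacharacters:

```python
import shlex


def shell_escape(s: str) -> str:
    # Stand-in for Spark's Scala shellEscape helper; shlex.quote is an
    # assumption used here for illustration, not the real implementation.
    return shlex.quote(s)


key = "spark.app.name"
value = "My App $(whoami)"  # hypothetical name with illegal shell characters

# Before the fix: only the value was escaped, and literal double quotes were
# then wrapped around the whole "key=value" pair, producing malformed tokens.
before = ["--conf", '"%s=%s"' % (key, shell_escape(value))]

# After the fix: options are assembled unescaped, then each complete token is
# escaped exactly once (options.map(shellEscape) in the patch).
options = ["--conf", "%s=%s" % (key, value)]
after = [shell_escape(opt) for opt in options]

print(before)
print(after)
```

Escaping once, at the end, guarantees each token survives the shell intact: splitting the joined command line reproduces the original `--conf key=value` pair even when the app name contains `$(...)` or spaces.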

0 commit comments