From d3eb99f79e508d62fdb7e9bc595f0240ac021df5 Mon Sep 17 00:00:00 2001
From: Josh Rosen
Date: Fri, 13 Sep 2024 15:07:25 -0700
Subject: [PATCH] [SPARK-49647][TESTS] Change SharedSparkContext so that its
 SparkConf loads defaults

### What changes were proposed in this pull request?

This PR modifies the `SharedSparkContext` test suite mixin trait so that it instantiates SparkConf as `new SparkConf()` instead of `new SparkConf(loadDefaults = false)`.

### Why are the changes needed?

Spark's SBT and Maven builds configure certain default test configurations using system properties, including disabling the Spark UI and lowering Derby metastore durability:

https://github.com/apache/spark/blob/08a26bb56cfb48f27c68a79be1e15bc4c9e466e0/project/SparkBuild.scala#L1616-L1633

Most test suites pick up the defaults set at this layer. However, the `SharedSparkContext` trait was using `new SparkConf(false)`, which bypasses the loading of these defaults. As a result, tests that use this trait don't pick up the default configurations and instead try to launch the Spark UI, which may lead to test flakiness due to port binding conflicts. This PR changes the instantiation to `new SparkConf()` to address the issue.

This change brings `SharedSparkContext` to parity with `SharedSparkSession`, which was already running with `loadDefaults = true`.

### Does this PR introduce _any_ user-facing change?

This is a test-only change, so it is not expected to impact users of Apache Spark itself. However, it might impact third-party developers who directly depend on Apache Spark's own test JARs.

### How was this patch tested?

Running existing tests.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #48111 from JoshRosen/SPARK-49647-pick-up-sys-prop-defaults-in-SharedSparkContext-test-mixin.

Authored-by: Josh Rosen
Signed-off-by: Dongjoon Hyun
---
 .../src/test/scala/org/apache/spark/SharedSparkContext.scala | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/core/src/test/scala/org/apache/spark/SharedSparkContext.scala b/core/src/test/scala/org/apache/spark/SharedSparkContext.scala
index 7106a780b3256..22c6280198c9a 100644
--- a/core/src/test/scala/org/apache/spark/SharedSparkContext.scala
+++ b/core/src/test/scala/org/apache/spark/SharedSparkContext.scala
@@ -27,7 +27,10 @@ trait SharedSparkContext extends BeforeAndAfterAll with BeforeAndAfterEach { sel
 
   def sc: SparkContext = _sc
 
-  val conf = new SparkConf(false)
+  // SPARK-49647: use `SparkConf()` instead of `SparkConf(false)` because we want to
+  // load defaults from system properties and the classpath, including default test
+  // settings specified in the SBT and Maven build definitions.
+  val conf: SparkConf = new SparkConf()
 
   /**
    * Initialize the [[SparkContext]]. Generally, this is just called from beforeAll; however, in
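
For readers unfamiliar with the `loadDefaults` flag, here is a minimal standalone sketch (not part of the patch) illustrating the behavior difference this PR relies on. It assumes `spark-core` is on the classpath and uses `spark.ui.enabled` as the example default, mirroring the UI-disabling system property that the build sets for tests; the object name is hypothetical:

```scala
import org.apache.spark.SparkConf

// Hypothetical demo object, not part of the Spark codebase.
object LoadDefaultsDemo {
  def main(args: Array[String]): Unit = {
    // Simulate the build-level test default that disables the Spark UI,
    // analogous to the system properties set in project/SparkBuild.scala.
    System.setProperty("spark.ui.enabled", "false")

    // `new SparkConf()` defaults to loadDefaults = true, so it copies any
    // "spark.*" system properties into the conf at construction time.
    val withDefaults = new SparkConf()

    // `new SparkConf(false)` skips that step entirely, which is why suites
    // mixing in the old SharedSparkContext still tried to launch the UI.
    val withoutDefaults = new SparkConf(loadDefaults = false)

    println(withDefaults.getBoolean("spark.ui.enabled", defaultValue = true))    // false
    println(withoutDefaults.getBoolean("spark.ui.enabled", defaultValue = true)) // true
  }
}
```

Note that the defaults are captured when the `SparkConf` is constructed, so suites that mix in `SharedSparkContext` pick up whatever `spark.*` system properties the build harness has already set.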