[SPARK-47341][SQL] Fix inaccurate documentation of RuntimeConfig#get #48264

Open · wants to merge 4 commits into base: master
@@ -38,7 +38,7 @@ class ConnectRuntimeConfig private[sql] (client: SparkConnectClient)
}

/** @inheritdoc */
- @throws[NoSuchElementException]("if the key is not set")
+ @throws[NoSuchElementException]("if the key is not set and there is no default value")
def get(key: String): String = getOption(key).getOrElse {
throw new NoSuchElementException(key)
}
13 changes: 9 additions & 4 deletions sql/api/src/main/scala/org/apache/spark/sql/RuntimeConfig.scala
@@ -54,17 +54,20 @@ abstract class RuntimeConfig {
}

/**
- * Returns the value of Spark runtime configuration property for the given key.
+ * Returns the value of Spark runtime configuration property for the given key. If the key is
+ * not set yet, return `defaultValue` in its [[ConfigEntry]].
@HyukjinKwon (Member) commented on Sep 27, 2024:
I think the real error is this:

[error] Generating /__w/spark/spark/target/javaunidoc/org/apache/spark/sql/RuntimeConfig.html...
[error] /__w/spark/spark/sql/api/target/java/org/apache/spark/sql/RuntimeConfig.java:13:1:  error: reference not found
[error]    * not set yet, return <code>defaultValue</code> in its {@link ConfigEntry}.
[error]                                                                  ^
[error] /__w/spark/spark/sql/api/target/java/org/apache/spark/sql/RuntimeConfig.java:25:1:  error: reference not found
[error]    * {@link ConfigEntry} is not the desired one.
[error]             ^

Let's just switch [[ConfigEntry]] to something else or remove it. ConfigEntry isn't technically an API, so end users have no way to see it.

*
* @throws java.util.NoSuchElementException
* if the key is not set and does not have a default value
* @since 2.0.0
*/
- @throws[NoSuchElementException]("if the key is not set")
+ @throws[NoSuchElementException]("if the key is not set and there is no default value")
def get(key: String): String
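
Following up on the review comment above, one way to avoid the broken {@link ConfigEntry} reference would be to drop the Scaladoc link and describe the default in plain prose, for example (a sketch of the suggestion only, not the wording this PR ends up committing):

/**
 * Returns the value of Spark runtime configuration property for the given key. If the key is
 * not set yet, return the default value registered for that configuration entry, if any.
 *
 * @throws java.util.NoSuchElementException
 *           if the key is not set and does not have a default value
 * @since 2.0.0
 */
def get(key: String): String

Wrapping the name in backticks (`ConfigEntry`) instead of [[ConfigEntry]] would also work, since backticks render as plain code and leave no unresolved reference in the Java API docs.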

/**
- * Returns the value of Spark runtime configuration property for the given key.
+ * Returns the value of Spark runtime configuration property for the given key. If the key is
+ * not set yet, return the given `default`. This is useful when `defaultValue` of its
+ * [[ConfigEntry]] is not the desired one.
*
* @since 2.0.0
*/
@@ -78,7 +81,9 @@ abstract class RuntimeConfig {
def getAll: Map[String, String]

/**
- * Returns the value of Spark runtime configuration property for the given key.
+ * Returns the value of Spark runtime configuration property for the given key. If the key is
+ * not set yet, return `defaultValue` of its [[ConfigEntry]]. If there is no `defaultValue`,
+ * return `None`.
*
* @since 2.0.0
*/
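
As a quick illustration of the documented contract (a sketch against a local session; the key and fallback values below are just examples):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("runtime-conf-example").master("local[1]").getOrCreate()
val tzKey = "spark.sql.session.timeZone"

// Unset key with a registered default: get returns that default instead of throwing.
spark.conf.get(tzKey)
// Caller-supplied default: the given value wins over the entry's own default.
spark.conf.get(tzKey, "Europe/Amsterdam")
// getOption: Some(default) when a default exists, None when there is none.
spark.conf.getOption(tzKey)
// Only a key that is neither set nor backed by a default throws:
// spark.conf.get("this.key.was.never.set")   // java.util.NoSuchElementException
spark.stop()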
@@ -42,7 +42,7 @@ class RuntimeConfigImpl private[sql](val sqlConf: SQLConf = new SQLConf) extends
}

/** @inheritdoc */
- @throws[NoSuchElementException]("if the key is not set")
+ @throws[NoSuchElementException]("if the key is not set and there is no default value")
def get(key: String): String = {
sqlConf.getConfString(key)
}
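
For context, sqlConf.getConfString(key) already falls back to the default registered for the key's config entry before throwing, which is exactly the behavior the updated annotation describes. A minimal sketch from Spark-internal code (as the test suite below does; the unregistered key is hypothetical):

import org.apache.spark.sql.internal.SQLConf

val sqlConf = new SQLConf
// Unset, but SESSION_LOCAL_TIMEZONE has a defaultValue, so this returns it rather than throwing.
sqlConf.getConfString(SQLConf.SESSION_LOCAL_TIMEZONE.key)
// A key with no registered entry and no value set is the case that throws NoSuchElementException:
// sqlConf.getConfString("key.without.entry.or.value")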
@@ -19,7 +19,7 @@ package org.apache.spark.sql

import org.apache.spark.SparkFunSuite
import org.apache.spark.internal.config
-import org.apache.spark.sql.internal.RuntimeConfigImpl
+import org.apache.spark.sql.internal.{RuntimeConfigImpl, SQLConf}
import org.apache.spark.sql.internal.SQLConf.CHECKPOINT_LOCATION
import org.apache.spark.sql.internal.StaticSQLConf.GLOBAL_TEMP_DATABASE

@@ -81,4 +81,24 @@ class RuntimeConfigSuite extends SparkFunSuite {
}
assert(ex.getMessage.contains("Spark config"))
}

test("set and get a config with defaultValue") {
val conf = newConf()
val key = SQLConf.SESSION_LOCAL_TIMEZONE.key
// By default, the value when getting an unset config entry is its defaultValue.
assert(conf.get(key) == SQLConf.SESSION_LOCAL_TIMEZONE.defaultValue.get)
assert(conf.getOption(key).contains(SQLConf.SESSION_LOCAL_TIMEZONE.defaultValue.get))
// Get the unset config entry with a different default value, which should return the given
// default parameter.
assert(conf.get(key, "Europe/Amsterdam") == "Europe/Amsterdam")

// Set a config entry.
conf.set(key, "Europe/Berlin")
// Get the set config entry.
assert(conf.get(key) == "Europe/Berlin")
// Unset the config entry.
conf.unset(key)
// Get the unset config entry, which should return its defaultValue again.
assert(conf.get(key) == SQLConf.SESSION_LOCAL_TIMEZONE.defaultValue.get)
}
}
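
For readers without the full file: the newConf() helper used in these tests is not part of this diff; presumably it is just a thin wrapper along these lines (an assumption, not the actual suite code):

// Hypothetical reconstruction of the suite's helper: a RuntimeConfig backed by a fresh SQLConf.
private def newConf(): RuntimeConfig = new RuntimeConfigImpl(sqlConf = new SQLConf)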