Copy this object
Does the configuration contain a given parameter?
Get a parameter, falling back to a default if not set
Get a parameter; throws a NoSuchElementException if it's not set
Get all akka conf variables set on this SparkConf
Get all parameters as a list of pairs
Get a parameter as a boolean, falling back to a default if not set
Get a parameter as a double, falling back to a default if not set
Get all executor environment variables set on this SparkConf
Get a parameter as an integer, falling back to a default if not set
Get a parameter as a long, falling back to a default if not set
Get a parameter as an Option
Remove a parameter from the configuration
Set a configuration variable.
Set multiple parameters together
Set a name for your application. Shown in the Spark web UI.
Set multiple environment variables to be used when launching executors. (Java-friendly version.)
Set multiple environment variables to be used when launching executors. These variables are stored as properties of the form spark.executorEnv.VAR_NAME (for example spark.executorEnv.PATH) but this method makes them easier to set.
Set an environment variable to be used when launching executors for this application. These variables are stored as properties of the form spark.executorEnv.VAR_NAME (for example spark.executorEnv.PATH) but this method makes them easier to set.
Set a parameter if it isn't already configured
Set JAR files to distribute to the cluster. (Java-friendly version.)
Set JAR files to distribute to the cluster.
Set the master URL to connect to, such as "local" to run locally with one thread, "local[4]" to run locally with 4 cores, or "spark://master:7077" to run on a Spark standalone cluster.
Set the location where Spark is installed on worker nodes.
Return a string listing all keys and values, one per line. This is useful to print the configuration out for debugging.
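As a sketch of how the getters and setters listed above fit together (assuming spark-core is on the classpath; a SparkConf needs no running cluster, and the keys used below are illustrative):

```scala
import org.apache.spark.SparkConf

// loadDefaults = false: skip any spark.* Java system properties
val conf = new SparkConf(false)
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .setExecutorEnv("PATH", "/opt/bin")    // stored as spark.executorEnv.PATH
  .setIfMissing("spark.ui.port", "4040") // applied only because the key is unset

assert(conf.contains("spark.serializer"))
assert(conf.get("spark.missing.key", "fallback") == "fallback") // default fallback
assert(conf.getOption("spark.missing.key").isEmpty)             // Option-returning variant
assert(conf.getInt("spark.ui.port", 0) == 4040)                 // typed getter
assert(conf.get("spark.executorEnv.PATH") == "/opt/bin")

// One key=value per line; useful for debugging
println(conf.toDebugString)
```

Note that plain get("spark.missing.key") with no default would instead throw NoSuchElementException, as described above.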
Configuration for a Spark application. Used to set various Spark parameters as key-value pairs.
Most of the time, you would create a SparkConf object with new SparkConf(), which will load values from any spark.* Java system properties set in your application as well. In this case, parameters you set directly on the SparkConf object take priority over system properties.

For unit tests, you can also call new SparkConf(false) to skip loading external settings and get the same configuration no matter what the system properties are.

All setter methods in this class support chaining. For example, you can write new SparkConf().setMaster("local").setAppName("My app").

Note that once a SparkConf object is passed to Spark, it is cloned and can no longer be modified by the user. Spark does not support modifying the configuration at runtime.
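The chaining described above looks like this in practice (a sketch assuming spark-core is on the classpath; the JAR path is a made-up example):

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()            // loads spark.* system properties too
  .setMaster("local[4]")              // run locally with 4 cores
  .setAppName("My app")
  .setJars(Seq("target/my-app.jar"))  // hypothetical JAR to distribute

// Spark clones the conf once it is handed over (e.g. to a SparkContext),
// so later edits to this object are not seen at runtime. To derive a
// variant configuration, clone it yourself instead:
val testConf = conf.clone().setMaster("local").setAppName("My app (test)")

assert(conf.get("spark.master") == "local[4]")     // original unchanged
assert(testConf.get("spark.app.name") == "My app (test)")
```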