Configuration keys for individual Spark jobs

MR3 has several configuration keys that can be set for individual DAGs. For example, the user can change the value of the configuration key mr3.am.task.max.failed.attempts when submitting new DAGs. For the list of such configuration keys, see the section on DAG in Configuring MR3.

In Configuring Spark on MR3, we have seen how to adjust MR3 configuration keys by updating spark-defaults.conf. Such configuration keys are passed to the DAGAppMaster of MR3 when it starts, which means that their values are shared by all Spark jobs (and thus by all DAGs originating from them). The user can also update configuration keys for an individual Spark job (and thus for all DAGs originating from it) in a similar way: with sc.setLocalProperty() in the Spark shell and with SparkSession.sessionState.conf.setConfString() in Spark SQL. Note that each MR3 configuration key is given the prefix spark. (e.g., spark.mr3.am.task.max.failed.attempts). Here is an example of adjusting the MR3 configuration keys mr3.am.task.max.failed.attempts and mr3.am.task.concurrent.run.threshold.percent:

// Spark shell
scala> sc.setLocalProperty("spark.mr3.am.task.max.failed.attempts", "5")
scala> sc.setLocalProperty("spark.mr3.am.task.concurrent.run.threshold.percent", "90")

// Spark SQL
val sparkSession: org.apache.spark.sql.SparkSession = ...
sparkSession.sessionState.conf.setConfString("spark.mr3.am.task.max.failed.attempts", "5")
sparkSession.sessionState.conf.setConfString("spark.mr3.am.task.concurrent.run.threshold.percent", "90")
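For comparison, the same keys can be set globally in spark-defaults.conf, as described in Configuring Spark on MR3, in which case their values apply to every Spark job sharing the DAGAppMaster. A minimal sketch (the values shown are illustrative, not recommendations):

```
# spark-defaults.conf
# These values are passed to DAGAppMaster at startup and shared by all Spark jobs.
spark.mr3.am.task.max.failed.attempts              5
spark.mr3.am.task.concurrent.run.threshold.percent 90
```

Per-job settings made with sc.setLocalProperty() or setConfString() take effect only for DAGs submitted afterward by that Spark job, so they are the right tool when a single job needs different fault-tolerance behavior from the rest of the cluster.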