
How to limit the number of retries on Spark job failure?

We are running a Spark job via spark-submit, and I can see that the job will be re-submitted in the case of failure.

How can I stop it from making attempt #2 in case of a YARN container failure, or whatever the exception may be?



There are two settings that control the number of retries (i.e. the maximum number of ApplicationMaster registration attempts with YARN before the ApplicationMaster, and hence the entire Spark application, is considered failed):

- spark.yarn.maxAppAttempts - Spark's own setting.
- yarn.resourcemanager.am.max-attempts - YARN's own setting, with a default of 2.
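So to stop the job from ever getting an attempt #2, set spark.yarn.maxAppAttempts to 1, either on the command line (spark-submit --conf spark.yarn.maxAppAttempts=1 ...) or programmatically. A minimal sketch in Scala; the application name and session setup are illustrative:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object SingleAttemptJob {
  def main(args: Array[String]): Unit = {
    // Limit the application to a single ApplicationMaster attempt on YARN.
    // The effective value is still capped by yarn.resourcemanager.am.max-attempts.
    val conf = new SparkConf()
      .setAppName("single-attempt-job") // illustrative name
      .set("spark.yarn.maxAppAttempts", "1")

    val spark = SparkSession.builder().config(conf).getOrCreate()

    // ... job logic goes here ...

    spark.stop()
  }
}
```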

As you can see in YarnRMClient.getMaxRegAttempts, the actual number of attempts is the minimum of the YARN and Spark configuration settings, with YARN's being the last resort.
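For reference, here is a small, self-contained sketch of that "minimum of the two" rule; the helper is illustrative and not Spark's actual source, and only the two configuration names mentioned above are real:

```scala
object MaxAttemptsSketch {
  // sparkMaxAppAttempts: value of spark.yarn.maxAppAttempts, if set.
  // yarnMaxAttempts: value of yarn.resourcemanager.am.max-attempts (YARN's default is 2).
  def effectiveMaxAttempts(sparkMaxAppAttempts: Option[Int], yarnMaxAttempts: Int): Int =
    sparkMaxAppAttempts match {
      case Some(n) => math.min(n, yarnMaxAttempts) // Spark's setting cannot exceed YARN's
      case None    => yarnMaxAttempts              // fall back to YARN's setting
    }

  def main(args: Array[String]): Unit = {
    println(effectiveMaxAttempts(Some(1), 2)) // 1 -> only a single attempt
    println(effectiveMaxAttempts(None, 2))    // 2 -> YARN's default applies
  }
}
```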