samthebest - 1 month ago
Scala Question

How to suppress Spark logging in unit tests?

So thanks to easily googleable blogs I tried:

import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}
import org.specs2.mutable.Specification

class SparkEngineSpecs extends Specification {

  // Set each named logger to the given level, returning the previous levels
  def setLogLevels(level: Level, loggers: Seq[String]): Map[String, Level] = loggers.map { loggerName =>
    val logger = Logger.getLogger(loggerName)
    val prevLevel = logger.getLevel
    logger.setLevel(level)
    loggerName -> prevLevel
  }.toMap

  setLogLevels(Level.WARN, Seq("spark", "org.eclipse.jetty", "akka"))

  val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("Test Spark Engine"))

  // ... my unit tests

But unfortunately it doesn't work; I still get a lot of Spark output, e.g.:

14/12/02 12:01:56 INFO MemoryStore: Block broadcast_4 of size 4184 dropped from memory (free 583461216)
14/12/02 12:01:56 INFO ContextCleaner: Cleaned broadcast 4
14/12/02 12:01:56 INFO ContextCleaner: Cleaned shuffle 4
14/12/02 12:01:56 INFO ShuffleBlockManager: Deleted all files for shuffle 4
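
A likely reason the attempt above has no effect: the classes emitting those INFO lines (MemoryStore, ContextCleaner, ShuffleBlockManager) log under the org.apache.spark package, so a log4j logger named plain "spark" is never their ancestor and setting its level changes nothing. A minimal sketch of the same idea with matching logger names (my own illustration, not from the original post):

import org.apache.log4j.{Level, Logger}

// Illustrative only: use the real logger-name prefixes, ideally before the SparkContext is created
object QuietLogs {
  def silenceNoisyLoggers(): Unit =
    Seq("org.apache.spark", "org.eclipse.jetty", "akka").foreach { name =>
      Logger.getLogger(name).setLevel(Level.WARN)
    }
}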


Add the following to a log4j.properties file inside the src/test/resources dir (create the file and dir if they don't exist):

# Change this to set Spark log level
log4j.logger.org.apache.spark=WARN

# Silence akka remoting
log4j.logger.Remoting=WARN

# Ignore messages below warning level from Jetty, because it's a bit verbose
log4j.logger.org.eclipse.jetty=WARN
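
If other libraries still print INFO lines, the same file can also take over the root logger and the console appender. This is a hedged sketch modelled on Spark's conf/log4j.properties.template; the appender settings are my addition, not part of the original answer:

# Send everything to the console at WARN by default (assumption: stderr console output is acceptable)
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Per-package overrides, as above
log4j.logger.org.apache.spark=WARN
log4j.logger.org.eclipse.jetty=WARN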

When I run my unit tests (I'm using JUnit and Maven), I only receive WARN level logs; in other words, no more clutter from INFO level logs (though they can be useful at times for debugging).

I hope this helps.