Knight71 - 2 months ago
Scala Question

What does a double-parameterized function mean in Scala?

I was going through the test code for Spark. While I understand the logic behind the function given below, I don't understand why it is written this way.

What does this syntax mean, and what is the benefit of defining the function like this?

Test Code

def withStreamingContext[R](ssc: StreamingContext)(block: StreamingContext => R): R = {
  try {
    block(ssc)
  } finally {
    try {
      ssc.stop(stopSparkContext = true)
    } catch {
      case e: Exception =>
        logError("Error stopping StreamingContext", e)
    }
  }
}

Why does it have to be defined this way? Why can't it be

def withStreamingContext[R](ssc: StreamingContext, block: StreamingContext => R): R =


Well, it can. Separating arguments into two or more parameter lists is called currying. This way a two-parameter function is turned into a function that takes one argument and returns another function, which takes the second argument and returns the result. That is what happens in the code you posted. In general, every n-parameter function can be seen as a chain of n one-parameter functions (in fact, in Haskell all functions are treated like this).
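To see what the two parameter lists buy you with a shape like the one in your snippet, here is a minimal, self-contained sketch. withResource is a made-up stand-in for withStreamingContext (no Spark involved), so all the names in it are purely illustrative:

// A made-up stand-in for withStreamingContext: the resource goes in the
// first parameter list, the block of code that uses it in the second.
def withResource[R](name: String)(block: String => R): R =
  try block(name) finally println(s"cleaned up $name")

// Because the block has its own parameter list, the call site can pass it
// with braces, so the helper reads almost like a built-in control structure:
val length = withResource("ctx") { ctx =>
  println(s"using $ctx")
  ctx.length
}

// With a single parameter list, the same call has to pass the function
// literal inside the parentheses instead:
def withResourceUncurried[R](name: String, block: String => R): R =
  try block(name) finally println(s"cleaned up $name")

val length2 = withResourceUncurried("ctx", (ctx: String) => ctx.length)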

Note that Scala also has a concept of partially applied functions (PAF), which boils down to the same thing. Both PAF and currying allow you to pass only a subset of the parameters, thus getting back a function that takes the rest.

For example,

def sum(x: Int, y: Int) = x + y

can be curried, and then you could write, for example:

def sum(x: Int)(y: Int) = x + y
def addTwo = sum(2) _ // type of addTwo is Int => Int

which gives you the same function, but with its first parameter applied. Using PAF, it would be

def sum(x: Int, y: Int) = x + y
def addTwo = sum(2, _: Int)
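
In either case, addTwo ends up as an ordinary Int => Int value, so the two styles are interchangeable at the call site:

addTwo(3)                  // 5
List(1, 2, 3).map(addTwo)  // List(3, 4, 5)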