I'm new to Swift and iOS and read this in the documentation:
If you combine integer and floating-point literals in an expression, a type of Double will be inferred from the context:
let anotherPi = 3 + 0.14159
// anotherPi is also inferred to be of type Double
The literal value of 3 has no explicit type in and of itself, and so
an appropriate output type of Double is inferred from the presence of
a floating-point literal as part of the addition.
Floating-point literals are inferred to be
Double by default. You can test this in a Playground:
// Swift 3
let a = 0.14159
print(type(of: a)) // Double
let b = 1 + a      // works
let c = Int(1) + a // doesn't work
The b line works because the integer literal
1 can be inferred to be a Double (since it is being added to a
Double). The c line doesn't compile because Int(1) is explicitly an
Int, and Swift never implicitly converts between numeric types. This preserves Swift's numerical type safety while still making it easy to perform simple math.
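Since Swift requires explicit conversions between numeric types, the failing line can be fixed by converting the Int to a Double yourself. A minimal sketch (variable names follow the example above):

```swift
let a = 0.14159
let b = 1 + a          // literal 1 is inferred as Double, so this compiles
// let c = Int(1) + a  // error: '+' cannot mix Int and Double
let c = Double(1) + a  // explicit conversion to Double compiles

print(type(of: b)) // Double
print(type(of: c)) // Double
```

The same rule applies going the other way: `Int(a) + 1` compiles too, but truncates the fractional part, so you choose the conversion (and the precision loss) explicitly rather than having the compiler pick one for you.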