I'm trying to convert between binary and other radices with Swift, and this is my code:

```swift
let hexa = String(Int(a, radix: 2)!, radix: 16) // converting binary to hexadecimal
```

It fails with:

> Cannot convert value of type 'Int' to expected argument type 'String'
You're misunderstanding how integers are stored. There is no notion of a "decimal" `Int`, a "hexadecimal" `Int`, etc. When you have an `Int` in memory, it's always binary (radix 2), stored as a series of 64 or 32 bits.

When you assign the `Int` a value like `0b1010` (binary), the compiler does the necessary parsing to convert your source code's textual representation of that `Int` into a series of bits that can be stored in the `Int`'s 64 or 32 bits of memory.

When you use the `Int`, for example with `print(a)`, there is a conversion behind the scenes that takes the `Int`'s binary representation in memory and converts it into a `String` whose symbols represent the `Int` in base 10, using the symbols we're used to (0 through 9).
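To make this concrete, here is a minimal sketch (`a` is just an illustrative constant) showing that one in-memory value can be rendered as text in several radices with `String.init(_:radix:)`:

```swift
let a = 0b1010 // written as binary in source, but stored simply as bits

// The same in-memory value, rendered as text in different radices:
print(a)                    // "10"   (decimal by default)
print(String(a, radix: 2))  // "1010" (binary text)
print(String(a, radix: 16)) // "a"    (hexadecimal text)
```

The radix only matters at the boundary where bits become text; the stored value never changes.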
On a more fundamental level, it helps to understand that the notion of a radix is a construct devised purely for our convenience when working with numbers. Abstractly, a number has a magnitude that is a distinct entity, uncoupled from any radix. A magnitude can be represented concretely using a textual representation and a radix.
This is why `Int(a, radix: 2)` doesn't make sense. Even supposing such an initializer (`Int.init?(Int, radix: Int)`) existed, it wouldn't do anything. If `a = 5`, then `a` is stored as binary `0b101`. This would then be parsed from binary into an `Int`, giving you... `0b101`, or the same `5` you started with.
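A quick sketch of that point: two literals written in different radices produce the exact same value, so a radix-aware `Int`-to-`Int` conversion would have nothing to do (the constant names here are illustrative):

```swift
let fromBinary = 0b101 // binary literal in source
let fromDecimal = 5    // decimal literal in source

// Both literals compile to the same bits in memory:
print(fromBinary == fromDecimal) // true
```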
On the other hand, `String`s can have a notion of a radix, because they can be a textual representation of a decimal `Int`, a hex `Int`, etc. To convert from a `String` that contains a number, you use `Int.init?(String, radix: Int)`. The key here is that it takes a `String`:
```swift
let a = 10 // decimal 10 is stored in memory as binary 1010
let hexa = String(a, radix: 16) // the Int is converted to the String "a" (hex for 10)
```
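Putting this together, the conversion the question was after (binary text to hexadecimal text) goes through an `Int` in the middle. A minimal sketch, with `binaryText` as a placeholder name:

```swift
let binaryText = "1010" // a String holding a binary representation

// Parse the radix-2 text into an Int, then render that Int as radix-16 text.
if let value = Int(binaryText, radix: 2) {
    let hexa = String(value, radix: 16)
    print(hexa) // "a"
}
```

The `if let` handles the failable initializer: `Int.init?(String, radix:)` returns `nil` when the text isn't valid in the given radix.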