Sagi Sagi - 2 months ago
C Question

Convert integer from (pure) binary to BCD

I'm too stupid right now to solve this problem...

I want to get a BCD number (every digit is its own 4-bit representation).

For example, what I want:


  • Input: 0x202 (hex) == 514 (dec)

  • Bit representation: 0010 0000 0010

  • Output: BCD 0x415


What have I tried:

unsigned int uiValue = 0x202;
unsigned int uiResult = 0;
unsigned int uiMultiplier = 1;
unsigned int uiDigit = 0;

// get the decimal value of the BCD input
while (uiValue > 0)
{
    uiDigit = uiValue & 0x0F;           // extract the lowest nibble
    uiValue >>= 4;
    uiResult += uiMultiplier * uiDigit; // weight it by a power of ten
    uiMultiplier *= 10;
}


But I know that's very wrong: it treats 0x202 as if it were already BCD, splits it into nibbles, and turns it back into the decimal number 202, which is the opposite of what I want.

I can solve the problem on paper, but I just can't express it in simple C code.

Answer

You got it the wrong way round. Your code converts from BCD to binary, just as your question's (original) title says. But the input and output values you provided are only correct if you convert from binary to BCD. In that case, try:

while (uiValue > 0) {
    uiResult <<= 4;           /* shift previous digits up one nibble */
    uiResult |= uiValue % 10; /* append the lowest decimal digit    */
    uiValue /= 10;
}

Proof: http://ideone.com/R0reQh
