project_ananthu - 10 months ago 38

C Question

I wrote a function to count the number of ones in the binary representation of a decimal number. For example, if the input is 3, its binary form is 0011, so the output should be 2, because there are two 1s in the binary number. How can I calculate the time complexity of this function?

```c
#include <stdio.h>

void numberof_1(int n)
{
    int i, count = 0;

    if ((n & 1) == 1)
        count = count + 1;

    for (i = 0; i < 32; i++)
    {
        n = n >> 1;
        if ((n & 1) == 1)
        {
            count = count + 1;
        }
    }

    printf("the number of ones = %d\n", count);
}

int main(void)
{
    numberof_1(10);
    return 0;
}
```

Answer

Speaking from a purely algorithmic point of view, the time complexity of `numberof_1()` is `O(log(n))` (where `n` is the input number).

The number is represented in binary, and thus has `log_2(n)` bits. Your algorithm iterates over all of these bits.

However, note that to actually achieve that `O(log n)` time complexity, you should add a conditional `break` when `n == 0` (to avoid redundant iterations once the number has already reached 0).

From a technical point of view, integers are represented by a fixed number of bits, and if you treat that size as a constant, then the algorithm's running time is indeed `O(1)`, since the number of iterations is bounded by a hard constant.

Source (Stack Overflow)