I wrote a function to calculate the number of ones in the binary representation of a decimal number. If the input is 3, its binary is 0011 and it should print the output as two, because there are two 1s in the binary number. How can I calculate the time complexity of this function?
#include <stdio.h>

void numberof_1(int n)
{
    int count = 0;
    /* check each of the int's bits in turn (assumes n >= 0) */
    for (unsigned i = 0; i < sizeof(int) * 8; i++) {
        if ((n & 1) == 1)
            count++;
        n = n >> 1;
    }
    printf("\nthe number of ones =%d\n", count);
}
Speaking from a purely algorithmic point of view, the time complexity of this function is O(log n) (where n is the input number). The number is represented in binary, and thus has log_2(n) bits; your algorithm iterates over all of these bits. However, note that to really achieve that O(log n) time complexity, you should add a conditional check for n == 0, so that the loop stops once the number is already 0 instead of iterating redundantly over the remaining zero bits.
From a technical point of view, integers are represented by a constant number of bits, and if you treat that size as a constant, the algorithm's run time is indeed O(1): the number of iterations is limited by a hard bound (the bit width of the type), regardless of the input value.