Musty Musty - 6 months ago
Java Question

Big-O (O(n)) and Big-Omega (Ω(n)) time complexity of the Count(A, B, n) algorithm

What is the big-O (O(n)) and big-Omega (Ω(n)) time complexity, in terms of n, of the Count(A, B, n) algorithm below?


I did the following:
    Algorithm Count(A, B, n)
        Input: Arrays A and B of size n, where n is even;
               A and B store integers.
                                                # of operations:
        i <- 0                                  1
        sum <- 0                                1
        while i < n/2 do                        n/2 + 1
            if A[i + n/2] < 0 then              n/2 + 1
                for j <- i + n/2 to n do        n/2 + (n/2 + 1) + … + n = n(n+1)/2
                    sum <- sum + B[j]           n/2 + … + n = n(n+1)/2
            i <- i + 1                          n/2 + 1
        return sum                              1
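For reference, a direct Java translation of the pseudocode might look like the sketch below (method and variable names are assumed; the pseudocode's "to n" is read as an exclusive bound, since B[n] would be out of bounds in Java):

```java
// A sketch in Java of the Count(A, B, n) pseudocode above (names assumed).
static int count(int[] a, int[] b, int n) {
    int i = 0;
    int sum = 0;
    while (i < n / 2) {
        if (a[i + n / 2] < 0) {
            // "for j <- i + n/2 to n" taken as j < n to stay in bounds
            for (int j = i + n / 2; j < n; j++) {
                sum += b[j];
            }
        }
        i = i + 1;
    }
    return sum;
}
```

For example, with A = {0, 0, -1, -1} and B = {1, 2, 3, 4}, both tests of A succeed and the call returns 3 + 4 + 4 = 11; with A = {0, 0, 1, 1} the inner loop never runs and it returns 0.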


Based on this table, algorithm Count runs in O(n^2) and Ω(n^2). The algorithm involves a nested "for" loop, and the maximum number of primitive operations it executes is

0.5n^2 + n + 1

in both the worst case and the best case.
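To sanity-check the quadratic worst case, one can count how many times the inner loop body executes when every tested element of A is negative (an instrumented sketch, names assumed):

```java
// Counts inner-loop body executions of Count in the worst case, i.e. when
// every A[i + n/2] tested is negative, so the inner for loop always runs.
static long worstCaseInnerOps(int n) {
    long ops = 0;
    int i = 0;
    while (i < n / 2) {
        for (int j = i + n / 2; j < n; j++) {  // runs n/2 - i times here
            ops++;
        }
        i = i + 1;
    }
    return ops;
}
```

For n = 8 this gives 10 and for n = 16 it gives 36: the count roughly quadruples as n doubles, which is the signature of quadratic growth.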

Answer

I think the worst case is O(n^2), while the best case is Ω(1). The best case is when you have an array of size 2, and A[1] >= 0, in which case the algorithm only has to pass through the loop once. If n can be 0 (i.e. empty arrays), then that's even better.

To illustrate the best case (n = 2, assuming 0 is not acceptable, and A[1] >= 0), let's assume that an assignment operation takes constant time, C1.

    i <- 0
    sum <- 0

takes constant time 2*C1.

    while 0 < 2/2 do

takes constant time C2.

    if A[0 + 2/2] < 0 then //evaluates to false when A[1] >= 0

takes constant time C3, and the nested for loop will never get executed.

    i <- i + 1

takes constant time C4. The loop then re-checks its condition:

    while 1 < 2/2 do

which takes C2. Then the operation returns:

    return sum

which takes C5.

So the total cost in the best-case scenario, where n = 2 and A[1] >= 0, is:

2*C1 + 2*C2 + C3 + C4 + C5 = O(1)

Now, you can argue that it should have been:

2*C1 + (n/2 + 1)*C2 + C3 + C4 + C5 = O(n)

but we already know that at best n = 2, which is constant.
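The best-case trace above can be checked with a small Java sketch (array values assumed): with n = 2 and A[1] >= 0, the if test fails on the only while iteration, so the inner loop body is never reached and only the constant-time steps run.

```java
// Traces Count for the best case n = 2, A[1] >= 0: the if test fails on the
// single while iteration, so the inner for loop contributes nothing.
static int countBestCase() {
    int[] a = {5, 7};   // A[1] = 7 >= 0, example values assumed
    int[] b = {1, 2};
    int n = 2;
    int i = 0;                          // C1
    int sum = 0;                        // C1
    while (i < n / 2) {                 // C2, evaluated twice in total
        if (a[i + n / 2] < 0) {         // C3, false: inner loop skipped
            for (int j = i + n / 2; j < n; j++) {
                sum += b[j];            // never reached in this case
            }
        }
        i = i + 1;                      // C4
    }
    return sum;                         // C5, returns 0 here
}
```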

In the case where n = 0 is allowed, the best-case cost is even smaller:

2*C1 + C2 + C5 = O(1)