curiousBiggie - 21 days ago 13

Java Question

I have a very general question about calculating time complexity (Big-O notation). When people say that the worst-case time complexity of QuickSort is O(n^2) (picking the first element of the array as the pivot every time, on an array sorted in reverse order), which operations do they count to get O(n^2)? Do they count the comparisons made by the if/else statements? Or do they only count the total number of swaps? In general, how do you know which "steps" to count when calculating Big-O notation?

I know this is a very basic question, but I've read almost all the articles on Google and still haven't figured it out.

Answer

**Worst cases of Quick Sort**

The worst cases of Quick Sort occur when the array is already sorted in reverse order, already sorted normally, or when all elements are equal. In each of these cases a fixed pivot choice (first or last element) makes every partition maximally unbalanced.

**Understand Big-Oh**

Having said that, let us first understand what Big-Oh of something means.

When we have only an asymptotic upper bound, we use O-notation. For a given function g(n), we denote by O(g(n)) the set of functions

O(g(n)) = { f(n) : there exist positive constants c and n_{o}

such that 0 <= f(n) <= c·g(n) for all n >= n_{o} }
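
As a quick sanity check of this definition, here is one worked example (my own illustration, with the constants chosen for convenience):

```
f(n) = 2n^2 + 3n + 1
     <= 2n^2 + 3n^2 + n^2     (since 3n <= 3n^2 and 1 <= n^2 for all n >= 1)
     =  6n^2
```

So choosing c = 6 and n_{o} = 1 satisfies 0 <= f(n) <= c·g(n) with g(n) = n^2, which is exactly what f(n) = O(n^2) means.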

**How do we calculate Big-Oh?**

Big-Oh essentially describes how a program's running time grows as the input size grows.

Here is the code:

```
import java.util.*;

class QuickSort
{
    // Lomuto partition: uses the last element A[r] as the pivot
    // and returns the pivot's final index.
    static int partition(int A[], int p, int r)
    {
        int x = A[r];           // pivot value
        int i = p - 1;          // boundary of the "<= pivot" region
        for (int j = p; j <= r - 1; j++)
        {
            if (A[j] <= x)      // one comparison per loop iteration
            {
                i++;
                int t = A[i];   // swap A[i] and A[j]
                A[i] = A[j];
                A[j] = t;
            }
        }
        int temp = A[i + 1];    // move the pivot into its final position
        A[i + 1] = A[r];
        A[r] = temp;
        return i + 1;
    }

    static void quickSort(int A[], int p, int r)
    {
        if (p < r)
        {
            int q = partition(A, p, r);
            quickSort(A, p, q - 1);  // sort elements left of the pivot
            quickSort(A, q + 1, r);  // sort elements right of the pivot
        }
    }

    public static void main(String[] args) {
        int A[] = {5, 9, 2, 7, 6, 3, 8, 4, 1, 0};
        quickSort(A, 0, 9);
        Arrays.stream(A).forEach(System.out::println);
    }
}
```

Take into consideration the following blocks of statements:

```
// Block 1: before the loop in partition
int x = A[r];
int i = p - 1;

// Block 2: the body of the loop in partition
if (A[j] <= x)
{
    i++;
    int t = A[i];
    A[i] = A[j];
    A[j] = t;
}

// Block 3: after the loop in partition
int temp = A[i + 1];
A[i + 1] = A[r];
A[r] = temp;
return i + 1;

// The body of quickSort itself
if (p < r)
{
    int q = partition(A, p, r);
    quickSort(A, p, q - 1);
    quickSort(A, q + 1, r);
}
```

Assume each statement takes a constant time *c*. Let's count the cost of each block.

The first block takes *2c* time (two statements).
The second block takes at most *5c* time (five statements).
The third block takes *4c* time (four statements).

Each of these costs is a constant that does not depend on the input size, so we write it as O(1): *2c*, *5c* and *4c* are all O(1).

But look what happens when we run the second block inside the loop:

```
for(int j=p;j<=r-1;j++)
{
if(A[j]<=x)
{
i++;
int t = A[i];
A[i] = A[j];
A[j] = t;
}
}
```

The loop runs r − p times, which is on the order of n, the size of the subarray, i.e., (r − p)·O(1) = O(n). A single call to partition always does this O(n) work; what changes between the best and worst case is how unbalanced the two recursive calls are, and therefore how many times partition is invoked on large subarrays.
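
This per-call cost can be rolled up into the usual recurrences (a standard sketch, not part of the original answer; the average case needs a more careful argument, but it lands at the same bound):

```
Worst case (maximally unbalanced partitions: one side gets n-1 elements):
T(n) = T(n-1) + c·n      =>  T(n) = Θ(n^2)

Best case (perfectly balanced partitions: each side gets about n/2):
T(n) = 2T(n/2) + c·n     =>  T(n) = Θ(n log n)
```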

We have now established that a single call to partition runs in O(n). What differs between cases is how the recursion unfolds. In the worst case (an already sorted or reverse-sorted array with a fixed pivot choice), every partition is maximally unbalanced, the recursion depth is O(n), and the total work is O(n) · O(n) = O(n^{2}). In the best and average cases the partitions are roughly balanced, the recursion depth is O(log n), and the work per level of recursion is O(n), giving O(n log n).
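
To tie this back to the original question of which operations to count: it usually doesn't matter whether you count comparisons, swaps, or every statement, since these counts differ only by constant factors and Big-Oh ignores constants. Below is a sketch (my own instrumented variant of the code above, with a hypothetical class name, not part of the original answer) that counts comparisons on a reverse-sorted input, the worst case for a last-element pivot:

```java
class QuickSortCount
{
    static long comparisons = 0;   // counts every A[j] <= x test

    static int partition(int A[], int p, int r)
    {
        int x = A[r];
        int i = p - 1;
        for (int j = p; j <= r - 1; j++)
        {
            comparisons++;                         // count the comparison
            if (A[j] <= x)
            {
                i++;
                int t = A[i]; A[i] = A[j]; A[j] = t;
            }
        }
        int temp = A[i + 1]; A[i + 1] = A[r]; A[r] = temp;
        return i + 1;
    }

    static void quickSort(int A[], int p, int r)
    {
        if (p < r)
        {
            int q = partition(A, p, r);
            quickSort(A, p, q - 1);
            quickSort(A, q + 1, r);
        }
    }

    public static void main(String[] args)
    {
        int n = 100;
        int[] A = new int[n];
        for (int k = 0; k < n; k++) A[k] = n - k;  // reverse-sorted input

        quickSort(A, 0, n - 1);

        // Worst case makes (n-1) + (n-2) + ... + 1 = n(n-1)/2 comparisons
        System.out.println(comparisons);           // prints 4950 for n = 100
    }
}
```

For n = 100 this does 100·99/2 = 4950 comparisons, roughly n^2/2, matching the O(n^2) bound; a swap counter also grows quadratically here, so either choice of "step" leads to the same Big-Oh.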

Source (Stack Overflow)
