Some books state that theta (Θ) notation describes the average case, while others state that it does not.
If Θ is not the average case, then what is called the average case with respect to algorithms?
Θ(g(n)) is not the average case, but you can use it to describe average-case performance.
Θ denotes an order of growth (an asymptotically tight bound), and you can use
Θ to describe the space/time complexity of the worst, average, or best case. For example, Quicksort's worst case is
Θ(n^2), while its average-case performance is Θ(n log n).
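To make the distinction concrete, here is a minimal sketch (not a production sort) that counts partitioning work in a naive first-element-pivot Quicksort. Feeding it an already-sorted array triggers the Θ(n^2) worst case, while a shuffled array shows the Θ(n log n) average case; the function names and counter mechanism are just illustrative choices:

```python
import random

def quicksort(a, counter):
    # Naive quicksort with a first-element pivot: already-sorted
    # input produces maximally unbalanced partitions (worst case).
    if len(a) <= 1:
        return a
    pivot, rest = a[0], a[1:]
    counter[0] += len(rest)  # count elements partitioned at this call
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left, counter) + [pivot] + quicksort(right, counter)

n = 300
worst_input = list(range(n))      # sorted: worst case for this pivot choice
avg_input = worst_input[:]
random.shuffle(avg_input)         # random order: average case

c_worst, c_avg = [0], [0]
quicksort(worst_input, c_worst)
quicksort(avg_input, c_avg)
print(c_worst[0])  # n*(n-1)/2 = 44850, quadratic growth
print(c_avg[0])    # roughly proportional to n*log2(n), far smaller
```

Both runs use the same algorithm; only the input distribution differs, which is exactly why Θ is applied per case (worst, average, best) rather than being "the average case" itself.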