Mason Walton - 2 months ago

Java Question

Suppose an algorithm that processes an array is O(log n). The algorithm takes at most 12 µs to process an array of size 25. All conditions being equal, approximately how long, at most, will the algorithm take to process an array of size 625?

Would I solve this by dividing 625/25 = 25 and then multiplying that by the 12 microseconds it takes per 25 elements ((625/25) × 12 = 300 microseconds), or is there more to it? For example, would I need to calculate the maximum number of comparisons using log2(625) + 1 and use that in the calculation? Any help is appreciated.

Answer

If the algorithm takes O(log *n*) time, then it takes *C* log *n* + *f*(*n*) time to process *n* elements. Here *C* is some constant factor and *f*(*n*) is some function that grows more slowly than log *n*.

The worst case for scaling purposes is when the *f*(*n*) term contributes nothing—i.e., when *f*(*n*) = 0—so let's forget about that term. We'll just consider *C* log *n*.

We know that

*C* log 25 = 12 µs

Therefore

*C* = 12 µs / log 25

Now if we plug in *n* = 625, we get:

*C* log *n* = (12 µs / log 25) log 625 = (12 µs / log 25) × 2 log 25 = 24 µs

The key step is that 625 = 25², so log 625 = 2 log 25; the logs cancel to a factor of exactly 2, no matter which base you use.

(The presumption in my answer is that *f*(*n*) is always non-negative and is monotonically increasing. That's not a mathematical requirement of Big-O notation, but in practice it's a reasonable restriction.)
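The arithmetic can be checked with a short sketch. The 12 µs measurement and the sizes 25 and 625 come from the question; the use of `Math.log` (natural log) is an arbitrary choice, since the base cancels out of the ratio:

```java
public class LogScaling {
    public static void main(String[] args) {
        double measuredMicros = 12.0; // given: at most 12 µs for n = 25
        double nSmall = 25.0;
        double nLarge = 625.0;

        // Solve C from C * log(25) = 12 µs, ignoring the f(n) term.
        double c = measuredMicros / Math.log(nSmall);

        // Predicted worst-case time for n = 625: C * log(625).
        double predicted = c * Math.log(nLarge);

        // Prints a value of approximately 24.0 (up to floating-point rounding),
        // since log(625) = 2 * log(25).
        System.out.printf("%.1f µs%n", predicted);
    }
}
```

Note that a linear scaling argument (625/25 × 12 µs = 300 µs) would only be right for an O(n) algorithm; for O(log n), squaring the input size merely doubles the time.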