Suppose an algorithm that processes an array is O(log n). The
algorithm takes at most 12 µs to process an array of size 25. All
conditions being equal, approximately how long, at most, will the
algorithm take to process an array of size 625?
If the algorithm takes O(log n) time, then we can model the time to process n elements as C log n + f(n), where C is some constant factor and f(n) is some function that grows more slowly than log n.
The worst case for scaling purposes is when the f(n) term contributes nothing, i.e. when f(n) = 0, so we can drop that term and consider just C log n.
We know that
C log 25 = 12 µs
C = 12 µs / log 25
Now plug in n = 625. Since 625 = 25², log 625 = 2 log 25 regardless of the base, so:
C log 625 = (12 µs / log 25)(2 log 25) = 24 µs
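
As a quick numerical check, here is a minimal Python sketch of the same arithmetic (the variable names are mine, not part of the question):

    import math

    time_25 = 12.0  # measured worst-case time in µs for n = 25

    # Solve C * log(25) = 12 µs for the constant factor C. The natural
    # log is fine here; the base cancels out of the final ratio.
    C = time_25 / math.log(25)

    # Predict the worst-case time for n = 625.
    predicted_625 = C * math.log(625)
    print(predicted_625)  # ≈ 24.0 µs
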
(The presumption in my answer is that f(n) is non-negative and monotonically increasing. That's not a mathematical requirement of Big-O notation, but in practice it's a reasonable restriction.)
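
One detail worth making explicit: the base of the logarithm never matters here, because it cancels in the ratio. A quick check of that claim, under the same assumptions as the sketch above:

    import math

    # 625 = 25**2, so log_b(625) = 2 * log_b(25) for every base b,
    # and the predicted time is 24 µs no matter which base we pick.
    for base in (2, 10, math.e):
        print(12.0 * math.log(625, base) / math.log(25, base))  # ≈ 24.0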