I answered this question about the Master Theorem (which gives the asymptotic behavior of divide-and-conquer recurrences):
Why is the recurrence T(n) = sqrt(2)*T(n/2) + log(n) case 1 of the master method?

The source of confusion (see the comments on my answer) seemed to be that the original poster did not really understand the asymptotic behavior of log(n) and n/log(n). In fact, he went so far as to post a graph "proving" that n^0.99 is larger than n/log(n). That holds for a long stretch, but it is false for large enough n (large enough being somewhere around 10^281 in this case, taking log as the natural logarithm). The logarithm grows more slowly than any positive power of n; as a consequence, n/log(n) grows asymptotically faster than n^c for any c < 1.
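You don't have to take my word for the crossover point. Where n^0.99 meets n/log(n) is exactly where log(n) meets n^0.01, and writing L = ln(n) turns that equation into L = 100*ln(L), which a simple fixed-point iteration can solve. (This little script is my own illustration, and it assumes log means the natural logarithm.)

```python
from math import log

# Solve ln(n) = n**0.01 for the large root, working with L = ln(n).
# Taking ln of both sides of L = e**(0.01*L) gives L = 100*ln(L).
L = 1000.0
for _ in range(100):
    L = 100 * log(L)  # converges: the map's derivative 100/L is < 1 near the root

print(f"ln(n) at the crossover: {L:.2f}")   # about 647.3
print(f"n is about 10**{L / log(10):.0f}")  # about 10**281
```

Below that point, n^0.99 really is bigger; the graph wasn't lying, it just stopped about 280 orders of magnitude too soon.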
What I realized is that this might be some student's first (and maybe only!) experience with infinitesimals! Normally we think of infinitesimals as numbers "smaller than any positive real number". For example, before epsilon-delta proofs and modern analysis, the differential and integral calculus informally treated dx as an infinitesimal value. But while students are told the "idea" is that dx is a very small value, they are repeatedly cautioned not to treat it as an actual number.
The surreal numbers (and thus the class of combinatorial Games too) have infinitesimals, but they are far outside the normal curriculum.
So what model is a student supposed to apply when confronted with O(log n) or Θ(log n)? It behaves exactly like an infinitesimal in the family of asymptotic growth rates. Recall that big-O means bounded above, Ω means bounded below, and Θ means bounded both above and below. Thus:
log n = Ω( 1 )
log n = O( n^c ) for any c > 0, e.g., log n = O(n), log n = O(sqrt(n)), log n = O(n^0.0001)
n^k log n = O( n^c ) for any c > k.
n / log n = Ω( n^c ) for any c < 1
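You can watch the second relation take over numerically. Evaluating log(n) against n^0.001 directly overflows for interesting n, but the sign of ln(log n) − c*ln(n) tells you which one is bigger, and it flips from positive to negative as n grows. (A throwaway script of mine, using natural logs.)

```python
from math import log

c = 0.001
for k in (10, 100, 10_000, 1_000_000):  # n = 10**k
    L = k * log(10)                     # ln(n)
    # positive: log(n) is still ahead; negative: n**c has overtaken it
    print(f"n = 10**{k:>7}: ln(log n) - c*ln(n) = {log(L) - c * L:+.2f}")
```

For c = 0.001 the flip happens somewhere between 10^100 and 10^10000, which is exactly why finite graphs are so misleading here.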
Nor does raising the logarithm to a power make any difference.
log^100 n = (log n)^100 = O( n^c ) for any c > 0
Crazy, right? But we can get Wolfram Alpha to calculate the crossover point, say,
when does (log n)^100 = n^0.01? At about 4.3*10^50669.
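That figure is easy to double-check without Wolfram Alpha. Setting (ln n)^100 = n^0.01 and writing L = ln(n) gives 100*ln(L) = 0.01*L, i.e. L = 10000*ln(L), and the same fixed-point trick finds the large root. (My own back-of-the-envelope script, again assuming natural log.)

```python
from math import log

# Solve (ln n)**100 = n**0.01 via L = ln(n): 100*ln(L) = 0.01*L,
# i.e. L = 10_000*ln(L).
L = 1e6
for _ in range(200):
    L = 10_000 * log(L)

print(f"ln(n) at the crossover: {L:.0f}")   # about 116671
print(f"n is about 10**{L / log(10):.0f}")  # about 10**50670
```

Same ballpark as Alpha's answer, from four lines of Python.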
That's what makes the logarithm an infinitesimal. No matter what power we raise it to (think "multiply"), it is still asymptotically smaller than any positive power of n, however tiny the exponent (think "smaller than any positive real number"). And there's a whole family of infinitesimals hiding here. log log n is infinitesimally smaller than log n. Don't even ask about the inverse Ackermann function or the iterated-log function.
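To see how deep the rabbit hole goes, apply the same trick one level down. Substituting m = ln(n), the claim that (log log n)^100 = O(log n) becomes (log m)^100 = O(m), and its crossover solves M = 100*ln(M) for M = ln(m) — the same root as before, except the answer is now doubly exponential. (Again just my own sketch, in natural logs.)

```python
from math import log

# Crossover of (ln ln n)**100 and ln(n): with m = ln(n) and M = ln(m),
# the equation (ln m)**100 = m reduces to M = 100*ln(M).
M = 1000.0
for _ in range(100):
    M = 100 * log(M)

# M = ln(ln(n)) at the crossover, so n = e**(e**M): a power tower.
print(f"ln(ln(n)) at the crossover: {M:.2f}")  # about 647.3
print(f"so log10(n) is itself about 10**{(M - log(log(10))) / log(10):.0f}")
```

In other words, (log log n)^100 stays ahead of log n until log10(n) is itself a 281-digit number. No plot will ever show you that.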
So it's not surprising that students might have difficulty if this is their first journey outside the real numbers. Everybody can handle the fact that O(n^2) and O(n^0.99) and O(n^0.5) are different, but the fact that none of these examples will be O(log n) is kind of perplexing, because O(log n) is obviously bigger than O(n^0). (Jumping to exponentials like O(2^n) seems like an easier stretch.)
What was your first encounter with an infinitesimal?