Whee!

Feb 19, 2005 16:21

Paper is submitted to WADS. Eight pages, 4 code listings, no figures, 2 references.

paper

Comments 4

codefreez February 20 2005, 19:11:24 UTC
Isn't Big-O always worst case?

chouyu_31 February 20 2005, 21:16:13 UTC
Yes and no. We teach it as worst-case because that is less confusing for everyone. But one can also use Big-O for the average case, as in "O(n) randomized", or as a general order notation, as in "it is in the realm of O(n^2)". One can even use Big-O for bounds we don't know are tight: by the definition, our O(n) worst-case algorithm is also O(n^2). How can that be?

All that is required is that ALG <= c * FCN for all sufficiently large n, where FCN is the function inside the O(...), ALG is the true running time of the algorithm, and c is some positive constant. Set FCN to n^2, ALG to n, and c to 1, and the relation holds for every n >= 1. It won't be asymptotically tight, but it will be true.
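That claim is easy to sanity-check numerically. A minimal sketch in Python (the function name and the bound on n are mine; a finite range can only illustrate the definition, not prove it):

```python
# Informal check of the definition used above:
# ALG(n) is O(FCN(n)) if ALG(n) <= c * FCN(n) for all n past some n0.

def bounded_above(alg, fcn, c, n0, n_max=10_000):
    """Check alg(n) <= c * fcn(n) for every n in [n0, n_max]."""
    return all(alg(n) <= c * fcn(n) for n in range(n0, n_max + 1))

# n = O(n^2): holds with c = 1 from n0 = 1 on, just not tight.
loose = bounded_above(lambda n: n, lambda n: n * n, c=1, n0=1)

# n^2 = O(n) fails: for any fixed c, n^2 > c * n once n exceeds c.
too_tight = bounded_above(lambda n: n * n, lambda n: n, c=100, n0=1)
```

Here `loose` comes out true and `too_tight` false, matching the point that O(...) only gives an upper bound.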

We use "O(n) time in the worst-case" because it is explicit. Technically, we could have said that our algorithm was Theta(n); that is, it runs in O(n) time and Omega(n) time. But not many people are interested in the best case (which here just happens to match our worst case), so omitting the Omega, and the resulting Theta, gives people what they expect.

Clear as mud

utilitygeek February 21 2005, 23:02:27 UTC
Heh.

I always thought of it as "Big-O, Middle-O, and Little-O". Don't tell Molran the Magnificent, though.

chouyu_31 February 22 2005, 01:50:54 UTC
There are actually five:
O, o, Theta, Omega, and little-omega. Though little-omega is basically worthless.
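For reference, the standard definitions of all five, in the same style as the O(...) definition above (a sketch; the o and omega forms are stated here via limits, assuming g(n) > 0 for large n):

```latex
f \in O(g)      &\iff \exists\, c > 0,\ n_0:\ f(n) \le c\,g(n)\ \ \forall n \ge n_0 \\
f \in \Omega(g) &\iff \exists\, c > 0,\ n_0:\ f(n) \ge c\,g(n)\ \ \forall n \ge n_0 \\
f \in \Theta(g) &\iff f \in O(g) \text{ and } f \in \Omega(g) \\
f \in o(g)      &\iff \lim_{n\to\infty} f(n)/g(n) = 0 \\
f \in \omega(g) &\iff \lim_{n\to\infty} f(n)/g(n) = \infty
```

So little-omega is just the strict version of Omega, which is why it rarely tells you anything Omega doesn't.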
