Yes and no. We teach that it is always worst-case because that is less confusing for everyone. One can also use Big-O for an average case, or for the expected time of a randomized algorithm, or even as loose order-of-magnitude shorthand, as in "it is in the realm of O(n^2)". One can also use Big-O for bounds we don't know to be exact, because according to the definition, our O(n) worst-case algorithm is also O(n^2). How can that be?
All that is required is that FCN(n)/ALG(n) >= c for all sufficiently large n, where FCN is the function inside the O(...), ALG is the true running time of the algorithm, and c is some positive constant. Set FCN to n^2, ALG to n, and c to 1: then n^2/n = n >= 1 for all n >= 1, so the relation holds. It won't be asymptotically tight, but it will be true.
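To make the ratio condition concrete, here is a minimal sketch (Python used purely for illustration; the names `alg`, `fcn`, and the sampled values of n are made up, not from any library) that checks FCN(n)/ALG(n) >= c for growing n:

```python
# Sketch: numerically checking the Big-O ratio condition.
# ALG is the true running time (linear here); FCN is the
# candidate bound we are testing against (quadratic here).

def alg(n):
    # Hypothetical true running time of the algorithm: linear.
    return n

def fcn(n):
    # Candidate function inside the O(...): quadratic.
    return n * n

c = 1  # the constant from the definition

# For all sufficiently large n, fcn(n)/alg(n) must stay >= c.
ratios = [fcn(n) / alg(n) for n in (10, 100, 1000, 10_000)]
assert all(r >= c for r in ratios)  # n^2 / n = n, which only grows

print(ratios)  # the ratio diverges rather than hovering near a constant
```

Because the ratio diverges instead of settling near a constant, the bound is not tight: n is O(n^2) but not Theta(n^2).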
We say "O(n) time in the worst-case" because it is explicit. We could technically have said that our algorithm runs in Theta(n) time; that is, it runs in O(n) time and Omega(n) time. But few people are interested in the best case (which here happens to match the worst case), so omitting the Omega, and with it the Theta, gives people what they expect.
Comments (4)
Clear as mud
I always thought of it as "Big-O, Middle-O, and Little-O". Don't tell Molran the Magnificent, though.
O, o, Theta, Omega, and little-omega. Though little-omega is basically worthless.