Hacker News

> Big-O is an upper bound on the growth of a function. The function being bounded can (per his item #3) represent anything of interest.

> In practice it is common for the function to be the average running time of an algorithm.

In other words, Big-O gives you the worst case of whatever you analyzed. You do the complexity analysis and then assign the result to a Big-O class; that class represents the worst case for that particular analysis. If you analyze the average running time, its Big-O class will be the worst-case/upper-bound of the average, and you obviously can't expect it to say anything about the algorithm's actual worst case because, well, you didn't analyze the algorithm's worst case.
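For reference, the textbook definition (standard notation, not quoted from the article) says f is O(g) when some constant multiple of g eventually dominates f:

```latex
f(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 \in \mathbb{N} :\ f(n) \le c \cdot g(n) \ \text{for all } n \ge n_0
```

Nothing in the definition fixes what f is; it can be the worst-case running time, the average running time, or anything else you chose to analyze.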

Big-O is worst-case; you just have to know what it's the worst case of. If the function being bounded is just "the quicksort algorithm's running time over all inputs", then Big-O gives the upper bound for the algorithm itself, aka the worst case for the algorithm, namely O(n²).
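To make the quicksort example concrete, here's a minimal sketch (my own illustration, not from the article: the first-element-pivot quicksort and the comparison counter are assumptions for the demo). An already-sorted input drives this variant into its O(n²) worst case, while a random input stays near the O(n log n) average:

```python
import random

def quicksort(xs, counter):
    # Classic quicksort with a first-element pivot (no randomization),
    # counting one comparison per element partitioned.
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    counter[0] += len(rest)
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left, counter) + [pivot] + quicksort(right, counter)

n = 500
worst = [0]
quicksort(list(range(n)), worst)            # sorted input: n(n-1)/2 = 124750 comparisons
avg = [0]
quicksort(random.sample(range(n), n), avg)  # random input: typically a few thousand
print(worst[0], avg[0])
```

The same Big-O machinery bounds either count; which one you call "the" complexity depends on which function you fed into the analysis.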

This seems to be just an issue of convention: does "Big-O" refer to the upper bound on the algorithm's running time, or on its average running time? That's been discussed elsewhere, but it seems like programmers usually imply the average, while the strict definition says nothing about which one you pick. So my point was that the original article should have just said to be careful about vocabulary, not asserted that Big-O isn't about worst case, since it is.



