O(1) means that the time an algorithm takes is constant, whatever the input size. O(n) means that the time grows proportionally to the input size (plus a possible constant).
Algorithm complexity is interesting as a tool to compare how algorithms scale with the input size. But what this tool doesn't tell you is which algorithm is the fastest: the notation hides constant factors. An O(1) algorithm that takes one hour to complete is worthless compared to an O(n) algorithm that takes seconds to do the same for reasonable input sizes.
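A toy cost model makes this concrete. The numbers below are made up for illustration: an O(1) algorithm with a huge fixed cost versus an O(n) algorithm with a small per-element cost. The O(n) algorithm wins until the input grows past the crossover point.

```python
def cost_constant(n):
    """Hypothetical O(1) algorithm: a huge fixed cost, regardless of n."""
    return 1_000_000  # work units

def cost_linear(n):
    """Hypothetical O(n) algorithm: a small cost per element."""
    return 10 * n  # work units

# Find the input size where the O(1) algorithm finally pays off.
crossover = next(n for n in range(1, 10**7)
                 if cost_linear(n) >= cost_constant(n))
print(crossover)  # 100000
```

Below 100,000 elements (with these invented constants), the "worse" O(n) algorithm is the faster one. Asymptotic class only tells you which algorithm wins *eventually*, not where that break-even point lies.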