In studying the ASYMPTOTIC efficiency of algorithms, we look at input sizes large enough that only the order of growth of the running time is relevant; that is, we are concerned with how the running time of an algorithm increases as the size of the input increases without bound. When the input size becomes extremely large, only the highest-order terms of the running time are significant, and even ignoring constant factors, the growth rates themselves provide an excellent indicator of whether a given algorithm will be able to run in a reasonable amount of time on a problem of a given size. Big Oh notation and worst-case analysis are tools that greatly simplify our ability to compare the efficiency of algorithms.
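As a concrete illustration (not drawn from the original text), the short Python sketch below compares two hypothetical cost functions: a quadratic running time of 3n^2 + 100n + 500 steps and an n log n running time with a deliberately larger constant factor, 64 n log2(n) steps. The constants and function names are assumptions chosen only for illustration. At small n the lower-order terms and constant factors matter, but as n grows the leading term accounts for essentially all of the quadratic cost and the algorithm with the smaller order of growth pulls far ahead.

```python
import math

# Hypothetical cost models (illustrative only, not from the source text).
def t_quadratic(n):
    # 3n^2 + 100n + 500 "steps"
    return 3 * n**2 + 100 * n + 500

def t_nlogn(n):
    # 64 n log2(n) "steps" -- larger constant factor, smaller order of growth
    return 64 * n * math.log2(n)

for n in (10, 1_000, 1_000_000):
    quad = t_quadratic(n)
    # Fraction of the quadratic cost contributed by the leading 3n^2 term:
    leading_share = (3 * n**2) / quad
    print(f"n={n:>9,}  3n^2 share of quadratic cost={leading_share:6.1%}  "
          f"quadratic/nlogn ratio={quad / t_nlogn(n):10,.1f}")
```

Running this shows that at n = 10 the lower-order terms dominate the quadratic cost and the n log n algorithm is actually slower, while at n = 1,000,000 the 3n^2 term accounts for nearly 100% of the quadratic cost and the n log n algorithm is thousands of times cheaper, which is exactly the behavior that asymptotic analysis captures.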