In computer science, the time complexity of an algorithm quantifies the amount of time it takes to run as a function of the length of the string representing the input. Time complexity is commonly expressed using big O notation, which excludes coefficients and lower-order terms. When expressed this way, the time complexity is said to be described asymptotically, i.e., as the input size goes to infinity. For example, if the time required by an algorithm on all inputs of size
n is at most 5n³ + 3n for any n (bigger than some n₀), the asymptotic time complexity is O(n³).
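
To see how the coefficient and the lower-order term drop out, here is a minimal Python sketch; the function step_count and the witness constants c = 8 and n₀ = 1 are illustrative assumptions, not part of the source.

```python
# Illustrative sketch (step_count, c = 8, n0 = 1 are assumptions, not
# from the source): a routine that performs exactly 5n^3 + 3n elementary
# steps is O(n^3), because the coefficient 5 and the lower-order term 3n
# are absorbed into the asymptotic bound.

def step_count(n: int) -> int:
    """Perform and count exactly 5n^3 + 3n elementary steps."""
    steps = 0
    for _ in range(5):              # 5 passes over a triple nested loop: 5n^3 steps
        for _ in range(n):
            for _ in range(n):
                for _ in range(n):
                    steps += 1
    for _ in range(3):              # 3 passes over a single loop: the 3n term
        for _ in range(n):
            steps += 1
    return steps

# Since 3n <= 3n^3 for all n >= 1, we have 5n^3 + 3n <= 8n^3, so the
# constants c = 8 and n0 = 1 witness the O(n^3) bound.
for n in (1, 10, 50):
    assert step_count(n) == 5 * n**3 + 3 * n
    assert step_count(n) <= 8 * n**3
```

Any constant c greater than 5 would serve as a witness for sufficiently large n; choosing c = 8 simply makes n₀ = 1 a valid cutoff.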