In mathematics, big O notation describes the limiting behavior of a function when the argument tends towards a particular value or infinity, usually in terms of simpler functions. It is a member of a larger family of notations called Landau notation, Bachmann–Landau notation (after Edmund Landau and Paul Bachmann), or asymptotic notation. In
computer science, big O notation is used to classify algorithms by how their running time or working-space requirements grow as the input size grows. In analytic number theory, it is used to estimate the error committed when the asymptotic size, or asymptotic mean size, of an arithmetical function is replaced by the value, or mean value, it takes at a large finite argument. A famous example is the problem of estimating the remainder term in the prime number theorem.
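The computer-science use of the notation can be made concrete by counting operations rather than measuring wall-clock time. The sketch below (an illustration, not drawn from the article; the function names are chosen for this example) counts worst-case comparisons for two classic search algorithms: linear search, whose comparison count grows as O(n), and binary search on sorted data, whose comparison count grows as O(log n).

```python
def linear_search_comparisons(haystack, needle):
    # Scans elements one by one; in the worst case every element
    # is examined, so the count grows linearly: O(n).
    comparisons = 0
    for item in haystack:
        comparisons += 1
        if item == needle:
            break
    return comparisons

def binary_search_comparisons(haystack, needle):
    # Requires sorted input; halves the search range each step,
    # so the count grows logarithmically: O(log n).
    comparisons = 0
    lo, hi = 0, len(haystack) - 1
    while lo <= hi:
        comparisons += 1
        mid = (lo + hi) // 2
        if haystack[mid] == needle:
            break
        elif haystack[mid] < needle:
            lo = mid + 1
        else:
            hi = mid - 1
    return comparisons

for n in (1_000, 1_000_000):
    data = list(range(n))
    # Worst case for both: searching for the last element.
    print(n, linear_search_comparisons(data, n - 1),
          binary_search_comparisons(data, n - 1))
```

Multiplying the input size by 1,000 multiplies the linear search's comparison count by 1,000 but adds only about ten comparisons to the binary search's count, which is exactly the distinction the O(n) versus O(log n) classification captures.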