# infinitesimal

An infinitesimal is a number that is greater than zero yet smaller
than any positive real number. In a
sense, infinitesimals are to the small what infinity is to the large. They were first introduced by Isaac Newton and Gottfried Leibniz in their early versions
of calculus; however, the lack of a rigorous definition for them stood in
the way of calculus being fully accepted.
As Bertrand Russell later put it:
"Calculus required continuity, and continuity was supposed to require the
infinitely little; but nobody could discover what the infinitely little
might be." In the 1800s, calculus was put on a firmer footing by Augustin-Louis Cauchy, Karl Weierstrass,
and others, who clarified and redefined the notion of a limit without reference to infinitesimals. When a function *f* (*x*)
can be made as close as desired to *L* by taking *x* close enough
to *a*, then *L* is the limit of *f* (*x*) as *x* approaches *a*. This is the classical or epsilon-delta formulation
of calculus, named for the common use of delta for |*x* - *a*|
and epsilon for |*f* (*x*) - *L*|. For a long time, this was
the only rigorous foundation for calculus, and it is still the one taught
in most calculus classes. But in 1960, Abraham Robinson discovered nonstandard analysis,
which provides a rigorous formulation of infinitesimals, confers on them
a new significance, and brings them closer to the vision of Newton and Leibniz.
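The epsilon-delta formulation can be made concrete with a standard worked example (chosen here for illustration; the particular limit is not from the original text). To show that 3*x* + 1 approaches 7 as *x* approaches 2, one exhibits a delta for every epsilon:

```latex
\[
  \lim_{x \to 2} (3x + 1) = 7:
  \quad \text{given any } \varepsilon > 0,\ \text{choose } \delta = \varepsilon / 3.
\]
\[
  0 < |x - 2| < \delta
  \;\Longrightarrow\;
  |(3x + 1) - 7| = 3\,|x - 2| < 3\delta = \varepsilon.
\]
```

No infinitesimal quantity appears anywhere in the argument: "as close as desired" is captured entirely by the interplay of the finite numbers epsilon and delta, which is what allowed Cauchy and Weierstrass to dispense with the infinitely little.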