In mathematics, specifically in the theory of generalized functions, the limit of a sequence of distributions is the distribution that sequence approaches. The distance, suitably quantified, to the limiting distribution can be made arbitrarily small by selecting a distribution sufficiently far along the sequence. This notion generalizes a limit of a sequence of functions; a limit as a distribution may exist when a limit of functions does not.
The notion is a part of distributional calculus, a generalized form of calculus that is based on the notion of distributions, as opposed to classical calculus, which is based on the narrower concept of functions.
Given a sequence of distributions $f_k$, its limit $f$ is the distribution given by
$$f(\varphi) = \lim_{k \to \infty} f_k(\varphi)$$
for each test function $\varphi$, provided that distribution exists. The existence of the limit means that (1) for each $\varphi$, the limit of the sequence of numbers $f_k(\varphi)$ exists and that (2) the linear functional $f$ defined by the above formula is continuous with respect to the topology on the space of test functions.
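As an illustrative numerical sketch (not taken from the source), the following evaluates the pairings $f_k(\varphi)$ for a sequence of narrowing Gaussians against one fixed test function. As ordinary functions the sequence has no limit, since the values at the origin grow without bound, yet the pairings converge; the distributional limit is the Dirac delta. The test function $\varphi(x) = e^{-x^2}$ and the grid are arbitrary choices for the demonstration.

```python
import numpy as np

def pair(f, phi, x):
    # Pairing <f, φ> = ∫ f(x) φ(x) dx, approximated by a Riemann sum
    # on the grid x (adequate here since both factors decay rapidly).
    dx = x[1] - x[0]
    return np.sum(f * phi) * dx

x = np.linspace(-10.0, 10.0, 200001)
phi = np.exp(-x**2)   # a smooth, rapidly decaying test function with φ(0) = 1

# f_k(x) = sqrt(k/π) e^{-k x²}: each f_k has unit integral, but
# f_k(0) = sqrt(k/π) → ∞, so the sequence has no limit as functions.
for k in (1, 10, 100):
    f_k = np.sqrt(k / np.pi) * np.exp(-k * x**2)
    print(k, pair(f_k, phi, x))   # tends to φ(0) = 1 as k grows
```

For this particular $\varphi$ the pairing can be computed exactly as $\sqrt{k/(k+1)}$, so the printed values increase toward 1, matching the claim that $f_k \to \delta$ in the sense of distributions.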
More generally, as with functions, one can also consider a limit of a family of distributions.
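A standard example of such a family limit (a classical fact, not drawn from the source) is the Poisson kernel for the upper half-plane, which tends to the Dirac delta as its parameter tends to zero:

```latex
% Poisson-kernel family P_\varepsilon(x) = \frac{\varepsilon}{\pi(x^2 + \varepsilon^2)}.
% For any test function \varphi, the substitution x = \varepsilon u gives
\lim_{\varepsilon \to 0^+} \int_{-\infty}^{\infty}
    \frac{\varepsilon}{\pi\,(x^2 + \varepsilon^2)}\,\varphi(x)\,dx
  = \lim_{\varepsilon \to 0^+} \frac{1}{\pi}\int_{-\infty}^{\infty}
    \frac{\varphi(\varepsilon u)}{1 + u^2}\,du
  = \varphi(0),
% so P_\varepsilon \to \delta in the sense of distributions, even though
% P_\varepsilon(0) = 1/(\pi\varepsilon) diverges pointwise at the origin.
```

The interchange of limit and integral in the middle step is justified by dominated convergence, since $\varphi$ is bounded and $1/(1+u^2)$ is integrable.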