
Friday, 9 January 2015

Integration and statistics

Have you ever wondered why you see many integration problems of the form \int xf(x)dx or \int (x-\text{a number})^2 f(x)dx in your textbook?

Those problems are related to expected value and variance, which are applied in statistics.

Prerequisite knowledge:
For a discrete random variable, the expected value is the sum of xP(x) over all possible values x. For a continuous random variable, the probability function P(x) is replaced by a probability density function f(x), and integration takes the place of summation.
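To make the discrete case concrete, here is a small sketch (the fair six-sided die is my own illustrative example, not from the post) computing the sum of xP(x):

```python
# Discrete case: expected value of a fair six-sided die.
# E(X) is the sum of x * P(x) over all outcomes x.
values = range(1, 7)   # outcomes 1..6
p = 1 / 6              # each outcome is equally likely
expected = sum(x * p for x in values)
print(expected)  # approximately 3.5
```

The continuous definitions below replace this finite sum with an integral.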

Definitions:
Let X be an absolutely continuous random variable with probability density function f_X(x).

The expected value of X is:
E(X)=\int_{-\infty}^{\infty}xf_X(x)dx

X must satisfy the absolute integrability condition \int_{-\infty}^{\infty}|x|f_X(x)dx<\infty. This ensures that the improper integral \int_{-\infty}^{\infty}xf_X(x)dx, shorthand for \lim\limits_{t \to -\infty} \int_t^0 xf_X(x)dx+\lim\limits_{t \to \infty} \int_0^t xf_X(x)dx, is well-defined. When the absolute integrability condition is not satisfied, the expected value of X does not exist.
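As a numerical check of this definition, the sketch below approximates E(X) with a midpoint Riemann sum. The exponential density f(x)=\lambda e^{-\lambda x} with \lambda=2 is an assumed example; its expected value is known to be 1/\lambda=0.5, and the improper integral is truncated at x=50 since the density decays rapidly:

```python
import math

lam = 2.0  # rate parameter (assumed for this illustration)

def f(x):
    # Exponential probability density on [0, infinity)
    return lam * math.exp(-lam * x)

# Midpoint Riemann sum for E(X) = integral of x * f(x) dx.
# [0, 50] truncates the improper integral; the tail beyond 50 is negligible.
a, b = 0.0, 50.0
n = 200_000
h = (b - a) / n
ex = 0.0
for i in range(n):
    x = a + (i + 0.5) * h  # midpoint of the i-th subinterval
    ex += x * f(x) * h
print(ex)  # approximately 0.5 = 1/lam
```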

Let f(x) be a probability density function on the domain [a,b], and let \mu=\int_a^b xf(x)dx be the mean. Then the variance of the corresponding random variable is \int_a^b(x-\mu)^2 f(x)dx.
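The same midpoint-sum idea approximates the variance integral. The density f(x)=2x on [0,1] is an assumed example; its mean is 2/3 and its variance works out to 1/18:

```python
def f(x):
    # Assumed example density on [0, 1]; integrates to 1
    return 2 * x

a, b = 0.0, 1.0
n = 100_000
h = (b - a) / n
xs = [a + (i + 0.5) * h for i in range(n)]  # subinterval midpoints

# Mean: integral of x * f(x) dx over [a, b]
mu = sum(x * f(x) for x in xs) * h
# Variance: integral of (x - mu)^2 * f(x) dx over [a, b]
var = sum((x - mu) ** 2 * f(x) for x in xs) * h
print(mu, var)  # approximately 2/3 and 1/18
```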

Examples
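A standard first example (worked here for illustration): take the uniform density f(x)=1 on [0,1]. Then E(X)=\int_0^1 x\,dx=\frac{1}{2}, and the variance is \int_0^1 \left(x-\frac{1}{2}\right)^2 dx=\left[\frac{1}{3}\left(x-\frac{1}{2}\right)^3\right]_0^1=\frac{1}{12}.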
