In probability theory, the expected value of a random variable is, intuitively, the long-run average value of repetitions of the experiment it represents. For example, the expected value of a fair six-sided die roll is 3.5 because the average of a very large number of rolls is almost always close to 3.5. More precisely, the law of large numbers guarantees that the arithmetic mean of the observed values almost surely converges to the expected value as the number of repetitions goes to infinity. The expected value is also known as the expectation, mathematical expectation, EV, average, mean value, mean, or first moment.
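
The die example above can be sketched in code. The following is a minimal illustration (not from the original text): it computes the exact expected value of a fair die as the probability-weighted sum of outcomes, then simulates many rolls to show the sample mean settling near 3.5, as the law of large numbers predicts. The function name `expected_value` and the choice of 100,000 rolls are illustrative assumptions.

```python
import random

def expected_value(values, probs):
    """Expected value as the probability-weighted sum of outcomes.

    Illustrative helper, assuming a finite discrete distribution.
    """
    return sum(v * p for v, p in zip(values, probs))

# Exact expected value of a fair six-sided die: (1 + 2 + ... + 6) / 6 = 3.5
die_faces = range(1, 7)
die_probs = [1 / 6] * 6
exact = expected_value(die_faces, die_probs)

# Law of large numbers: the sample mean of many independent rolls
# converges toward the expected value as the number of rolls grows.
rng = random.Random(0)  # fixed seed for reproducibility
n_rolls = 100_000
sample_mean = sum(rng.randint(1, 6) for _ in range(n_rolls)) / n_rolls

print(f"exact: {exact}, sample mean over {n_rolls} rolls: {sample_mean:.3f}")
```

With 100,000 rolls the sample mean typically lands within a few hundredths of 3.5, while any single roll is of course never 3.5 itself; the expected value is a property of the distribution, not of individual outcomes.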