In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as bits) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of the entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
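To make this concrete, for two jointly discrete random variables $X$ and $Y$ with joint probability mass function $p_{(X,Y)}$ and marginal probability mass functions $p_X$ and $p_Y$, the standard definition of mutual information is

$$ \mathrm{I}(X;Y) \;=\; \sum_{y \in \mathcal{Y}} \sum_{x \in \mathcal{X}} p_{(X,Y)}(x,y)\, \log \frac{p_{(X,Y)}(x,y)}{p_X(x)\, p_Y(y)}, $$

where the base of the logarithm determines the unit (base 2 gives bits). The link to entropy mentioned above can be expressed by the identity

$$ \mathrm{I}(X;Y) \;=\; \mathrm{H}(X) - \mathrm{H}(X \mid Y) \;=\; \mathrm{H}(Y) - \mathrm{H}(Y \mid X), $$

so mutual information measures the reduction in uncertainty (entropy) about one variable that results from knowing the other; it is zero exactly when $X$ and $Y$ are independent.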