sensitivity index


English Wikipedia - The Free Encyclopedia
Sensitivity index
The sensitivity index or d' (pronounced 'dee-prime') is a statistic used in signal detection theory. It measures the separation between the means of the signal and the noise distributions, in units of the standard deviation of those distributions. For normally distributed signal and noise with means μ_S and μ_N and standard deviations σ_S and σ_N, respectively, d' is defined as:

d' = (μ_S − μ_N) / √((σ_S² + σ_N²) / 2)

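The distribution-based definition above can be sketched directly; this is a minimal illustration, with parameter names chosen here for clarity (they are not from the article):

```python
import math

def d_prime_from_distributions(mu_s, sigma_s, mu_n, sigma_n):
    # Separation of the signal and noise means, scaled by the
    # root-mean-square of the two standard deviations.
    return (mu_s - mu_n) / math.sqrt((sigma_s ** 2 + sigma_n ** 2) / 2)

# With equal unit variances, d' reduces to the mean separation:
# signal ~ N(1.5, 1), noise ~ N(0, 1) gives d' = 1.5.
print(d_prime_from_distributions(1.5, 1.0, 0.0, 1.0))  # → 1.5
```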
An estimate of d' can also be found from measurements of the hit rate and false-alarm rate. It is calculated as:

d' = Z(hit rate) − Z(false-alarm rate),

where Z(p) is the inverse of the cumulative distribution function of the standard normal distribution.
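The hit-rate/false-alarm-rate estimate can be sketched with the standard library's inverse normal CDF (Python 3.8+); the function name below is illustrative:

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    # Z is the inverse CDF (quantile function) of the standard normal.
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Example: 84% hits and 16% false alarms give z-scores of roughly
# +1 and −1, so d' is close to 2.
print(d_prime(0.84, 0.16))
```

Note that hit or false-alarm rates of exactly 0 or 1 make Z infinite, so in practice such rates are usually adjusted slightly toward 0.5 before computing d'.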



© This article uses material from Wikipedia® and is licensed under the GNU Free Documentation License and under the Creative Commons Attribution-ShareAlike License