A
randomized algorithm is an
algorithm that employs a degree of
randomness as part of its logic. The algorithm typically uses
uniformly random bits as an auxiliary input to guide its behavior, in the hope of achieving good performance in the "average case" over all possible choices of random bits. Formally, the algorithm's performance will be a
random variable determined by the random bits; thus the running time, the output, or both are random variables.
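
A classic illustration of this idea (a sketch chosen for this passage, not prescribed by it) is quicksort with a uniformly random pivot: the output is always the correctly sorted list, but the running time is a random variable over the pivot choices, with good expected performance on every input.

```python
import random

def randomized_quicksort(arr):
    """Quicksort with a uniformly random pivot.

    The random bits (pivot choices) guide the algorithm's behavior:
    the output is always correct, but the running time is a random
    variable, O(n log n) in expectation for any input.
    """
    if len(arr) <= 1:
        return list(arr)
    pivot = random.choice(arr)  # the auxiliary uniformly random input
    less    = [x for x in arr if x < pivot]
    equal   = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)

print(randomized_quicksort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```

Because the pivot is chosen at random rather than by position, no single "bad" input can force the worst case deterministically; the adversary would have to be unlucky across the random choices.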