An
analog computer is a form of
computer that uses the continuously changeable aspects of physical phenomena such as
electrical,
mechanical, or
hydraulic quantities to
model the problem being solved. In contrast,
digital computers represent varying quantities symbolically, as their numerical values change. Because an analog computer works with continuous rather than discrete values, its processes cannot be repeated with exact equivalence, as they can be with
Turing machines. Analog computers do not suffer from the
quantization noise inherent in digital computers, but are limited instead by
analog noise.
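The quantization noise mentioned above can be sketched with a small illustrative example (not part of the article itself): rounding a continuous value to a fixed number of bits introduces an error bounded by half the quantization step, and no finite resolution eliminates it entirely. The function and constants below are hypothetical choices for demonstration.

```python
def quantize(x, bits, full_scale=1.0):
    """Round x to the nearest level of a uniform quantizer
    spanning [-full_scale, +full_scale] with 2**bits levels."""
    step = 2 * full_scale / (2 ** bits)  # quantization step size
    return round(x / step) * step

value = 0.123456789              # a "continuous" quantity
for bits in (8, 16):
    q = quantize(value, bits)    # digital (quantized) representation
    step = 2 / (2 ** bits)
    err = abs(q - value)
    # Quantization error is bounded by half the step size
    assert err <= step / 2
    print(f"{bits:2d} bits: quantized={q:.9f}, error={err:.2e}")
```

Raising the resolution shrinks the error but never removes it, whereas an analog representation carries the full value down to its analog noise floor.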