In
computability theory and
computational complexity theory, a
model of computation defines the set of allowable operations used in computation and their respective costs. It is used for measuring the complexity of an
algorithm in
terms of execution time and/or
memory space: by assuming a certain model of computation, it is possible to analyze the computational resources required or to discuss the limitations of algorithms or computers.
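As a minimal illustration (not part of the article itself), the sketch below assumes a unit-cost model in which each comparison counts as one step; under that assumption, the time cost of a simple algorithm can be measured by counting comparisons directly. The function name `linear_search` is a hypothetical example, not a reference to any standard library.

```python
def linear_search(items, target):
    """Search for target, counting comparisons as the unit cost.

    Returns (index, comparisons); index is -1 if target is absent.
    """
    comparisons = 0
    for i, item in enumerate(items):
        comparisons += 1  # one unit-cost operation per comparison
        if item == target:
            return i, comparisons
    return -1, comparisons

# Worst case: target absent, so n comparisons for a list of n items,
# matching the O(n) bound the unit-cost model predicts.
index, cost = linear_search([3, 1, 4, 1, 5], 9)
# index == -1, cost == 5
```

Choosing a different model (for example, charging by bit operations rather than whole comparisons) would assign the same algorithm a different cost, which is why the model must be fixed before complexity can be analyzed.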