The clock rate typically refers to the frequency at which a chip, such as a central processing unit (CPU) or a single core of a multi-core processor, is running, and is used as an indicator of the processor's speed. It is measured in clock cycles per second or its equivalent, the SI unit hertz (Hz). The clock rate of the first generation of computers was measured in hertz or kilohertz (kHz), but in the 21st century the speed of modern CPUs is commonly advertised in gigahertz (GHz). This metric is most useful when comparing processors within the same family, holding constant other features that may affect performance.
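To make the unit relationship concrete, here is a minimal Python sketch (the function names are illustrative, not from any library) converting a clock rate in gigahertz to cycles per second and to the duration of a single clock cycle:

```python
def cycles_per_second(rate_ghz: float) -> float:
    """Convert a clock rate in GHz to cycles per second (Hz)."""
    return rate_ghz * 1e9

def clock_period_ns(rate_hz: float) -> float:
    """Return the duration of one clock cycle in nanoseconds (period = 1 / frequency)."""
    return 1e9 / rate_hz

# A 3.2 GHz CPU completes 3.2 billion cycles per second,
# so each cycle lasts 1 / (3.2e9 Hz) = 0.3125 ns.
rate_hz = cycles_per_second(3.2)      # 3.2e9 cycles per second
period = clock_period_ns(rate_hz)     # 0.3125 nanoseconds
```

Note that one cycle is not the same as one instruction: depending on the microarchitecture, a processor may complete several instructions per cycle or take many cycles per instruction, which is one reason clock rate alone is a poor cross-family comparison.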
Video card and CPU manufacturers commonly select the highest-performing units from a manufacturing batch and rate them for a higher maximum clock rate, selling them at a higher price.