36-bit
Many early computers aimed at the scientific market used a 36-bit word length. This word length was long enough to represent positive and negative integers to an accuracy of ten decimal digits (35 bits would have been the minimum). It also allowed the storage of six alphanumeric characters encoded in a six-bit character code. Prior to the introduction of computers, the state of the art in precision scientific and engineering calculation was the ten-digit, electrically powered mechanical calculator, such as those manufactured by Friden, Marchant, and Monroe. These calculators had a column of keys for each digit, and operators were trained to use all their fingers when entering numbers, so while some specialized calculators had more columns, ten was a practical limit. Computers, as the new competitor, had to match that accuracy. Decimal computers sold in that era, such as the IBM 650 and the IBM 7070, had a word length of ten digits, as did ENIAC, one of the earliest computers.
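
The two sizing claims above can be verified directly. A minimal sketch in Python (not part of the original article) checking both the 35-bit minimum for ten signed decimal digits and the six-characters-per-word packing:

    # Ten signed decimal digits: magnitudes up to 10**10 - 1, plus one sign bit.
    max_ten_digit = 10**10 - 1
    magnitude_bits = max_ten_digit.bit_length()   # 34 bits for the magnitude
    print(magnitude_bits + 1)                     # 35 -> the stated minimum word length

    # Six characters in a six-bit code fill a 36-bit word exactly.
    print(6 * 6)                                  # 36

In other words, 2^34 > 10^10 - 1 > 2^33, so 34 magnitude bits plus a sign bit give exactly the 35-bit minimum; 36 bits provide one bit of headroom while also dividing evenly into six 6-bit characters.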

© This article uses material from Wikipedia® and is licensed under the GNU Free Documentation License and under the Creative Commons Attribution-ShareAlike License