Definition: 32-bit computing


CPUs that process 32 bits as a single unit, rather than 8, 16 or 64 bits at a time. Although 32-bit CPUs were used in mainframes as early as the 1960s, personal computers began to migrate from 16 to 32 bits in the 1980s. Starting with the first 32-bit 386 chips in 1985, Intel x86 CPUs were built with a 16-bit mode for compatibility with 16-bit applications (see 386).
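
To illustrate what "processing 32 bits as a single unit" means, the following minimal C sketch (not part of the original definition) manipulates a 32-bit value that a 32-bit CPU handles in one register with one instruction; the comments note how a narrower CPU would have to split the same work.

    #include <stdint.h>
    #include <inttypes.h>
    #include <stdio.h>

    int main(void) {
        /* A 32-bit CPU holds each of these values in a single
           register and adds them with a single instruction. */
        uint32_t a = 3000000000u;   /* far too large for a 16-bit word */
        uint32_t b = 1000000000u;
        uint32_t sum = a + b;       /* one native 32-bit add */

        printf("32-bit range: 0 to %" PRIu32 "\n", UINT32_MAX);
        printf("sum = %" PRIu32 "\n", sum);   /* 4000000000 fits in 32 bits */

        /* A 16-bit CPU would synthesize the same addition from two
           16-bit adds: low halves first, then high halves plus carry. */
        return 0;
    }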

Running in 32-bit mode does not get twice as much real work done as running in 16-bit mode, because word size affects only one aspect of internal processing. The CPU's clock speed, along with the speed, size and architecture of the disks, memory and peripheral bus, all play important roles in a computer's performance (see throughput). See 8-bit computing, 16-bit computing, 64-bit computing, 128-bit computing and bit specifications.