An application that does not require a precise result. As the simplest example, 1 plus 1 might yield 2.01 or 1.98. For many applications, including imaging and artificial intelligence, "almost correct" is good enough, and such computations consume less computer time and energy. Contrast with probabilistic computing.
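The accuracy-for-speed trade-off described above can be sketched with loop perforation, one common approximate-computing technique: skipping a fraction of the input to do proportionally less work while accepting a small error. This is an illustrative sketch, not a specific product's method; the function names are invented for the example.

```python
def exact_mean(values):
    # Precise result: visits every element.
    return sum(values) / len(values)

def approx_mean(values, stride=4):
    # Approximate result: samples every `stride`-th element,
    # doing roughly 1/stride the work in exchange for a small error.
    sample = values[::stride]
    return sum(sample) / len(sample)

data = list(range(1_000_000))
print(exact_mean(data))    # 499999.5
print(approx_mean(data))   # 499998.0 -- "almost correct," at ~1/4 the work
```

Here the approximate answer is off by about 0.0003 percent, an error many imaging and AI workloads would never notice, while the loop runs a quarter as many iterations.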