You have two 3-bit sensors, A and B, that measure the same quantity -- the temperature of the room, the radioactivity level, whatever.
Both sensors are hooked up to the same CPU, which takes in the sensor readings.
You know that the sensors are designed so that their two readings can differ in at most one bit position.
We claim that if B knows that A has already sent its 3-bit reading to the CPU, then B needs to send only 2 bits, and the CPU will still be able to reconstruct B's full 3-bit reading, thereby conserving bandwidth.
How is this so?
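Before hunting for the trick, note why 2 bits is at least plausible: once the CPU knows A's reading, B's reading can only be one of four strings -- A's reading itself, or A's reading with exactly one of its three bits flipped. As a quick sanity check of the claim, here is a small brute-force sketch in Python (the helper names hamming, candidates, and find_two_bit_encoding are just for illustration, not part of the puzzle) that searches every map from 3-bit readings to 2-bit messages and confirms that at least one of them always lets the CPU tell B's four possible readings apart:

```python
from itertools import product

def hamming(x, y):
    """Number of bit positions in which the 3-bit readings x and y differ."""
    return bin(x ^ y).count("1")

def candidates(a):
    """All readings B could report when A reports a: a itself, or a with one bit flipped."""
    return [b for b in range(8) if hamming(a, b) <= 1]

def find_two_bit_encoding():
    """Search every map f from 3-bit readings to 2-bit messages, and return one
    under which B's four possible readings always get four distinct messages,
    so the CPU can recover B's reading from (A's reading, f(B's reading))."""
    for f in product(range(4), repeat=8):   # all 4^8 maps {0,...,7} -> {0,1,2,3}
        if all(len({f[b] for b in candidates(a)}) == 4 for a in range(8)):
            return f
    return None

print("one valid 2-bit encoding for B's readings:", find_two_bit_encoding())
```

Running this does turn up a valid map, so the claim checks out; the puzzle is to explain what that map is and why it works.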
Credit goes to RMMMM of Berkeley.