You have two 3-bit sensors, A and B, that measure the same thing -- the temperature of the room, radioactivity levels, whatever it happens to be.
Both sensors are hooked up to the same CPU, which takes in the sensor readings.
You know that the sensors are designed so that their two readings can differ from each other in at most one bit.
We claim that, because the CPU already has A's 3-bit reading, B only needs to send 2 bits for the CPU to reconstruct B's full 3-bit measurement, thereby conserving bandwidth.
How is this so?
Credit goes to RMMMM of Berkeley.
Let B's 3-bit sequence be abc. Then B sends the following 2-bit sequence: the first bit is 1 if a = b, and 0 otherwise; the second bit is 1 if b = c, and 0 otherwise.
The CPU can then create a similar 2-bit sequence based on A's 3-bit sequence, and compare.
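Here is a minimal Python sketch of B's encoding (the function name and the representation of a reading as a tuple of bits are my own choices, not part of the puzzle):

def encode(reading):
    # Compress a 3-bit reading (a, b, c) to 2 bits:
    # first bit is 1 if a = b, second bit is 1 if b = c.
    a, b, c = reading
    return (1 if a == b else 0, 1 if b == c else 0)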
If B's 2-bit sequence is the same as A's 2-bit sequence, then B's 3-bit sequence is the same as A's 3-bit sequence.
If B's 2-bit sequence differs from A's on the 1st bit, then B's 3-bit sequence differs from A's on the 1st bit.
If B's 2-bit sequence differs from A's on the 2nd bit, then B's 3-bit sequence differs from A's on the 3rd bit.
If B's 2-bit sequence differs from A's on both bits, then B's 3-bit sequence differs from A's on the 2nd bit. (This works because flipping the 1st bit of a reading changes only the first comparison, flipping the 3rd bit changes only the second comparison, and flipping the 2nd bit changes both -- so the pattern of code-bit differences pinpoints which reading bit, if any, differs.)
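The CPU's reconstruction step, again as a sketch in Python building on the encode function above:

def reconstruct(a_reading, b_code):
    # Given A's 3-bit reading and B's 2-bit code, recover B's reading,
    # assuming the two readings differ in at most one bit.
    a_code = encode(a_reading)
    diff = (a_code[0] != b_code[0], a_code[1] != b_code[1])
    b = list(a_reading)
    if diff == (True, False):
        b[0] ^= 1          # readings differ on the 1st bit
    elif diff == (False, True):
        b[2] ^= 1          # readings differ on the 3rd bit
    elif diff == (True, True):
        b[1] ^= 1          # readings differ on the 2nd bit
    return tuple(b)        # if the codes match, the readings are identical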
For example, say A's reading is 011 and B's reading is 001. B sends the CPU the 2-bit sequence 10. Based on A's reading, the CPU creates the 2-bit sequence 01. Since the 2-bit sequences differ on both bits, the CPU determines that B's 3-bit sequence differs from A's on the 2nd bit, and thus B's 3-bit reading is 001.
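The example above, along with every pair of 3-bit readings that differ in at most one bit, can be checked with a few lines of code (again just a sketch using the functions defined above):

from itertools import product

assert encode((0, 0, 1)) == (1, 0)                  # B's reading 001 encodes to 10
assert reconstruct((0, 1, 1), (1, 0)) == (0, 0, 1)  # CPU recovers 001 from A = 011

for a in product((0, 1), repeat=3):
    for i in (None, 0, 1, 2):          # flip no bit, or exactly one bit of a
        b = list(a)
        if i is not None:
            b[i] ^= 1
        b = tuple(b)
        assert reconstruct(a, encode(b)) == b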
Posted by tomarken on 2014-07-15 10:30:45