Here is the proof that 1 dollar = 1 cent:

1 dollar = 100 cents = 10 cents * 10 cents = 1/10 dollar * 1/10 dollar = 1/100 dollar = 1 cent

Is there an error in my calculations?
The problem lies in the step 100 cents = 10 cents * 10 cents. That is false: 100 cents = 10 * 10 cents. Going out on a limb, dimensional analysis could even say 100 cents = 10 cents^(1/2) * 10 cents^(1/2), but not 10 cents * 10 cents. Following that weird line, each 10 cents^(1/2) would be 1 dollar^(1/2), since √dollar = √(100 cents) = 10 √cent. Then 1 dollar^(1/2) * 1 dollar^(1/2) is still 1 dollar, which is what we started with.
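For comparison, here is the chain with the faulty step written correctly (a worked restatement of the point above, not part of the original puzzle); every equals sign now joins quantities of the same dimension:

\[
1\ \text{dollar} = 100\ \text{cents} = 10 \times (10\ \text{cents}) = 10 \times \tfrac{1}{10}\ \text{dollar} = 1\ \text{dollar}.
\]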
In other words, the dimension of a number is just like a factor of that number: you can't just throw another one in. It would be like saying 100 inches = 10 inches * 10 inches = 100 in². It just isn't, since linear measure is not area measure. Likewise, I wouldn't even know what a square cent is, but whatever it would be, it's not the same as a cent.
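To make the "dimension is like a factor" idea concrete, here is a minimal Python sketch (my own illustration, not part of the original post) of a toy quantity type that tracks the exponent of the cent dimension; the names Cents, value, and power are made up for this example.

class Cents:
    """A number together with the exponent of its 'cent' dimension."""
    def __init__(self, value, power=1):
        self.value = value   # numeric part
        self.power = power   # 1 = cents, 2 = square cents, ...

    def __mul__(self, other):
        if isinstance(other, Cents):
            # quantity * quantity: multiply values, add dimension exponents
            return Cents(self.value * other.value, self.power + other.power)
        # scalar * quantity: the dimension is unchanged
        return Cents(self.value * other, self.power)

    __rmul__ = __mul__

    def __eq__(self, other):
        # equal only when both the value and the dimension agree
        return (isinstance(other, Cents)
                and self.value == other.value
                and self.power == other.power)

one_dollar = Cents(100)            # 1 dollar expressed as 100 cents
correct = 10 * Cents(10)           # 10 * (10 cents)         -> 100 cents
fallacy = Cents(10) * Cents(10)    # (10 cents) * (10 cents) -> 100 square cents

print(one_dollar == correct)       # True
print(one_dollar == fallacy)       # False: square cents are not cents

Running it prints True and then False: multiplying 10 cents by 10 cents gives 100 square cents, which the equality check refuses to identify with the 100 cents that make up a dollar.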
Posted by Charlie
on 2003-05-03 03:57:28