Here is the proof that 1 dollar = 1 cent. Is there an error in my calculations?
1 dollar = 100 cents = 10 cents * 10 cents = 1/10 dollar * 1/10 dollar = 1/100 dollar = 1 cent
It really is incorrect to write 1 foot = 12 inches, simply because 1 is not equal to 12. One should instead write 1 foot ~ 12 inches, where by ~ one means "is equivalent to."
It is OK to scale such an equivalence by multiplying both sides by the same number, but one cannot multiply each side by a different number. So 1 dollar ~ 100 cents does imply 1/100 dollar ~ 1 cent (multiplying both sides by 1/100), but it does not imply 1 dollar ~ 1 cent (that would be multiplying the left side by 1 while multiplying the right side by 1/100).
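To make the scaling rule concrete, here is a small Python sketch (my own illustration; the helper name dollars_to_cents is just an assumed convenience, not anything from the puzzle):

# The equivalence 1 dollar ~ 100 cents, treated as a conversion.
def dollars_to_cents(dollars):
    # An amount in dollars corresponds to 100 times that amount in cents.
    return dollars * 100

# Multiplying both sides by 1/100 preserves the equivalence:
#   1/100 dollar ~ 1 cent
assert dollars_to_cents(1 / 100) == 1

# Multiplying only one side by 1/100 does not:
#   1 dollar is still 100 cents, not 1 cent.
assert dollars_to_cents(1) == 100
assert dollars_to_cents(1) != 1

print("Checks pass: 1 dollar ~ 100 cents, and 1/100 dollar ~ 1 cent.")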
Edited on April 10, 2004, 3:25 pm
Posted by Richard on 2004-04-10 14:13:01