Prove that for all nonnegative integers a and b such that 2a^2 + 1 = b^2, there exist nonnegative integers c and d such that 2c^2 + 1 = d^2 and a + c + d = b, or give a counterexample.
(For example, if a = 0 and b = 1, or a = 2 and b = 3, then c = 0 and d = 1.)
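Before diving into the algebra, a quick brute-force check supports the claim for small solutions. This is a minimal Python sketch; the search bound and the helper name pell_solutions are my own, not part of the problem:

from math import isqrt

def pell_solutions(limit):
    # All nonnegative (a, b) with 2*a^2 + 1 = b^2 and a <= limit.
    out = []
    for a in range(limit + 1):
        b = isqrt(2 * a * a + 1)
        if b * b == 2 * a * a + 1:
            out.append((a, b))
    return out

pairs = pell_solutions(1000)   # (0,1), (2,3), (12,17), (70,99), (408,577)
for a, b in pairs:
    # Look for a smaller solution (c, d) with a + c + d = b.
    matches = [(c, d) for c, d in pairs if a + c + d == b]
    print((a, b), "->", matches)

Every (a, b) in this range gets at least one matching (c, d), so a proof attempt seems worthwhile.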
a + c + d = b  ==>  c + d = b - a

Squaring a + c + d = b:

a^2 + c^2 + d^2 + 2(ac + ad + cd) = b^2

Using the two given equations,

b^2 = 2a^2 + 1
d^2 = 2c^2 + 1

and substituting:

a^2 + c^2 + (2c^2 + 1) + 2a(c + d) + 2cd = 2a^2 + 1
3c^2 - a^2 + 2a(b - a) + 2cd = 0
3c^2 - a^2 + 2ab - 2a^2 + 2cd = 0
3c^2 - 3a^2 + 2ab + 2cd = 0
3(c + a)(c - a) + 2(ab + cd) = 0
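As a sanity check on this identity, take the next solution after the examples, a = 12, b = 17, with c = 2, d = 3: then 3(c + a)(c - a) + 2(ab + cd) = 3*(14)*(-10) + 2*(204 + 6) = -420 + 420 = 0, as required.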
Since a, b, c, and d are nonnegative integers, the second term is nonnegative, and it is strictly positive once a > 0 (because then ab > 0). So for a > 0 the first term must be negative, which forces c < a. (The case a = 0 is covered by the example above.)
Write a = c + r, i.e., c = a - r, with r > 0.
Rearranging the identity:

2ab + 2cd = 3(c + a)(a - c)
2ab + 2(a - r)d = 3(2a - r)*r
2ab + 2(a - r)d = 6ar - 3r^2
d = (6ar - 3r^2 - 2ab) / (2(a - r)), assuming r < a (the case c = 0 can be checked directly).
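Plugging the same test solution into this formula, with a = 12, b = 17, r = a - c = 10: d = (6*12*10 - 3*10^2 - 2*12*17) / (2*(12 - 10)) = (720 - 300 - 408)/4 = 12/4 = 3, which is indeed the correct d.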
For d >= 0 we need 6ar - 3r^2 - 2ab >= 0, i.e., 3r^2 - 6ar + 2ab <= 0.

The boundary values of r are the roots

r = [6a +- sqrt(36a^2 - 24ab)] / 6
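For a = 12, b = 17, for instance, this gives r = [72 +- sqrt(5184 - 4896)]/6 = (72 +- sqrt(288))/6, roughly 9.17 and 14.83; the working value r = 10 does fall between them.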
So this can work only when
36a^2 - 24ab >= 0, and we need an integer r between the two roots for which the formula above makes d a nonnegative integer. (The discriminant itself need not be a perfect square, since r need not be one of the roots.)
With such an r, we get c = a - r and d = b - a - c = b - a - (a - r) = b - 2a + r.
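Putting the corrected pieces together as a sketch (the helper name candidates is mine; this just mechanizes the formulas above, it is not a proof):

def candidates(a, b):
    # Scan integer r with 0 < r < a; keep those where the formula
    # d = (6ar - 3r^2 - 2ab) / (2(a - r)) yields a nonnegative integer.
    out = []
    for r in range(1, a):
        num = 6*a*r - 3*r*r - 2*a*b
        den = 2*(a - r)
        if num >= 0 and num % den == 0:
            out.append((a - r, num // den))   # (c, d)
    return out

for c, d in candidates(12, 17):
    # Both required conditions hold for the recovered pair.
    print((c, d), 2*c*c + 1 == d*d, 12 + c + d == 17)   # (2, 3) True True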
There may still be some mistake in my working, but I think that reasoning along these lines will get to the answer. Where did I go wrong? I'll post this and review my calculations.
Posted by pcbouhid on 2005-07-13 18:23:54