Your first post doesn't show how you factored... the factorization just appeared. Since having the roots makes the problem trivial, I think the point is to assume two of them add up to 2 and factor from there.
Here is my factoring solution:
Assume that two of the roots add up to 2. Then one quadratic factor has the form (x^2-2x+a), and since the coefficient of x^3 is 0, the other factor must have the form (x^2+2x+b). Matching coefficients gives (a+b-4)x^2 = 0x^2 and (2a-2b)x = 12x, i.e. a+b=4 and a-b=6, so a=5 and b=-1. Since this also checks against the constant term ab=-5, our assumption is true.
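As a quick sanity check on the coefficient matching above, here is a small sketch that expands the product (x^2-2x+5)(x^2+2x-1). The quartic itself isn't quoted in this thread; the coefficients used here (zero x^3 and x^2 terms, x term 12, constant -5) correspond to x^4 + 12x - 5, which is an assumption on my part.

```python
def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists, lowest degree first."""
    out = [0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            out[i + j] += pi * qj
    return out

# (x^2 - 2x + 5) * (x^2 + 2x - 1), coefficients from constant term up
product = poly_mul([5, -2, 1], [-1, 2, 1])
print(product)  # -> [-5, 12, 0, 0, 1], i.e. x^4 + 12x - 5
```

The x^3 and x^2 coefficients come out to 0, the x coefficient to 12, and the constant to -5, matching the equations a+b=4, a-b=6, ab=-5.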
Another way to do it is by long division by (x^2-2x+a), which leaves a remainder of (-4a+20)x + (a^2-4a-5). Both terms must equal 0; setting the first to 0 gives a=5, which is also a root of a^2-4a-5.
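The long-division route can be sketched the same way. This assumes the quartic is x^4 + 12x - 5 (inferred from the coefficients used in this thread, not quoted from the original puzzle); dividing by x^2 - 2x + 5 (i.e. a = 5) should leave no remainder.

```python
def poly_div(num, den):
    """Polynomial long division, coefficients highest degree first.
    Returns (quotient, remainder)."""
    num = list(num)
    quotient = []
    while len(num) >= len(den):
        coef = num[0] / den[0]
        quotient.append(coef)
        # Subtract coef * den from the leading part of num
        for i in range(len(den)):
            num[i] -= coef * den[i]
        num.pop(0)  # leading coefficient is now zero; drop it
    return quotient, num

# Divide x^4 + 12x - 5 by x^2 - 2x + 5
q, r = poly_div([1, 0, 0, 12, -5], [1, -2, 5])
print(q, r)  # quotient x^2 + 2x - 1, remainder 0
```

The quotient x^2 + 2x - 1 is exactly the other factor found above, and the remainder (-4a+20)x + (a^2-4a-5) vanishes at a = 5.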
Posted by Gamer on 2006-11-21 00:20:32