Let f(x) be a polynomial of degree n with n distinct real roots. Prove that the polynomial
f(x)f''(x) - (f'(x))^2
has no real roots.
First note that the degree of the expression is 2n-2: if a_n is the leading coefficient of f, the leading terms of f(x)f''(x) and (f'(x))^2 combine to n(n-1)a_n^2 - (n*a_n)^2 = -n*a_n^2, which is nonzero, so the top-degree terms don't cancel. Since the degree is even, having no real roots is at least possible.
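As a quick sanity check on that degree claim (my own addition, a sympy sketch rather than part of the argument), the leading terms really don't cancel for generic symbolic coefficients, at least for n up to 6:

    import sympy as sp

    x = sp.symbols('x')
    for n in range(2, 7):
        a = sp.symbols(f'a0:{n+1}')          # generic coefficients a_0 ... a_n
        f = sum(a[k]*x**k for k in range(n+1))
        g = sp.expand(f*sp.diff(f, x, 2) - sp.diff(f, x)**2)
        # degree comes out as 2n-2, leading coefficient as -n*a_n**2
        print(n, sp.degree(g, x), sp.factor(sp.Poly(g, x).LC()))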
I decided to look at a relatively simple family of cubic polynomials:
f(x)=x^3-x+c
With a bit of calculus, there is a local maximum at (-sqrt(3)/3, c+2sqrt(3)/9) and a local minimum at (sqrt(3)/3, c-2sqrt(3)/9), and thus this cubic has n = 3 distinct real roots exactly when -2sqrt(3)/9 < c < 2sqrt(3)/9.
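Here is the same calculus done in sympy, just as a check of the numbers above (it assumes the specific family f = x^3 - x + c):

    import sympy as sp

    x, c = sp.symbols('x c', real=True)
    f = x**3 - x + c
    crit = sp.solve(sp.diff(f, x), x)        # x = -sqrt(3)/3 and x = sqrt(3)/3
    print([(r, sp.simplify(f.subs(x, r))) for r in crit])
    # local max value c + 2*sqrt(3)/9, local min value c - 2*sqrt(3)/9,
    # so three distinct real roots exactly when -2*sqrt(3)/9 < c < 2*sqrt(3)/9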
The expression becomes -3x^4 + 6cx - 1, which has no real roots when c is small in magnitude. Using synthetic division to find the values of c at which real roots first appear, we get the same boundary as above:
c=+/- 2sqrt(3)/9
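The same boundary values drop out of sympy by asking where the quartic picks up a double root, i.e. where it and its derivative vanish together (again just a check, not a different proof):

    import sympy as sp

    x, c = sp.symbols('x c', real=True)
    f = x**3 - x + c
    g = sp.expand(f*sp.diff(f, x, 2) - sp.diff(f, x)**2)
    print(g)                                 # -3*x**4 + 6*c*x - 1
    sols = sp.solve([g, sp.diff(g, x)], [x, c], dict=True)
    print([s[c] for s in sols])              # the real solutions give c = 2*sqrt(3)/9 and c = -2*sqrt(3)/9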
So it's proven for this one family. You could probably extend this to more general cubics, but I don't see how to extend it to higher degrees.
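I can't prove the higher-degree cases either, but a quick numerical spot check (my own script, using numpy and randomly generated well-separated real roots) at least suggests the claim keeps holding there:

    import numpy as np

    rng = np.random.default_rng(0)
    for n in range(2, 8):
        trials = 0
        while trials < 200:
            roots = np.sort(rng.uniform(-5, 5, n))
            if np.min(np.diff(roots)) < 0.5:     # keep the n roots well separated
                continue
            trials += 1
            f = np.poly1d(np.poly(roots))        # monic polynomial with exactly those roots
            g = f*f.deriv(2) - f.deriv(1)**2
            # every root of g should be genuinely complex (nonzero imaginary part)
            assert np.all(np.abs(g.roots.imag) > 1e-6), (n, roots)
    print("f*f'' - (f')^2 had no real roots in any trial")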
I also reckon the problem could be made even stronger if rewritten as:
f(x) has n distinct real roots if and only if the expression has no real roots.
https://www.desmos.com/calculator/mvoct27yet
Posted by Jer on 2024-02-03 10:52:49