Non existence root (Posted on 2024-02-02) Difficulty: 4 of 5
Let f(x) be a polynomial of degree n with n distinct real roots. Prove that the polynomial

f(x) f''(x) - (f'(x))^2

has no real roots.
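
A quick sanity check on a sample polynomial (a sketch assuming SymPy is available; the roots 1, 2, 3, 4 are an arbitrary choice):

import sympy as sp

x = sp.symbols('x')
# a degree-4 polynomial with 4 distinct real roots
f = sp.expand((x - 1)*(x - 2)*(x - 3)*(x - 4))
g = sp.expand(f*sp.diff(f, x, 2) - sp.diff(f, x)**2)
print(sp.real_roots(sp.Poly(g, x)))   # expect []: no real roots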

No Solution Yet Submitted by Danish Ahmed Khan    

Comments:
Some Thoughts: a tiny bit of progress (Comment 1 of 1)
First note the degree of the expression is 2n-2: both f(x)f''(x) and (f'(x))^2 have degree 2n-2, and the leading terms do not cancel (if a is the leading coefficient of f, the difference of leading coefficients is a^2*n(n-1) - a^2*n^2 = -a^2*n).  Since the degree is even, the result of no real roots is at least possible.
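
Here's a quick SymPy sketch (the setup is mine) confirming the degree claim for one sample degree, n = 5 with generic coefficients:

import sympy as sp

x = sp.symbols('x')
a = sp.symbols('a0:6')                        # generic coefficients, a5 is the leading one
f = sum(ai*x**i for i, ai in enumerate(a))
g = sp.expand(f*sp.diff(f, x, 2) - sp.diff(f, x)**2)
p = sp.Poly(g, x)
print(p.degree(), p.LC())                     # expect 8 = 2*5 - 2 and -5*a5**2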

I decided to look at a relatively simple family of cubic polynomials:
f(x)=x^3-x+c

With a bit of calculus, there is a local maximum at (-sqrt(3)/3, c+2sqrt(3)/9) and a local minimum at (sqrt(3)/3, c-2sqrt(3)/9), and thus this cubic has 3 distinct real roots exactly when -2sqrt(3)/9 < c < 2sqrt(3)/9.
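
Here's a quick SymPy sketch (variable names are just placeholders) checking those numbers:

import sympy as sp

x, c = sp.symbols('x c', real=True)
f = x**3 - x + c
crit = sp.solve(sp.diff(f, x), x)                       # [-sqrt(3)/3, sqrt(3)/3]
print([(r, sp.simplify(f.subs(x, r))) for r in crit])   # critical values c + 2*sqrt(3)/9 and c - 2*sqrt(3)/9
# 3 distinct real roots exactly when the cubic's discriminant is positive
print(sp.solve(sp.discriminant(f, x) > 0, c))           # expect -2*sqrt(3)/9 < c < 2*sqrt(3)/9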

The expression becomes -3x^4 + 6cx - 1, which has no real roots when |c| is small.  Using synthetic division to find the values of c for which it does have real roots, we get the same boundary as above:
c = +/- 2sqrt(3)/9
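
The same boundary can be checked symbolically (a SymPy sketch; it verifies the quartic and its tangency at the boundary rather than redoing the synthetic division):

import sympy as sp

x, c = sp.symbols('x c', real=True)
f = x**3 - x + c
g = sp.expand(f*sp.diff(f, x, 2) - sp.diff(f, x)**2)
print(g)                                                 # expect -3*x**4 + 6*c*x - 1
# at c = 2*sqrt(3)/9 the quartic should just touch zero, with a double root at x = sqrt(3)/3
c0, x0 = 2*sp.sqrt(3)/9, sp.sqrt(3)/3
print(sp.simplify(g.subs({c: c0, x: x0})))               # expect 0
print(sp.simplify(sp.diff(g, x).subs({c: c0, x: x0})))   # expect 0, so the root is a double root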

So it's proven for this one case.  You could probably extend this to more general cubics, but I don't see how to extend to higher degree.

I also reckon the problem could become even stronger if rewritten as:
f(x) has n distinct real roots if and only if the expression has no real roots.

https://www.desmos.com/calculator/mvoct27yet
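
Here's a rough probe of that stronger version on the same family (a SymPy sketch over a handful of sample c values, not a proof):

import sympy as sp

x = sp.symbols('x')
for cv in [0, sp.Rational(1, 10), sp.Rational(1, 3), 1, 2]:
    f = x**3 - x + cv
    g = sp.expand(f*sp.diff(f, x, 2) - sp.diff(f, x)**2)
    print(cv, len(sp.real_roots(sp.Poly(f, x))), len(sp.real_roots(sp.Poly(g, x))))
# expected pattern: 3 real roots of f pairs with 0 real roots of the expression,
# and 1 real root of f pairs with 2 real roots of the expression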


  Posted by Jer on 2024-02-03 10:52:49