Define:
d/dx(f(x)) = f'(x)
where f'(x) is the gradient (or slope) of f(x) at the point x.
Prove that:
d/dx(x²) = 2x
without using calculus.
Not sure if this is what Cheradenine had in mind, but here's the classical way of solving this:
Slope is the ratio of how far something moves along the y-axis to how far it moves along the x-axis. For a plain old line, you can determine this by comparing two points (x1, y1) and (x2, y2) and dividing the y offset by the x offset: (y2 - y1)/(x2 - x1). This works for any two points on the line because the slope never changes.
For a nonlinear function, you can use the same basic idea. However, because the slope changes, you have to approximate it by comparing two points that are very close to the x value where you want to determine the slope. The closer the points, the more accurate the approximation. If the points are infinitely close, you have an exact answer.
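To see the approximation converge numerically, here's a small Python sketch (the helper name slope and the sample point x0 = 3 are my own illustrative choices, not part of the argument):

    # Slope of the line through (x1, f(x1)) and (x2, f(x2)).
    def slope(f, x1, x2):
        return (f(x2) - f(x1)) / (x2 - x1)

    f = lambda x: x ** 2
    x0 = 3.0

    # Shrinking the gap d between the two points homes in on the
    # true slope at x0, which should be 2 * x0 = 6.
    for d in (1.0, 0.1, 0.001, 0.00001):
        print(d, slope(f, x0, x0 + d))
    # Printed slopes: 7.0, 6.1, 6.001, 6.00001 -> approaching 6.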
For f(x) = x², we use the points (x, f(x)) and (x + d, f(x + d)) where d is some nonzero number. Computing the slope, we get:
(f(x + d) - f(x)) / ((x + d) - x)
= ((x + d)² - x²) / d
= (x² + 2xd + d² - x²) / d
= (2xd + d²) / d
= 2x + d.
The exact answer uses an infinitely small value for d: as d shrinks toward zero, the leftover d term vanishes, which gives d/dx(x²) = 2x.
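For anyone who wants a machine check, SymPy can reproduce both the simplification and the infinitely-small-d step as a limit (a quick sketch assuming SymPy is installed; the limit is of course calculus, so this verifies the result rather than replacing the argument):

    # Sanity check with SymPy: simplify the difference quotient,
    # then let d go to 0.
    from sympy import symbols, limit, expand

    x, d = symbols('x d')

    quotient = ((x + d)**2 - x**2) / d
    print(expand(quotient))       # d + 2*x, i.e. 2x + d as above
    print(limit(quotient, d, 0))  # 2*x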