Suppose you’re working on an algebraic expression that involves variables, addition, multiplication, and parentheses. You try repeatedly to expand it using the distributive law.
How do you know that the expression won’t continue to expand forever?
For example, expanding
(x + y)(s(u + v) + t)
gives
x(s(u + v) + t) + y(s(u + v) + t),
which has more parentheses than the original expression.
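For concreteness, here is a minimal sketch that checks those parenthesis counts mechanically. It assumes a nested-tuple encoding of expressions and a show helper that parenthesizes any sum occurring inside a product; both are just one possible choice, not anything canonical.

```python
# Minimal sketch: expressions as nested tuples.
# ('v', name) is a variable, ('+', a, b) a sum, ('*', a, b) a product.

def show(e, in_prod=False):
    """Render e, parenthesizing any sum that sits inside a product."""
    if e[0] == 'v':
        return e[1]
    a = show(e[1], e[0] == '*')
    b = show(e[2], e[0] == '*')
    if e[0] == '+':
        return f'({a} + {b})' if in_prod else f'{a} + {b}'
    return a + b  # products print by juxtaposition

def v(n): return ('v', n)

inner = ('+', ('*', v('s'), ('+', v('u'), v('v'))), v('t'))  # s(u + v) + t
before = ('*', ('+', v('x'), v('y')), inner)                 # (x + y)(s(u + v) + t)
# One distributive step at the root: (x + y)E -> xE + yE.
after = ('+', ('*', v('x'), inner), ('*', v('y'), inner))

for e in (before, after):
    txt = show(e)
    print(txt.count('('), 'pairs:', txt)
# 3 pairs: (x + y)(s(u + v) + t)
# 4 pairs: x(s(u + v) + t) + y(s(u + v) + t)
```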
It is not at all clear to me why this is difficulty 4. I await more formal proofs with great interest.
My D1 argument:
It seems somewhat obvious that if you apply the distributive rule first to the most deeply embedded parentheses, the ones that do not contain other parentheses, then you will reduce the number of parentheses at every turn, eventually arriving at an expression with no parentheses. Since the final expression must be independent of the order in which the operations are carried out, we are guaranteed that the expression cannot continue to expand forever, and in fact must eventually result in an expression with no parentheses.
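As a sanity check of this (not a proof), here is a small sketch that applies the distributive rule innermost-first and watches the parenthesis count until no rule applies. The tuple encoding, the rendering, and the tie-breaking order when several rewrites are available are all my own arbitrary choices.

```python
# Sketch of innermost-first expansion: expressions as nested tuples,
# ('v', name), ('+', a, b), ('*', a, b); rewrite inside subterms first.

def show(e, in_prod=False):
    """Render e, parenthesizing any sum that sits inside a product."""
    if e[0] == 'v':
        return e[1]
    a = show(e[1], e[0] == '*')
    b = show(e[2], e[0] == '*')
    if e[0] == '+':
        return f'({a} + {b})' if in_prod else f'{a} + {b}'
    return a + b  # products print by juxtaposition

def step(e):
    """Apply one distributive rewrite, innermost-first; None if fully expanded."""
    if e[0] == 'v':
        return None
    for i in (1, 2):                 # prefer rewriting inside a subterm
        s = step(e[i])
        if s is not None:
            return (e[0], s, e[2]) if i == 1 else (e[0], e[1], s)
    if e[0] == '*':
        a, b = e[1], e[2]
        if a[0] == '+':              # (p + q)b -> pb + qb
            return ('+', ('*', a[1], b), ('*', a[2], b))
        if b[0] == '+':              # a(p + q) -> ap + aq
            return ('+', ('*', a, b[1]), ('*', a, b[2]))
    return None

def v(n): return ('v', n)

# (x + y)(s(u + v) + t), the example above.
e = ('*', ('+', v('x'), v('y')),
     ('+', ('*', v('s'), ('+', v('u'), v('v'))), v('t')))

while e is not None:
    txt = show(e)
    print(txt.count('('), 'pairs:', txt)
    e = step(e)
```

The exact sequence of counts depends on the order chosen; with this one it goes 3, 2, 2, 2, 1, 1, 0 for the example, ending at an expression with no parentheses.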