In computer graphics, some people try to accumulate all transformations in a single matrix: A_{n+1} = T_{n} * A_{n},
where T_{n} is a small transformation such as a rotation around an axis.
They learn from experience that errors also slowly accumulate, and they end up with a transformation matrix A that is no longer orthogonal and skews the image.
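A minimal pure-Python sketch of that drift (a hypothetical 2x2 case, not anyone's production code): repeatedly multiply by a small rotation and watch A^T A move away from the identity.

```python
import math

def mat_mul(a, b):
    # plain 2x2 matrix product
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

theta = 1e-3  # small rotation applied each frame
t = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

a = [[1.0, 0.0], [0.0, 1.0]]
for _ in range(1_000_000):
    a = mat_mul(t, a)  # A_{n+1} = T * A_n, rounding error at every step

# Orthogonality check: for a true rotation, A^T A is exactly the identity.
a_t = [[a[0][0], a[1][0]], [a[0][1], a[1][1]]]
ata = mat_mul(a_t, a)
identity = [[1.0, 0.0], [0.0, 1.0]]
err = max(abs(ata[i][j] - identity[i][j]) for i in range(2) for j in range(2))
print(err)  # small but nonzero: A has drifted off the group of rotations
```

The drift per step is on the order of machine epsilon, but it compounds; the usual fix is to re-orthogonalize A periodically, or to accumulate the rotation in another representation (e.g. a unit quaternion that is renormalized).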
Or people try to solve large linear systems of floating-point numbers with a naive Gaussian elimination and end up with noise. The same goes for naive iterative eigenvector calculations.
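To illustrate the Gaussian-elimination point, here is a sketch of elimination with no pivoting (the naive version) applied to a 2x2 system with a tiny pivot; the matrix and right-hand side are made up for the demonstration:

```python
def solve_naive(a, b):
    # Gaussian elimination WITHOUT pivoting, then back-substitution.
    n = len(b)
    a = [row[:] for row in a]
    b = b[:]
    for k in range(n):
        for i in range(k + 1, n):
            m = a[i][k] / a[k][k]  # multiplier blows up when the pivot is tiny
            for j in range(k, n):
                a[i][j] -= m * a[k][j]
            b[i] -= m * b[k]
    x = [0.0] * n
    for i in reversed(range(n)):
        s = b[i] - sum(a[i][j] * x[j] for j in range(i + 1, n))
        x[i] = s / a[i][i]
    return x

# The exact solution of this system is x ~ [1, 1], but the 1e-20 pivot
# produces a multiplier of 1e20 that swamps the second row.
x = solve_naive([[1e-20, 1.0], [1.0, 1.0]], [1.0, 2.0])
print(x)  # [0.0, 1.0] -- x[0] is completely wrong
```

Partial pivoting (swapping in the largest available pivot, as LAPACK's LU solvers do) recovers the correct answer on the same system.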
And you get the classic associativity problem. If swapping the operands had produced the same expression, that would have been a contradiction proving that addition is not commutative; but that is not the case here.
It means that if addition is not associative (which we know is the case for floating point), then your example doesn't prove that addition is not commutative.
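The distinction is easy to check directly in any IEEE 754 environment: swapping operands never changes the result, but regrouping can.

```python
a, b, c = 0.1, 0.2, 0.3

# Commutativity holds for IEEE 754 addition: the rounded sum of two
# floats does not depend on operand order.
print(a + b == b + a)        # True

# Associativity does not: rounding happens after each operation,
# so the grouping changes which intermediate value gets rounded.
print((a + b) + c)           # 0.6000000000000001
print(a + (b + c))           # 0.6
print((a + b) + c == a + (b + c))  # False
```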
Of course, I didn't prove that addition is commutative in general, but there is no reason for it not to be.
I may be overlooking your point. x - x + 1 is evaluated left to right like any other additive expression, and after the x - x part is evaluated, the intermediate result is exactly zero. That would be the case with any numeric type, wouldn't it?
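A quick check of the left-to-right claim, with a large value of x chosen to expose the contrast with a different grouping:

```python
x = 1e16

# Left to right: (x - x) + 1. For any finite float x, x - x is exactly
# 0.0, so the whole expression is exactly 1.0.
print(x - x + 1)   # 1.0

# A different grouping is NOT equivalent: 1 + 1e16 rounds back to 1e16
# (the gap between adjacent doubles at this magnitude is 2), so the
# subtraction then yields 0.0.
print(1 + x - x)   # 0.0
```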