The fallacy lies in ignoring the fact that the number of x's being added is not constant. Not only is x changing; the number of x's is also changing.
Before discussing this further, we should dispose of a few less fundamental objections. For instance, it may be objected that the repeated summation definition of x² is well-defined only for positive x. However, we could extend the definition by using repeated subtraction for negative x. Having made this extension, we are left with the same conundrum.
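The extended definition can be sketched in a few lines of code (the function name is illustrative, not from the original):

```python
def square_by_summation(x):
    """x**2 defined by repeated addition for non-negative integer x,
    and by repeated subtraction for negative integer x."""
    total = 0
    if x >= 0:
        for _ in range(x):    # add x, x times
            total += x
    else:
        for _ in range(-x):   # subtract x, |x| times
            total -= x
    return total
```

For example, square_by_summation(-3) subtracts -3 three times, yielding 9, so the extension agrees with x² on negative integers as well.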
Another potential objection is that, since the function is defined only when x is an integer, it is not continuous, and therefore not differentiable. A function is said to be differentiable at a point if its derivative exists at that point. Consider, though, what happens if we extend the additive notation to cover positive real x.
For example, if x = 2.4, we write f(x) = x + x + 0.4x. If x = 2.5, we write f(x) = x + x + 0.5x, and so on.
Now, having restored continuity, we can again pose the question: why is the derivative of this function at x = 2.4 not equal to 1 + 1 + 0.4? The reason, as indicated above, is that we are ignoring the fact that the number of x's being added is also changing.
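We can check this numerically. A small sketch (variable names are my own) comparing the fallacious term-by-term answer with a difference quotient at x = 2.4:

```python
def f(x):
    return x * x   # equals x + x + 0.4x at x = 2.4, but the number of x's grows with x

a, h = 2.4, 1e-6
naive = 1 + 1 + 0.4               # differentiate each term separately, count held fixed
numeric = (f(a + h) - f(a)) / h   # the actual slope of f at a

# naive is 2.4, while numeric is close to 2a = 4.8
```

The gap between 2.4 and 4.8 is exactly the contribution of the changing number of terms.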
To make this even clearer, consider that the above extension is equivalent to the following definition: f(x) at x = 2.4 is defined as 2.4x, at x = 2.5 it is defined as 2.5x, and so on. In other words, at x = a, f(x) = a·a.
The derivative at x = a is defined as the limit, as h tends to zero, of [f(a+h) − f(a)] / h.
The fallacy lies in writing f(a+h) as a·(a+h), from which one wrongly concludes that f'(a) = a.
The correct formulation is: f(a+h) = (a+h)·(a+h). Expanding, [f(a+h) − f(a)] / h = (2ah + h²) / h = 2a + h, which tends to 2a as h tends to zero, as expected!
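The two formulations can be compared directly as difference quotients; a numerical sketch (a and h are arbitrary illustrative values):

```python
a, h = 2.4, 1e-6

# Fallacious: hold the number of x's fixed at a, so f(a+h) is taken to be a*(a+h)
fallacious = (a * (a + h) - a * a) / h       # simplifies to a

# Correct: the count grows with x, so f(a+h) = (a+h)*(a+h)
correct = ((a + h) * (a + h) - a * a) / h    # simplifies to 2a + h, tending to 2a
```

The fallacious quotient lands on a = 2.4, while the correct one lands on 2a + h ≈ 4.8, in agreement with the ordinary power rule.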