Here's the question you clicked on:
jk_16
Suppose that y = f(x) is differentiable at x = a and that g(x) = m(x - a) + c is a linear function in which m and c are constants. If the error E(x) = f(x) - g(x) were small enough near x = a, we might think of using g as a linear approximation of f instead of the linearization L(x) = f(a) + f'(a)(x - a). Show that if we impose on g the two conditions

1. E(a) = 0
2. lim as x --> a of E(x)/(x - a) = 0

then g(x) = f(a) + f'(a)(x - a). Thus, the linearization L(x) gives the only linear approximation whose error is both zero at x = a and negligible in comparison with x - a.
Can you say what problem you're having with this? Where have you got stuck?
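In case it helps you get started, here is a sketch of how the two conditions pin down m and c, worked directly from the definitions given in the problem statement (E(x) = f(x) - g(x) with g(x) = m(x - a) + c):

```latex
\begin{align*}
% Condition 1: evaluate E at x = a; the m(x-a) term vanishes there.
E(a) &= f(a) - \bigl(m(a-a) + c\bigr) = f(a) - c = 0
  \quad\Longrightarrow\quad c = f(a).\\[4pt]
% Substitute c = f(a) and split the difference quotient.
\frac{E(x)}{x-a} &= \frac{f(x) - f(a) - m(x-a)}{x-a}
  = \frac{f(x)-f(a)}{x-a} - m.\\[4pt]
% Condition 2 plus differentiability of f at a gives the limit.
0 = \lim_{x\to a}\frac{E(x)}{x-a} &= f'(a) - m
  \quad\Longrightarrow\quad m = f'(a).
\end{align*}
```

With c = f(a) and m = f'(a), g(x) = f(a) + f'(a)(x - a), which is exactly L(x).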