anonymous
Suppose that y = f(x) is differentiable at x = a and that g(x) = m(x - a) + c is a linear function in which m and c are constants. If the error E(x) = f(x) - g(x) were small enough near x = a, we might think of using g as a linear approximation of f instead of the linearization L(x) = f(a) + f'(a)(x - a). Show that if we impose on g the conditions (1) E(a) = 0 and (2) lim as x→a of E(x)/(x - a) = 0, then g(x) = f(a) + f'(a)(x - a). Thus, the linearization L(x) gives the only linear approximation whose error is both zero at x = a and negligible in comparison with x - a.
MIT 18.01 Single Variable Calculus (OCW)
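Not a verified answer, but a sketch of the standard argument the exercise asks for, using only the definitions stated in the problem:

```latex
% Condition 1: E(a) = 0.
% Since g(a) = m(a - a) + c = c, this pins down c:
E(a) = f(a) - g(a) = f(a) - c = 0
  \quad\Longrightarrow\quad c = f(a).

% Condition 2: \lim_{x \to a} E(x)/(x - a) = 0.
% With c = f(a), rewrite the difference quotient of the error:
\frac{E(x)}{x - a}
  = \frac{f(x) - f(a) - m(x - a)}{x - a}
  = \frac{f(x) - f(a)}{x - a} - m.

% Because f is differentiable at a, the first term tends to f'(a),
% so the limit equals f'(a) - m; condition 2 forces
m = f'(a).

% Substituting c = f(a) and m = f'(a) back into g:
g(x) = f(a) + f'(a)(x - a) = L(x).
```

So the two conditions together determine g completely, which is exactly the uniqueness claim in the last sentence of the problem.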
anonymous
Can you say what problem you're having with this? Where have you got stuck?
