jk_16

  • 3 years ago

Suppose that y = f(x) is differentiable at x = a and that g(x) = m(x - a) + c is a linear function in which m and c are constants. If the error E(x) = f(x) - g(x) were small enough near x = a, we might think of using g as a linear approximation of f instead of the linearization L(x) = f(a) + f'(a)(x - a). Show that if we impose on g the conditions: 1. E(a) = 0, and 2. lim as x --> a of E(x)/(x - a) = 0, then g(x) = f(a) + f'(a)(x - a). Thus, the linearization L(x) gives the only linear approximation whose error is both zero at x = a and negligible in comparison with x - a.
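
    A sketch of the standard two-step argument (not spelled out in the thread): condition 1 pins down c, and condition 2, together with the differentiability of f at a, pins down m.

    \[
    E(a) = f(a) - m(a-a) - c = f(a) - c = 0 \;\Longrightarrow\; c = f(a),
    \]
    \[
    \lim_{x\to a}\frac{E(x)}{x-a}
      = \lim_{x\to a}\left(\frac{f(x)-f(a)}{x-a} - m\right)
      = f'(a) - m = 0 \;\Longrightarrow\; m = f'(a),
    \]
    so g(x) = f(a) + f'(a)(x - a), which is exactly the linearization L(x).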

  • This Question is Closed
  1. SomeBloke
    • 3 years ago

    Can you say what problem you're having with this? Where have you got stuck?
