jk_16

  • 2 years ago

Suppose that y = f(x) is differentiable at x = a and that g(x) = m(x - a) + c is a linear function in which m and c are constants. If the error E(x) = f(x) - g(x) were small enough near x = a, we might think of using g as a linear approximation of f instead of the linearization L(x) = f(a) + f'(a)(x - a). Show that if we impose on g the conditions

  1. E(a) = 0,
  2. lim as x → a of E(x)/(x - a) = 0,

then g(x) = f(a) + f'(a)(x - a). Thus, the linearization L(x) gives the only linear approximation whose error is both zero at x = a and negligible in comparison with x - a.

  1. SomeBloke
    • 2 years ago

    Can you say what problem you're having with this? Where have you got stuck?
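
    For reference, a minimal sketch of how the two imposed conditions force g to coincide with L, assuming only the definitions stated in the question and the limit definition of the derivative:

    \[
    E(a) = f(a) - g(a) = f(a) - c = 0
    \quad\Longrightarrow\quad c = f(a).
    \]

    \[
    \frac{E(x)}{x-a} = \frac{f(x) - f(a) - m(x-a)}{x-a}
    = \frac{f(x) - f(a)}{x-a} - m,
    \]

    \[
    \lim_{x\to a}\frac{E(x)}{x-a}
    = f'(a) - m = 0
    \quad\Longrightarrow\quad m = f'(a).
    \]

    Hence g(x) = f(a) + f'(a)(x - a), which is exactly the linearization L(x).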
