 2 years ago
Suppose that y = f(x) is differentiable at x = a and that g(x) = m(x − a) + c is a linear function in which m and c are constants. If the error E(x) = f(x) − g(x) were small enough near x = a, we might think of using g as a linear approximation of f instead of the linearization L(x) = f(a) + f'(a)(x − a).
Show that if we impose on g the conditions:
1. E(a) = 0
2. lim as x → a of E(x)/(x − a) = 0
then g(x) = f(a) + f'(a)(x − a). Thus, the linearization L(x) gives the only linear approximation whose error is both zero at x = a and negligible in comparison to x − a.
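One way the requested derivation can be sketched (this is a solution outline, not part of the original thread): use condition 1 to pin down c, then condition 2 to pin down m.

```latex
\begin{align*}
% Condition 1: E(a) = 0 determines c.
E(a) &= f(a) - g(a) = f(a) - \bigl(m(a-a) + c\bigr) = f(a) - c = 0
  \quad\Longrightarrow\quad c = f(a). \\[4pt]
% Substitute c = f(a) into E(x)/(x-a) and split the quotient.
\frac{E(x)}{x-a} &= \frac{f(x) - f(a) - m(x-a)}{x-a}
  = \frac{f(x) - f(a)}{x-a} - m. \\[4pt]
% Condition 2: let x \to a; differentiability of f at a gives f'(a).
0 = \lim_{x \to a}\frac{E(x)}{x-a} &= f'(a) - m
  \quad\Longrightarrow\quad m = f'(a).
\end{align*}
```

With c = f(a) and m = f'(a), we get g(x) = f(a) + f'(a)(x − a) = L(x), which is exactly what the problem asks to show.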

This Question is Closed

SomeBloke
 2 years ago
Can you say what problem you're having with this? Where have you got stuck?