anonymous
In this video http://ocw.mit.edu/resources/res-18-005-highlights-of-calculus-spring-2010/highlights_of_calculus/the-exponential-function/ Prof. Strang multiplies the two series e^x = 1 + x + (1/2)x^2 + ... and e^X = 1 + X + (1/2)X^2 + ... to show that the product equals e^(x+X). But when I multiply the two series (e^x)(e^X) using the distributive rule, I get the following result: (1 + x + (1/2)x^2)(1 + X + (1/2)X^2) = (1 + X + (1/2)X^2) + (x + xX + (1/2)xX^2) + ((1/2)x^2 + (1/2)x^2X + (1/4)x^2X^2), which does not equal e^(x+X) = 1 + (x+X) + ((1/2)x^2 + xX + (1/2)X^2). I would be very grateful if someone could point out where I went wrong! Thanks
MIT 18.01 Single Variable Calculus (OCW)
anonymous
Assume f(x) = e^x. What you need to know is that the exponential series is infinite, so multiplying only a few terms of f(x) and f(X) is not enough; the professor is only checking the first several terms. Your work actually gives the first 3 terms of f(x+X) correctly. The better way to write it is to keep the "..." showing that the series go on: (1 + x + (1/2)x^2 + ...)(1 + X + (1/2)X^2 + ...) = (1 + X + (1/2)X^2) + (x + xX + (1/2)xX^2) + ((1/2)x^2 + (1/2)x^2X + (1/4)x^2X^2) + ... . If you regroup the terms by total degree, this becomes 1 + (x+X) + ((1/2)x^2 + xX + (1/2)X^2) + (1/2)xX^2 + (1/2)x^2X + (1/4)x^2X^2 + ... = 1 + (x+X) + (1/2)(x+X)^2 + (1/2)xX^2 + (1/2)x^2X + (1/4)x^2X^2 + ... . The part 1 + (x+X) + (1/2)(x+X)^2 is exactly the first three terms of f(x+X). The leftover terms (1/2)xX^2 + (1/2)x^2X + (1/4)x^2X^2 + ... are pieces of the higher terms (1/3!)(x+X)^3 + ... + (1/n!)(x+X)^n + ...; the pieces that seem to be missing (such as (1/6)x^3 and (1/6)X^3) come from the terms of e^x and e^X that you dropped when you cut each series off at the x^2 and X^2 terms. The professor goes on to check the (x+X)^3 term as well, and it works out, so he concludes that f(x)*f(X) = f(x+X). The deduction is not completely rigorous, since it only checks the first few powers rather than every power n. Here's another way to do it: http://www.pa.msu.edu/~stump/champ/exp.pdf.
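To see this concretely, here is a small check you could run with Python's sympy (just a sketch; the names x, X, t and the cutoff order N = 6 are my own choices, not anything from the lecture): multiply the Taylor polynomial of e^x by the Taylor polynomial of e^X, both cut off at the same order, and compare with the Taylor polynomial of e^(x+X). Every term up to the cutoff cancels, and the leftover terms, like the (1/2)xX^2 above, all have total degree at least the cutoff.

```python
# Sketch: compare the product of truncated exponential series with the
# truncated series for e^(x+X).  N = 6 is an arbitrary cutoff for illustration.
from sympy import symbols, exp, series, expand, Poly

x, X, t = symbols('x X t')
N = 6  # cutoff order

# Taylor polynomials of e^x and e^X up to (but not including) degree N.
ex = series(exp(x), x, 0, N).removeO()
eX = series(exp(X), X, 0, N).removeO()
product = expand(ex * eX)

# Taylor polynomial of e^t to the same order, with t replaced by x + X.
target = expand(series(exp(t), t, 0, N).removeO().subs(t, x + X))

# The difference contains only terms of total degree >= N, i.e. the mismatch
# lives entirely in the part of each series that the truncation threw away.
leftover = Poly(expand(product - target), x, X)
print(min(sum(m) for m in leftover.monoms()))  # prints 6, never anything smaller
```

In the full (infinite) series there is no such leftover at any degree, which is what a complete proof like the one in the linked PDF makes precise.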
