In this video http://ocw.mit.edu/resources/res-18-005-highlights-of-calculus-spring-2010/highlights_of_calculus/the-exponential-function/ Prof. Strang multiplies the two series e^x = 1 + x + (1/2)x^2 + ... and e^X = 1 + X + (1/2)X^2 + ... to show that the product equals e^(x+X). But when I multiply the two series (truncated at the quadratic term) using the distributive rule, I get: (1 + x + (1/2)x^2)(1 + X + (1/2)X^2) = (1 + X + (1/2)X^2) + (x + xX + (1/2)xX^2) + ((1/2)x^2 + (1/2)x^2 X + (1/4)x^2 X^2), which does not equal the expansion e^(x+X) = 1 + (x + X) + ((1/2)x^2 + xX + (1/2)X^2) + ... .
I would be very grateful if someone could point out where I went wrong! Thanks
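For what it's worth, here is a quick numeric check I tried (my own sketch, not from the video): multiplying two partial sums of the exponential series with many terms kept, and comparing against the partial sum for e^(x+X). The function name and the sample values x = 0.3, X = 0.7 are just my choices for illustration.

```python
import math

def exp_series(t, n=20):
    """Partial sum of the exponential series: 1 + t + t^2/2! + ... + t^n/n!."""
    return sum(t**k / math.factorial(k) for k in range(n + 1))

x, X = 0.3, 0.7
product = exp_series(x) * exp_series(X)  # multiply the two (long) partial sums
direct = exp_series(x + X)               # series for e^(x+X) directly
print(product, direct)                   # the two agree to many decimal places
```

With 20 terms kept the two numbers agree to machine precision, so the mismatch I see above seems to come from cutting each series off at the x^2 term rather than from the identity itself.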
