
Find the root of the equation f(x)=2x(1−x^2+x)ln(x)-x^2+1 in the interval [0, 1] by Newton's method, where x0 = 0.5... The problem I am getting is that when I calculate x1 = x0 - f(x0)/f'(x0), I get 0, but the root is at 0.32... Ideas?
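Since the assignment is a five-iteration Newton's-method program, here is a minimal Python sketch of the setup (the names `f`, `fprime`, and `newton_step` are mine, and `fprime` is my own hand-derived derivative, so double-check it against your own work):

```python
import math

def f(x):
    # f(x) = 2x(1 - x^2 + x) ln(x) - x^2 + 1
    return 2 * x * (1 - x**2 + x) * math.log(x) - x**2 + 1

def fprime(x):
    # Product rule on (2x + 2x^2 - 2x^3) ln(x), plus the -2x from -x^2 + 1:
    # f'(x) = (2 + 4x - 6x^2) ln(x) + (2 + 2x - 2x^2) - 2x
    #       = (2 + 4x - 6x^2) ln(x) + 2 - 2x^2
    return (2 + 4 * x - 6 * x**2) * math.log(x) + 2 - 2 * x**2

def newton_step(x):
    return x - f(x) / fprime(x)

x0 = 0.5
x1 = newton_step(x0)
print(x1)  # lands on 0, where ln(x) is undefined, so the next step fails
```

With x0 = 0.5 this reproduces the reported behavior: the very first step sends the iterate to 0, and f(0) cannot be evaluated because of the logarithm.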



Are you sure that's the right equation? That has no roots in the interval [0,1].
Yep, that is what is on my assignment. The original question was to write a computer program that iterates 5 times and calculates the root using Newton's method, but my program works for everything except this function.
If the equation is \[2x(1-x^2+x)\ln(x)-x^2-1,\] there are no roots. If it's \[2x(1-x^2+x)(\ln(x)-x^2-1),\] there is a root.
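One way to back up the "no roots" claim for the first reading is to sample it across (0, 1] and confirm it never gets anywhere near zero (a numeric sanity check under my reading of the formula, not a proof):

```python
import math

def g(x):
    # First reading: 2x(1 - x^2 + x) ln(x) - x^2 - 1
    return 2 * x * (1 - x**2 + x) * math.log(x) - x**2 - 1

# On (0, 1], ln(x) <= 0 and 2x(1 - x^2 + x) >= 0, so the first term is <= 0,
# and -x^2 - 1 <= -1: g stays at or below -1, so it cannot cross zero there.
samples = [g(k / 1000) for k in range(1, 1001)]
print(max(samples))  # stays well below zero
```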


Other answers:

Well, the original function was f(x)=2x(1-x^2+x)ln(x)=x^2-1. Not sure if that changes things?
Ah. Then you should get \[2x(1-x^2+x)\ln(x)-x^2\large{\bf +1}.\] That might work better.
Yeah, I made a typo in the original question. Still, I get stuck with x1 coming out equal to 0.
Let me see what I get.
I get x1 = 0.5 - (-0.116434)/(-0.232868), which is 0.5 - 0.5 = 0
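That cancellation looks like it's not a coincidence: at x0 = 0.5 this particular f satisfies f'(0.5) = 2·f(0.5), since f(0.5) = 1.25 ln(0.5) + 0.75 while f'(0.5) = 2.5 ln(0.5) + 1.5. So the Newton step is exactly 0.5 and x1 lands on 0, where ln(x) is undefined. A quick check (function names and the hand-derived derivative are mine):

```python
import math

def f(x):
    return 2 * x * (1 - x**2 + x) * math.log(x) - x**2 + 1

def fprime(x):
    # f'(x) = (2 + 4x - 6x^2) ln(x) + 2 - 2x^2
    return (2 + 4 * x - 6 * x**2) * math.log(x) + 2 - 2 * x**2

# f(0.5)  = 1.25 ln(0.5) + 0.75 ≈ -0.116434
# f'(0.5) = 2.5 ln(0.5) + 1.5   = 2 * f(0.5) ≈ -0.232868
step = f(0.5) / fprime(0.5)
print(step)        # 0.5, to machine precision
print(0.5 - step)  # x1 = 0: ln(0) is undefined, so the iteration dies here
```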
Hmm. I am getting the same thing. Perhaps try x0 = 0.49 or something.
It specifically says to use x0 = 0.5. I just think my professor didn't double-check his own work.
That's a likely guess. The sequence also doesn't converge to anything if \(x_0>.5\), so you might try something just below .5 after noting that \(x_0=.5\) sent you straight to 0.
Yeah, choosing x0 = 0.4 works nicely. Hopefully he acknowledges his mistake and gives me credit for it. Thanks for taking the time to look at the problem.
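For what it's worth, the five-iteration run the assignment asks for does settle down from x0 = 0.4 (a sketch; the derivative is again my own hand computation):

```python
import math

def f(x):
    return 2 * x * (1 - x**2 + x) * math.log(x) - x**2 + 1

def fprime(x):
    # f'(x) = (2 + 4x - 6x^2) ln(x) + 2 - 2x^2
    return (2 + 4 * x - 6 * x**2) * math.log(x) + 2 - 2 * x**2

x = 0.4
for i in range(5):
    x = x - f(x) / fprime(x)
    print(i + 1, x)
# converges to roughly 0.328, matching the "root is at 0.32..." observation
```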
No problem.
