anonymous
 5 years ago
The sum from 1 to infinity of (x+5)/(x^2) converges. How do you prove this?
This Question is Closed

anonymous
 5 years ago
P-series. Since the exponent in the denominator is greater than 1, the series must converge.
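
The p-series test being referenced says that the sum of 1/n^p converges exactly when p > 1. A quick numerical sketch of that test (the function name and cutoffs here are mine, not from the thread):

```python
# Numerical illustration of the p-series test: partial sums of 1/n^p
# level off for p = 2 (toward pi^2/6 ~ 1.6449) but keep growing without
# bound for p = 1 (the harmonic series, which grows like ln N).

def partial_sum(p, terms):
    """Partial sum 1/1^p + 1/2^p + ... + 1/terms^p."""
    return sum(1.0 / n**p for n in range(1, terms + 1))

for terms in (10**2, 10**4, 10**6):
    print(f"N={terms}: p=2 -> {partial_sum(2, terms):.6f}, "
          f"p=1 -> {partial_sum(1, terms):.3f}")
```

Running this shows the p = 2 column stabilizing while the p = 1 column climbs steadily, which is the behavior the test formalizes.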

anonymous
 5 years ago
And I don't have to do a comparison test because the numerator isn't 1?

anonymous
 5 years ago
Unless you need to prove conditional convergence or absolute convergence.

amistre64
 5 years ago
the bottom becomes larger than the top: 1/1000000...00000 = .00000...000001, a very small number.

anonymous
 5 years ago
Nah, I don't have to prove conditional vs. absolute convergence for this, just convergence vs. divergence.

anonymous
 5 years ago
Whether or not the numerator is 1 doesn't really matter to the comparison test. The idea of the comparison test is to show that a sequence that is smaller than yours diverges, or that a sequence that is larger than yours converges. If you can establish either of those conditions, you have proven the convergence or divergence of the series you are discussing.
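
The comparison test described above can be sketched numerically. The example series here are mine, not from the thread: since 0 <= 1/(n^2 + 1) <= 1/n^2 for every n >= 1, and the larger series sum(1/n^2) converges, the smaller series is forced to converge too.

```python
# Direct comparison test, illustrated: every partial sum of the smaller
# series 1/(n^2 + 1) is bounded above by the corresponding partial sum
# of 1/n^2, which converges. A bounded, increasing sequence of partial
# sums must converge.

N = 10**5
smaller = sum(1.0 / (n**2 + 1) for n in range(1, N + 1))
larger = sum(1.0 / n**2 for n in range(1, N + 1))

assert smaller <= larger  # the term-by-term bound carries over to partial sums
print(f"sum 1/(n^2+1) ~ {smaller:.6f} <= sum 1/n^2 ~ {larger:.6f}")
```

Note the logic only runs one way: being smaller than a convergent series proves convergence, and being larger than a divergent series proves divergence, which is the distinction the next reply raises.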

anonymous
 5 years ago
I thought it was the other way around? Proving that the series that's smaller than the known convergent one also converges, or that the series that's larger than the known divergent one diverges. I might be splitting hairs at this point, though. Thanks.

anonymous
 5 years ago
I am talking about the sequence, not the series. The relationship between the sequence and the series is that if your sequence is anything but zero, you are adding to your series (since the summation of your sequence is your series, after all).

anonymous
 5 years ago
When a series is convergent, it means that its sequence either becomes zero at a point or it shrinks too quickly to really add to the series. (0.1 + 0.01 + 0.001, etc.) is actually convergent, even though its partial sums never stop growing.
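
The example in that reply is a geometric series with first term 0.1 and ratio 1/10, so it converges to 0.1/(1 - 0.1) = 1/9, even though each new term pushes the partial sum a little higher. A minimal check:

```python
# The series 0.1 + 0.01 + 0.001 + ... is geometric with ratio r = 1/10.
# Its partial sums increase at every step but are capped by the limit
# a/(1 - r) = 0.1/0.9 = 1/9 = 0.111...

partial = 0.0
for k in range(1, 21):
    partial += 0.1**k

print(partial)   # very close to 1/9
print(1 / 9)
```

After only 20 terms the partial sum already agrees with 1/9 to well beyond float precision, since the leftover tail is a geometric remainder of size about 0.1**20.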

anonymous
 5 years ago
Okay. I'll use the p-series. Thanks for your help!