anonymous
  • anonymous
Suppose f is continuous on [0, infinity) and limit of f(x) as x approaches infinity is 1. Is it possible that the integral of f(x) from 0 to infinity is convergent?
Mathematics
anonymous
  • anonymous
yes, yes it is.
anonymous
  • anonymous
\[\lim_{a \rightarrow \infty} \int\limits_{0}^{a} f(x)dx\]
anonymous
  • anonymous
Can you show me the steps to prove that it is possible for the integral to be convergent? Because right now I only have the answer that it is not convergent, which I got using integration by parts.
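For reference, the standard comparison argument (a sketch, not taken from the thread) shows the integral must in fact diverge. Since the limit of f is 1, the integrand eventually stays above 1/2:

\[\lim_{x \rightarrow \infty} f(x) = 1 \;\Rightarrow\; \exists N \text{ such that } f(x) > \tfrac{1}{2} \text{ for all } x > N,\]

\[\int\limits_{0}^{a} f(x)dx \;\ge\; \int\limits_{0}^{N} f(x)dx + \frac{a - N}{2} \;\longrightarrow\; \infty \quad \text{as } a \rightarrow \infty.\]

So the limit defining the improper integral cannot be finite, and the integral diverges.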


More answers

anonymous
  • anonymous
If F(x) is an antiderivative of f(x), the integral converges exactly when the limit is finite: replace the upper bound with a, evaluate F(a) - F(0), and take a to infinity. Easy. It diverges when F(a) grows without bound, and it can only converge when the integrand dies off, for instance when the denominator of a quotient goes to infinity faster than the numerator. Just look at the big picture, or the behavior "in the long run".
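As a quick numerical sanity check (a sketch using a hypothetical example function, not anything from the thread): take f(x) = x^2/(x^2 + 1), which is continuous on [0, infinity) with limit 1. Its partial integrals grow roughly like a itself, so they never settle to a finite value:

```python
import math

def f(x):
    # Hypothetical example: continuous on [0, inf) with lim f(x) = 1.
    return x * x / (x * x + 1.0)

def integrate(g, a, b, n=100_000):
    # Simple midpoint-rule approximation of the integral of g over [a, b].
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Exact antiderivative here is x - arctan(x), so the partial integral
# from 0 to a equals a - arctan(a), which grows without bound:
for a in (10.0, 100.0, 1000.0):
    print(a, integrate(f, 0.0, a), a - math.atan(a))
```

The numeric and exact values agree, and both keep growing with a, illustrating that a function tending to 1 cannot have a convergent integral over [0, infinity).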
anonymous
  • anonymous
If f(x) = 1/x^p and p > 1, then it CONVERGES.
anonymous
  • anonymous
Hi quantish, if you take f(x) as 1/x^p with p > 1, what does it converge to? I get a divergent answer when I integrate that from 0 to infinity; only when I integrate it from 1 to infinity do I get a convergent answer. Thanks so much for all the help, everyone.
anonymous
  • anonymous
I was just referring to the p-series test, not to a particular function, only to the limit as x --> infinity. If the denominator increases at a greater rate than the numerator, i.e. 1/x^p with p > 1, then it should CONVERGE, not DIVERGE as was stated earlier.
