anonymous
  • anonymous
Why would the average speed of a round trip be less than the average of the speeds of the two one-way trips?
Mathematics
Xishem
  • Xishem
It wouldn't. The average velocity would be different, but the average speed would be the same.
anonymous
  • anonymous
I disagree. For example, a person drives 45 miles at 30 miles an hour and drives back at 60 miles an hour. The formula for average speed is total distance / total time, and I calculated that the average speed of the round trip was 40 miles an hour, while the speeds of the individual trips were 30 and 60 miles an hour.
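Spelling out that calculation (with constant speed on each leg): the trip out takes \[\frac{45}{30}=1.5 \text{ hours}\] and the trip back takes \[\frac{45}{60}=0.75 \text{ hours},\] so the round trip covers 90 miles in 2.25 hours: \[\frac{90}{2.25}=40 \text{ miles an hour}.\]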
anonymous
  • anonymous
If you take the average of the two trip speeds, you get 45 miles an hour.

More answers

anonymous
  • anonymous
Oh, and the speeds are constant.
anonymous
  • anonymous
Because average speed means \[\frac{\text{total distance}}{\text{total time}},\] not \[\frac{\text{speed going} + \text{speed returning}}{2}.\]
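In symbols, with one-way distance d and speeds v_1 and v_2 on the two legs, the round-trip average speed is \[\frac{2d}{\frac{d}{v_1}+\frac{d}{v_2}}=\frac{2v_1v_2}{v_1+v_2},\] the harmonic mean of the two speeds, which is always less than the arithmetic mean \[\frac{v_1+v_2}{2}\] unless the two speeds are equal.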
Xishem
  • Xishem
I was assuming that the question meant that the speeds of the 2 trips and the round trip were the same. And I think that's a pretty good assumption to make in the case of this question.
anonymous
  • anonymous
Take an extreme example and you will easily see why. Suppose I travel 60 miles at 60 miles per hour and then 60 miles at 1 mile per hour. The total time for the trip was 61 hours, so my average speed was only \[\frac{120}{61},\] a little less than 2 miles per hour. But the average of the numbers 60 and 1 is \[\frac{60+1}{2}=\frac{61}{2}=30.5.\]
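If you want to check these numbers yourself, here is a minimal Python sketch (the function name average_speed is mine, and it assumes constant speed on each leg):

def average_speed(distance_each_way, speed_out, speed_back):
    # average speed = total distance / total time
    total_distance = 2 * distance_each_way
    total_time = distance_each_way / speed_out + distance_each_way / speed_back
    return total_distance / total_time

print(average_speed(45, 30, 60))  # 40.0, not (30 + 60) / 2 = 45
print(average_speed(60, 60, 1))   # about 1.97, not (60 + 1) / 2 = 30.5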
anonymous
  • anonymous
If you want to do the problem in general, note that if you drive halfway at one speed and the rest of the way at the other, it makes no difference how far you go; the average speed will remain the same.
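You can see the distance drop out directly: if the one-way distance is d, then \[\text{average speed}=\frac{2d}{\frac{d}{v_1}+\frac{d}{v_2}}=\frac{2}{\frac{1}{v_1}+\frac{1}{v_2}},\] and d has cancelled completely.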
anonymous
  • anonymous
I'm curious specifically why this happens. Is it solely because the two formulas are different, or is there something else that would affect the answer?
anonymous
  • anonymous
Let us imagine for a minute that you travel m miles at 60 miles an hour and another m miles at 30 miles an hour. Your total distance was 2m miles, and your total time is \[\frac{m}{60}+\frac{m}{30}=\frac{3m}{60}.\]
anonymous
  • anonymous
Then to find your average speed you take distance divided by time to get \[\frac{2m}{\frac{3m}{60}}=2m\times \frac{60}{3m}.\] The m's cancel and you get 40 miles per hour.
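Written out step by step: \[2m\times\frac{60}{3m}=\frac{120m}{3m}=\frac{120}{3}=40.\]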
anonymous
  • anonymous
Where did the 3m come from?
anonymous
  • anonymous
I added the fractions, which needed a common denominator of 60.
anonymous
  • anonymous
\[\frac{m}{60}+\frac{m}{30}=\frac{m}{60}+\frac{2m}{60}=\frac{3m}{60}\]
anonymous
  • anonymous
Oh, OK. Now it all makes sense.
anonymous
  • anonymous
Thank you.
