## anonymous asked: An object is dropped from rest (initial velocity is zero m/s) and falls freely. After 5 seconds, how far has it fallen? Select one: a. 100 m b. 75 m c. 125 m

1. jameshorton

If we drop an object near the surface of the Earth, it falls toward the surface with constant acceleration, g. This means that, as it falls, it speeds up (specifically, its speed increases by 9.8 m/s for each second it is falling). If we measure the distance fallen, we find that the object has dropped 4.9 m after 1 second, 19.6 m after 2 sec, 44.1 m after 3 sec, etc. The equation for the distance fallen after t seconds is $$\sf d = \frac{1}{2}gt^2$$

That's for an object dropped from rest, but what if the object is thrown? Without gravity, we know that an object thrown with a certain initial velocity will continue in a straight line at constant velocity (Newton's first law). The distance it travels in each second will be the same as it travelled in the previous second (or in any other second). The projectile's motion can be thought of as a combination of this very simple motion with the "free fall" motion described above. In other words, work out the straight-line, constant-speed motion that the projectile would have if there were no gravity, to find the position the object would reach without gravity. The effect of gravity is that the object falls vertically below that straight-line path by the free-fall distance, as shown in the figure.

We can apply this to an object thrown horizontally. Without gravity, it would travel in a horizontal line at constant speed. With gravity, it falls below that horizontal line. So if I throw a baseball horizontally at 1 m/s, it will travel a horizontal distance of 1 m in the first second of travel, but it will fall a vertical distance of 4.9 m below that horizontal line in that same second. It will therefore be 4.9 m closer to the ground after 1 second of travel (assuming the ground is level, which is the same thing as horizontal). No matter what speed I throw the ball, it will fall vertically the same distance of 4.9 m in the first second.
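The free-fall numbers above (and the original question) can be checked with a quick sketch of $$\sf d = \frac{1}{2}gt^2$$, assuming g = 9.8 m/s² and no air resistance:

```python
G = 9.8  # m/s^2, near-Earth surface gravity (assumed constant)

def fall_distance(t):
    """Distance (m) fallen from rest after t seconds, ignoring air resistance."""
    return 0.5 * G * t**2

for t in (1, 2, 3, 5):
    print(f"after {t} s: {fall_distance(t):.1f} m")
# after 1 s: 4.9 m
# after 2 s: 19.6 m
# after 3 s: 44.1 m
# after 5 s: 122.5 m
```

With g = 9.8 the 5-second drop comes out to 122.5 m, which rounds to choice c (125 m, exact if you use the common classroom approximation g = 10 m/s²).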
The horizontal distance travelled will depend on how fast I throw, but the vertical distance will not. So if I throw at 5 m/s, it will travel 5 m in the first second; if I throw at 50 m/s it will travel 50 m; and if I throw at 500 m/s it will travel 500 m, all while falling 4.9 m below the original horizontal line. Notice the assumptions we have been making:

- no air resistance or friction to slow the ball down
- gravity is constant everywhere
- the surface of the Earth is flat

Maybe we should re-examine some of them. For now, let's keep the first two, but reconsider the third one. We know the Earth isn't flat, but spherical. How far does the projectile have to travel before this starts to have an effect on things? It turns out that we can approximate the Earth as perfectly flat as long as we are dealing with horizontal distances less than about a kilometer, so that's fine for now.

What if the projectile is launched at 5000 m/s? Then it will travel a horizontal distance of 5000 m (5 km) in the first second. By the end of that time it will still have fallen 4.9 m below the horizontal, but because of the curvature of the Earth, it will not be 4.9 m closer to the ground. The level of the ground will have dropped as well, so the height of the ball will have decreased by less than 4.9 m. In fact, if we throw fast enough, say at 10,000 m/s, we will find that the ground has dropped away more than 4.9 m, so the height of the ball above the ground will actually have increased, even though it has still fallen 4.9 m. If we pick just the right initial speed (about 7900 m/s near the Earth's surface), we can arrange that the height above the ground will still be the same after 1 second. Hope this helps!
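That ~7900 m/s figure can be sanity-checked: an object whose fall exactly keeps pace with the Earth's curvature is in a circular orbit, and at the surface the circular-orbit speed is v = √(gR). This sketch assumes g = 9.8 m/s² and a mean Earth radius of 6.371 × 10⁶ m:

```python
import math

g = 9.8       # m/s^2, surface gravity (assumed constant)
R = 6.371e6   # m, mean Earth radius (assumed value)

# Circular orbital speed at the surface: gravity supplies exactly the
# centripetal acceleration, g = v^2 / R, so v = sqrt(g * R).
v_orbit = math.sqrt(g * R)
print(f"orbital speed near the surface: {v_orbit:.0f} m/s")
```

This prints a value just over 7900 m/s, matching the speed quoted above (real satellites orbit a bit higher, where both g and the required speed are slightly smaller).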

2. Abhisar

Well, when we drop something its initial velocity is 0. We can use this equation to find the distance travelled: $$\boxed{\sf s=ut+0.5 \times g \times t^2}$$ where s = distance travelled, t = time, g = 9.8 $$\sf m/s^{2}$$, and u = initial velocity (which is 0 in this case).
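Plugging the question's numbers (u = 0, t = 5 s) into that equation gives the answer directly; a minimal check, assuming the multiple-choice option was written with g rounded to 10 m/s²:

```python
def distance(u, t, g=9.8):
    """s = u*t + 0.5*g*t^2: distance fallen after t seconds from initial speed u."""
    return u * t + 0.5 * g * t**2

print(distance(0, 5))         # 122.5 m with g = 9.8 m/s^2
print(distance(0, 5, g=10))   # 125.0 m with g = 10 m/s^2 -> choice (c)
```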