jbecker007 · New member · Joined Jan 29, 2020 · Messages: 4
The problem:
A driver sets out on a 20-mile trip. When he has gone halfway, he finds he has averaged 25 mph. At what speed must he travel the rest of the way to make his overall average speed for the trip 40 mph?
I'm having trouble figuring out how to start this. He travels the first 10 miles at 25 miles per hour, which takes him 2/5 of an hour (24 minutes), but I'm not sure if that time is relevant. I also tried setting up the equation r1 + r2 = 40, but that doesn't seem to work either.
Any help would be deeply appreciated!
JB
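
For reference, the elapsed time is exactly the piece that matters here: average speed is total distance divided by total time, not a combination of the two leg speeds. A minimal sketch of that total-time setup (the variable names are illustrative, not from the original post):

```python
# Sketch: average speed = total distance / total time
# (variable names are illustrative, not from the original post).

total_miles = 20
first_leg_miles = 10
first_leg_mph = 25
target_avg_mph = 40

# Time already used on the first leg: 10 / 25 = 0.4 h (the 2/5 hour / 24 minutes above).
time_used_h = first_leg_miles / first_leg_mph

# Total time allowed for a 40 mph average over 20 miles: 20 / 40 = 0.5 h.
time_allowed_h = total_miles / target_avg_mph

# Whatever time remains must cover the second 10 miles.
time_left_h = time_allowed_h - time_used_h          # 0.1 h
required_mph = (total_miles - first_leg_miles) / time_left_h

print(required_mph)  # 100.0 mph for the second half
```

This also shows why r1 + r2 = 40 doesn't work: the two legs take different amounts of time, so the leg speeds can't simply be added or averaged to get the trip average.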