A similar one I enjoy: you're driving two miles. After one mile, you've averaged 30 MPH. How fast do you need to drive the second mile to have a trip average of 60 MPH?
Answer: impossible. You'd have to travel the second mile at infinite speed to achieve this.
If you drive the first mile at 30 MPH and the second at 90 MPH, you've averaged 60 MPH ((30+90)/2). What is the math that gets "infinite speed" as the answer?
You're averaging over distance, but to calculate the average speed you need to average over time. If you drove the first mile at 30 MPH and the second at 90 MPH, you'd spend 120 seconds on the first mile and 40 seconds on the second mile, giving an average speed of 2 miles / 160 seconds = 45 MPH, not 60.
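A quick sketch in Python makes the distinction concrete (the function name here is just for illustration): average speed is total distance divided by total time, not the mean of the per-leg speeds.

```python
def average_speed_mph(legs):
    """Average speed over a trip, given (miles, mph) legs.

    Average speed = total distance / total time,
    NOT the arithmetic mean of the per-leg speeds.
    """
    total_miles = sum(miles for miles, _ in legs)
    total_hours = sum(miles / mph for miles, mph in legs)
    return total_miles / total_hours

# First mile at 30 MPH (120 s), second at 90 MPH (40 s):
print(average_speed_mph([(1, 30), (1, 90)]))  # 45.0, not 60
```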
Helps to think about it like this: if you averaged 60 MPH on a two mile drive, your trip would last two minutes. Since your first mile was driven at 30 MPH, you've already been driving for exactly two minutes. The only way to hit an average of 60 MPH for the whole trip is if the second mile is instantaneous.
Put it this way: If you were to average 60 mph over 2 miles, you would travel those two miles in 2 minutes.
However, if you've already driven the first mile at 30 mph, it's taken you two minutes to drive it. In order to then average 60 mph over the full two miles, you'd have to travel the remaining mile in 0 seconds -- in zero time.
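To see why it blows up, you can solve for the required second-leg speed directly (a rough sketch; the function name is made up for illustration): the time budget for the whole trip is total distance divided by the target average, and whatever is left after the first mile is all the time you get for the rest.

```python
import math

def required_speed_mph(target_avg, total_miles, miles_done, speed_so_far):
    """Speed needed on the remaining distance to hit target_avg overall."""
    time_budget = total_miles / target_avg  # hours allowed for the whole trip
    time_used = miles_done / speed_so_far   # hours already spent
    time_left = time_budget - time_used
    remaining = total_miles - miles_done
    if time_left <= 0:
        return math.inf                     # the whole budget is already spent
    return remaining / time_left

# Two-mile trip, first mile at 30 MPH:
print(required_speed_mph(60, 2, 1, 30))  # inf -- the 2-minute budget is gone
print(required_speed_mph(45, 2, 1, 30))  # ~90 -- a 45 MPH average is doable
```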
For your method to work, you need to weight by the time spent at each speed, not by the miles covered.
In your proposed solution, the first mile takes two minutes, and the second mile forty seconds. That's 2:40 to cover two miles, or an average speed of 45 MPH.