This is just a follow-up in response to a couple of requests for a solution to last week’s problem about hitting a golf ball on the Moon. Given the speed of the club face v = 40 m/s, and the acceleration due to gravity g = 1.62 m/s², at what loft angle φ should a golfer hit a golf ball on the Moon to maximize the distance the ball travels?
It is helpful to consider two different frames of reference, one to handle the “bounce” of the ball off the club, and another to handle the resulting flight of the ball:
First, in the “club frame,” the club face is fixed, and the ball approaches the club at speed v. Assuming a perfectly elastic collision and a negligible ball-to-club mass ratio, the ball bounces off the club face with equal angles of incidence and reflection, leaving at the same speed v. As the following figure shows, the ball is launched at angle 2φ in this frame.
However, in the “Moon frame,” the club face – and thus everything in the figure above – is also moving to the right at speed v. So the resulting velocity of the ball is

(v + v cos 2φ, v sin 2φ) = 2v cos φ (cos φ, sin φ).
At this point, we have a straightforward projectile motion calculus problem. Assuming this initial velocity, we compute the distance traveled, then maximize as a function of φ.
(An interesting but non-obvious side note: the “actual” launch angle of the ball – in the Moon frame – ends up equaling the loft angle! (Why?) That is, for example, if we hit the golf ball with a loft angle of 45 degrees, then the ball’s initial trajectory will also be 45 degrees. However, it will not be traveling as fast as it would if we hit it at a lower angle: its launch speed is 2v cos φ, which decreases as the loft increases. The problem is to determine the “sweet spot” between lowering the angle to increase speed, and raising the angle to increase time of flight.)
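The two-frame argument above can be checked numerically. The sketch below assumes a club-face speed of v = 40 m/s (a value consistent with the final distance quoted at the end); the function name is just for illustration. It reflects the ball off the club face in the club frame, shifts back to the Moon frame, and confirms that the launch angle equals the loft angle:

```python
import math

def moon_frame_velocity(v, loft_deg):
    """Bounce a head-on ball off a club face with the given loft,
    then shift from the club frame back to the Moon frame.

    In the club frame the ball arrives horizontally at speed v and,
    for an elastic bounce off a face tilted by the loft angle, leaves
    at angle 2*loft at the same speed. Adding the club's velocity
    (v, 0) gives the Moon-frame velocity of the ball.
    """
    phi = math.radians(loft_deg)
    # Club frame: speed v at angle 2*phi above horizontal.
    vx_club = v * math.cos(2 * phi)
    vy_club = v * math.sin(2 * phi)
    # Moon frame: the club face also moves right at speed v.
    return vx_club + v, vy_club

v = 40.0  # club-face speed in m/s (assumed; consistent with the result below)
vx, vy = moon_frame_velocity(v, 45.0)
launch_angle = math.degrees(math.atan2(vy, vx))
print(launch_angle)  # equals the 45-degree loft angle
```

Note that the launch speed comes out to 2v cos φ, not v, which is exactly the speed-versus-height trade-off described above.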
I’ll skip most of the details. The distance the ball travels is given by

d(φ) = (2v cos φ)² sin(2φ) / g = (8v² / g) sin φ cos³ φ.
Setting the derivative to zero gives tan² φ = 1/3, so the maximum distance is realized when the loft angle is φ = 30 degrees, in which case the ball travels about 2560 meters, or about 1.6 miles.
