The U.S. is often viewed as the global policeman: the country responsible for keeping things relatively peaceful on the planet. How did it get that job? Before WWII, the role belonged to Great Britain. Do citizens of other countries really want help from the U.S., or is that just an excuse for the U.S. to interfere in foreign affairs?