Since the end of World War II, the United States has come to dominate the world economically and politically, leading many to describe it as an empire. Scholars have …