Some trace the origins of American empire back to 1898 and the Spanish-American War, or even earlier to the War of 1812. Still others would say that imperial ambitions were already on the minds of some of the Founding Fathers. Regardless, there can be no doubt that today the United States of America is an empire.
It is probably safe to assume that most Americans do not think of their country as an empire. As a conservative in my younger years, I might even have labeled the suggestion anti-American, rationalizing to myself: sure, we may have strategic military bases around the world, and we may use force at times, but only for benevolent purposes. We get the bad guys, give the country back to the good guys, and we leave. The US does not try to rule the world.
I was wrong.