The only thing I resent is how the rest of the world refers to the U.S. as America, when America is much bigger than a single country. I understand they started calling themselves America in a self-centered way toward the rest of the continent, as if the only thing of value in the whole hemisphere were their country.
What most people in the U.S. (not America) don't know is the atrocities their government has committed against the rest of America, the forgotten America that includes Central and South America.
I live in Puerto Rico, a spoil of the Spanish-American War of 1898. That's the year I consider the start of the U.S.'s empire-like ascent.
The history of the other America is very sad, and the U.S. is very much responsible for it. They have overthrown and replaced presidents as if we were part of their backyard. But the rest of America has gone pretty much unheard of in the U.S. As [Latin American author] Gabriel García Márquez puts it, the other America has been alone for the longest time, and Europe hasn't noticed.
In the 20th century alone, the U.S. invaded countries in the rest of America at least 30 times in the name of justice and democracy. But the truth is they were squashing any popular movement that even looked anti-capitalist, in their bipolar, paranoid view of the world during the Cold War.
They have trained, sold weapons to, and given military assistance to some of the cruelest dictators of the other America. They have even helped to topple democratically elected presidents (I refer especially to Salvador Allende, who was deposed with help from the CIA; a U.S. diplomat was even involved in the taking of Chile's presidential palace) because those presidents didn't share U.S. policy for the region, which was to let private American companies freely exploit the natural resources of that other America.
And when those dictators do not comply with U.S. requests, all of a sudden they become public enemies, and they too are ousted. Ask Noriega of Panama, who used to be best friends with the U.S.
The U.S. has even, in practical terms, forbidden the rest of the world from sharing in the natural riches of America, as if we were their private farm.
I'm sorry to say the U.S. is not the benevolent father or referee it pretends to be—not now, not in the last century.
Nobody has noticed, simply because when somebody says America is suffering, people look at the U.S. and say, "They look perfectly fine to me." Nobody understands which part of America is being referred to.
I think it is time everybody started calling nations by their real names. The U.S. is not America. The U.S. belongs to America, although they believe things are the other way around, and act as if America belongs to the U.S.