Apr 18, 2004 21:34
I've wondered about this for a long time and it's kind of bothered me. Why do Americans take it upon themselves to be called "American"? The name of our country is not "America"; that's the name of our continent...North America. The name of our country is The United States of America. Yes, it includes "America"...but shouldn't we be called United Statesians or something stupid like that? Sure, "American" rolls off the tongue better. But look at it this way: if you're from Canada, you call yourself Canadian. If you're from Mexico, you're a Mexican. Likewise, are there any countries in South America whose people consider themselves "Americans"? Wouldn't we be pissed off if Brazilians gave up their Brazilian identity and started calling themselves "American"?
Maybe I'm completely wrong...but I just think we're not the only Americans out there, so why do we act like "America" is the sole name of our country? "I am an American." Well, no shit you're an American. So are Canadians. We need some new title, because referring to ourselves as Americans is kinda...wrong. If we're going to call ourselves Americans, we should likewise be able to refer to Argentinians as our fellow Americans. Can anyone think of a better name for us "Americans"? Can we be called Statesmen?