NOTE: I'm not out to offend any Americans. This thread is meant for discussion of a certain perception of the United States. No disrespect to Americans is intended.
The United States is the world's richest and most powerful country. There's no question the U.S. holds a high profile on the world stage.
There are those who would have you believe this is the most advanced and innovative country on the planet, leading the world in most major areas.
But is the U.S. really the most advanced country when it comes to digital technology, environmental issues, health care and university education, amongst other arenas?
The U.S. regularly ranks outside the top 10 on the "best places to live" lists put out each year, and fares even worse on surveys of health care quality, but still folks will tell you the U.S. is on the "cutting edge" of a lot of different things and sets the standard that other nations follow.
Americans pay more for internet and cellular service than people in many other countries, yet U.S. telecoms will still tell you America has the best wireless service and device selection anywhere in the world.
The Ivy League schools in the United States are considered some of the world's best, but tuition keeps going up and those institutions become more and more elitist. Is a Harvard degree really better than one earned at, say, Oxford?
There's a notion that major bands, actors and other performers haven't really "made it" unless they've made it big in the U.S. Of course there have been many international acts that have done well in other countries and not in the U.S., but "making it in America" is still used by many to judge the success of a performer or act. The idea that success in the U.S. is as high a plateau as one can reach in the entertainment industry has been around quite a while.
The U.S. dollar has long been the currency against which others are gauged. Using the United States as the "gold standard" like this further drives home the belief among some that the U.S. is supposed to be above and beyond all the rest.
So I'm wondering: how do you folks feel about that?