jarofthoughts
Empirical Curmudgeon
Just curious about something that is sometimes portrayed by the media...
1. Do most Americans think that the USA is the best country in the world?
2. Is there any rational thought behind this claim by those who make it?