Recently the media have placed a spotlight on how ‘rude’ Americans appear to have become. Many cultures outside the US perceive Americans as rude and arrogant, and personally, I feel there is more than a grain of truth to this allegation. Not a workday goes by that I don’t encounter aggressive behavior from customers and coworkers. The norm seems to be to ‘challenge’ everything and take responsibility for nothing. No one wants to hear the word ‘no’ and everyone wants to be accommodated. When I do encounter polite, mature behavior, it’s a breath of fresh air. Does anyone have an idea why manners seem to have flown out the window in a country that’s supposed to be steeped in diversity and tolerance?
User Detail :
Name : Alma31459, Gender : F, Sexual Orientation : Lesbian, Race : White/Caucasian, Religion : Methodist, Age : 48, City : Kempner, State : TX, Country : United States, Occupation : Govt employee, Education level : 4 Years of College, Social class : Lower middle class