Somebody recently told me that if black people had colonized America instead of whites, there never would have been slavery or anything resembling segregation in this country. His statement seemed to presuppose that racism is a "whites only" problem, that it is somehow endemic to the Caucasian character. I repeated his statement to several others, and a surprising (to me) number of people, not all of them black, agreed! Is this really how blacks see it?
User Detail:
Name: Michael W., Gender: M, Race: White/Caucasian, Age: 45, City: Chicago, State: IL, Country: United States, Education level: Over 4 Years of College, Social class: Lower middle class