October 10, 2000 at 12:00 am #28376
Jennifer R.
Participant
Were white people brought to this country against their will? Were the diverse and rich heritages and cultures they came from forcibly stripped from their consciousness when they were brought here? Was there a concerted effort to teach these human beings to view themselves as inferior, and to accept that view from others? A view of self that persists today? Do white people live in a culture where they have had to fight to be recognized, represented, and treated equally? Do white people live in a culture where, until recently, 'nude' stockings and 'flesh tone' crayons were a uniform shade completely unlike the skin of most people they knew? Where their tastes, needs, and opinions were subordinated to those of blacks? I don't think so.
I am sick of this convenient forgetfulness by white people who ask questions of this sort. It smacks of the viewpoint of a person who sees the world changing for the better, doesn't like it, and so whines about sharing power for a change, in the hope that this will bring back the 'good ole days.' Well, no friggin' chance!