Why is it that women are expected to show skin? For a woman to be noticed in our country, it seems so important for her to reveal herself physically in order to be successful. Women are increasingly promoted for their looks and body shape rather than for their brains. Magazines, television, and the music industry have made such an issue of a woman's appearance that women are flocking to clinics to have their entire bodies changed. What are we portraying here in our country?
User Detail :
Name : Sue, Gender : F, Sexual Orientation : Straight, Race : White/Caucasian, Religion : Christian, Age : 35, City : Davison, State : MI, Country : United States, Occupation : Counseling, Education level : 2 Years of College, Social class : Middle class,