The media has long been notorious for denigrating, hyper-sexualizing, and fragmenting women's bodies in the name of making what it believes to be more interesting films. In Hollywood, the defamation of black women is two-pronged, targeting both their race and their gender. Many may remember 2002, the year Halle Berry made history as the first black woman to win the Academy Award for Best Actress, and also as the first woman to do so having surrendered her dignity in one of the raunchiest, most self-degrading sex scenes of that year.
The mini-celebration that broke out when Berry won the award for her role in Monster's Ball came to a screeching halt when some realized that she had won it for whoring herself, something hardly worth applauding. Even more telling was that the "who's who" of Hollywood made perhaps the most obvious yet subliminal statement about how exactly they view black actresses: if you want to get ahead in this business (no pun intended), you have to take off your clothes. History was made all right...that day marked the moment we gave the highest accolade to someone who displayed one of the lowest forms of human behavior. Let's not fail to mention that the Academy Awards were first presented in 1929, and black actresses have appeared in films since the 1930s. Yet here in 2002, the first black actress to win an Oscar for a leading performance had to "double degrade" herself. But can we really blame Halle Berry for misrepresenting black women in her film roles?