The news usually portrays black people as criminals rather than as heroes or defenders of the law. It wants viewers to see the bad in black people and become scared of them, framing its stories in particular ways without telling the whole truth. I feel it is up to movies and television shows to portray blacks in a positive light, to show that black people are not bad people but are instead like everyone else. Television shows and movies are still flawed in how they portray black people, because they want them to be seen as white or as existing for white people's benefit, but that is still not as harsh as the news, which makes them look completely flawed.
Although television is still flawed in the ways it shows black people, it does portray them as normal families, an example being The Cosby Show. Even though there is a problem with the Cosbys being seen as white, suggesting the American Dream is easy for everyone to reach when in actuality it is not, this is still a better portrayal than what the news offers. When blacks are portrayed as normal, people of all races can relate to them without the stereotypical image that all black people are bad. They are able to see that black people are just like them, which slowly chips away at the schema people have built in their heads.
Blacks in movies have changed dramatically from the past. Now we see blacks in roles that would never have been granted to them before, like playing God, a CEO, or the top agent. Still, blacks are often cast as the "magical negro," there to help the white character when he falls down and to give him moral support when he needs it most. There is still racism toward blacks, but I feel the entertainment industry tries to portray black people as white, or as white figures, to downplay what the news says about them, because the news is overtly racist toward blacks when showing them on television.
It is unfair altogether how black people are portrayed: the news sees them only as deviant, television tries to picture them as normal but whites do not see it that way, and movies still force blacks to be inferior to the white lead. Why is this still a constant in media in the new millennium?
Do you think we will ever get past this stage of media, or do you think it will just stay the same as time goes on?