Sometimes I hate watching the news. Every single day they report story after story about black people committing crimes. They demonize black men and it just isn't fair. People see the way the media portrays us and they believe it! It's not like white people don't commit crimes or get arrested. Sometimes I think that the news is just racist as hell! I'm tired of seeing black faces on TV about baby mama drama, drug selling, kidnapping, raping, stealing, and killing. I'm tired of being demonized by the fucking media. And I think it's about time that WE black people stop giving them stories to tell. As long as we keep doing all these things to each other, things are just going to get worse for us. African Americans need a new, unbiased image. I mean, it can't just be us breaking the law.