Steve Harvey Claims Hollywood Is More Racist Than America; Speaks on Kerry Washington’s Role

Steve Harvey confirms something most of us already knew. He has a daytime talk show, still hosts a popular TV game show, has best-selling books that have become box office hit movies, is a popular self-described relationship expert, and has a very successful comedy career on his resume.

In a recent piece about him in The Hollywood Reporter, Harvey lashed out at the unfairness he sees in Hollywood:

“Hollywood is still very racist. Hollywood is more racist than America is. They put things on TV that they think the masses will like. Well, the masses have changed. The election of President Obama should prove that. And television should look entirely different. Kerry Washington should not be the first African-American female to head up a drama series in 40 years. In 40 years! That’s crazy.”

The article went on to reveal exactly when Harvey’s eyes were first opened to this, though it was pretty obvious to everyone else:

“… it was an education in the thinly veiled ghettoization of network television. At the time, he says, a high-ranking WB executive explained to him that new networks invest in shows starring African-Americans because they bring a guaranteed audience. ‘But as they build the network and get more eyeballs, they slowly start phasing them out,’ explains Harvey, and the networks try to woo higher-income brackets with a less diverse slate of programming that is perceived as more palatable to the mainstream.”

So, do you agree with Harvey?