I know racism and sexism still exist in America, but I think we're taking the wrong approach to getting rid of them. We can keep enacting social program after social program, like affirmative action, but these only address the superficial part of the issue. I believe we need to focus more on changing people's perspectives in order to truly eradicate racism and sexism. If people genuinely accepted minorities and women, there would be no need for any social policies, because nobody would have to be told by the government to treat others fairly.