Racist Folks In America
Before I get into this post, where I share my opinions and beliefs on the subject, I thought it would be a good idea to see exactly what the dictionary had to say about the issue first, so I have copied that definition below.
"The belief that some races are inherently superior (physically, intellectually, or culturally) to others and therefore have a right to dominate them. In the United States, racism, particularly by whites against blacks, has created profound racial tension and conflict in virtually all aspects of American society. Until the breakthroughs achieved by the civil rights movement in the 1950s and 1960s, white domination over blacks was institutionalized and supported in all branches and levels of government, by denying blacks their civil rights and opportunities to participate in political, economic, and social communities."