Quote Originally Posted by Dominikan521
Let me dumb it down for you. Black people in America were exposed to racism by white people.
Over time, some differences were resolved, and white people learned to keep their mouths shut and their ropes at home.

Racial undertones have persisted in this country from then to this day.

Sure, there are black racists, and they express themselves at times, but white people basically taught them and exposed them to this form of thinking.


I will retract every statement if you can tell me who actually brought the slaves to America in the first place.


Can you do that, brainchild?

Or do you only learn what they try to teach you in today's "government" schools?