Racism in America has existed since the colonial era. It involves laws, practices, and actions that discriminate against various groups, or otherwise unfavorably impact them, based on their race or nationality. White Americans have historically enjoyed legally or socially sanctioned rights and privileges that were denied to members of other racial and minority groups.