Democrats, liberals, and left-wingers have openly embraced the concept that the United States of America is a country founded on racism and white supremacy. It’s been this way for decades; this is not a secret.