American society has changed since Donald Trump came on the political scene, and not for the better. When Trump became President, the heretofore hidden dregs of society were emboldened to crawl out from under their rocks and into mainstream culture. Among them were racists, bigots, Nazis, fascists, and plain…