Does anyone else get the feeling that America is in the midst of a political party realignment?
Now that everything the (largely Republican) FBI does is labeled by the right as the 'Deep State', and the Republican Congress and executive branch have largely abandoned the traditions of those offices, it feels like Republicans have become deeply opposed to traditional governance. Meanwhile, Democrats are rushing to its defense, invoking the Constitution and giving governmental institutions the benefit of the doubt.
Similarly, the Democratic party is now more firmly the party of educated elites, whereas the Republican party is courting people who would otherwise be politically uninvolved.