Random political thought:
I’ve seen a lot of “leftists” vs. “liberals” stuff lately, and I wonder: since when did “liberal” become synonymous specifically with center-left Democrats? I always thought “liberal” was just a general term that encompassed the entire left.
A lot of center-right and even straight-up right-wing people have been trying to draw a distinction between “liberals” and people who are more socialistic, as a way to co-opt the term “liberal.”
Just as the right has the alt-right, whose views are too extreme or racist for the typical conservative, the left has been overrun with wackos who have as much in common with their far-right “enemies” as they do with their own party (see: Horseshoe Theory).
Those who are left-leaning but hold some views traditionally (or even more recently) seen as conservative often adopt the label “classical liberal” to separate themselves from a left wing they see as increasingly irrational and emotionally driven.