Random political thought:
I’ve seen a lot of “leftists” vs. “liberals” arguments lately, and I wonder: since when did “liberal” become synonymous specifically with center-left Democrats? I always thought “liberal” was just a general term that encompassed the entire left.
Funny, I saw this on YouTube the other day: https://www.youtube.com/watch?v=tlIjMJBSnRE