Politics confuses me…

From Salon.com, which is admittedly left-wing:

The right-wing crusade to demonize elites has paid off. Now the country’s run by incompetents who make mediocrity a job requirement and recruit from Pat Robertson’s law school. New rule: Now that liberals have taken back the word liberal, they also have to take back the word “elite.” By now you’ve heard the constant right-wing attacks on the “elite,” or as it’s otherwise known, “hating.” They’ve had it up to their red necks with the “elite media.” The “liberal elite.” Who may or may not be part of the “Washington elite.” A subset of the “East Coast elite.” Which is influenced by “the Hollywood elite.” So basically, unless you’re a shitkicker from Kansas, you’re with the terrorists.

I’m making no comment on the thrust of the article, or the question of “elite”-ness, or even taking sides on the left-wing vs. right-wing issue. What’s confusing me is this: Didn’t it used to be Democrats, i.e., liberals, i.e., left-wingers, who were all for the “common man” and the shitkickers from Kansas? Wasn’t it Democrat FDR and his New Deal pseudo-socialism that was pushing for chickens in all the pots and all that sort of thing? Going back even further, wouldn’t it have been right-wingers who were all for the aristocratic elite, while the left-wingers were beheading them in the name of liberty, equality, and fraternity for all men? When did this all switch? I think this is why I hate politics so much. They keep changing what the words mean.

Heh…I’m categorizing this under “culture” and “philosophy” because I do not have a “politics” tag and refuse to create one. :)