This is a question I have thought about a lot, especially recently.
I am aware that most of this site's users are politically liberal, and I respect that. I do consider myself a general conservative, but at the same time, I do understand a lot of the positions that liberals have on some things and even agree with them on some, whether wholly or partially.
That said, the more I look at it, the more I seem to realize that the general viewpoint amongst liberal people is that conservatism is the scum of the earth and that conservatives are all violent idiots. But is that fair?
I'd say no, personally. I do understand why this comes up. Our president's the most visible conservative on the planet right now, after all. But he and his supporters are just a small fraction of the Republican Party and American/world conservatism in general.
There are a number of conservatives--like myself--who still subscribe to its general values and support those in power who share them, but who aren't afraid to push back when they disagree. In addition, conservatism has given this country and the world a fair number of good things. The Republicans were the party that abolished slavery, gave the nation a powerful defense, governed through difficult times of need, and much more.
I don't know. I just feel like it gets a bad rap. Discuss what you think, but please be civil, as is the norm for politics 'round these parts.