Has anyone else noticed one central theme being preached throughout our entire educational lives? Over the years I've become increasingly skeptical and cynical about the role of the state, and looking back on my middle school and high school days, I notice the state was always portrayed as "just looking out for the public good." Sure, the state had its faults in the past, but we have a really robust democracy now, and any problems that come up due to corruption can be quickly ironed out with voting, blah blah blah, as if the violent and coercive nature of the state has somehow changed. Even now that I'm in college, the same story is being peddled by professors in their 40s: the state more or less owns us, and that's the way it is and should be. And nobody speaks up or rejects these ideas in any way. I'd take one for the team, but something about social situations causes certain aspects of my memory to go haywire. Quite frankly, I am simply dumbfounded that we live in an age where religion and statism are still dominant forces in society. Am I alone in these observations?

I also came across this video on reddit. It's 52 minutes long, but very entertaining. Even if you don't agree with everything this guy says, you'll get some good laughs out of it for sure: YouTube - Our Wise Overlords Are Just Here to Serve Us | Thomas E. Woods, Jr. (http://www.youtube.com/watch?v=064YTtSxVSo&feature=player_embedded)
At the risk of sounding like an asshole, that video was already posted in this thread. There are some other videos in there that may interest you. And you're not alone.
More attention needs to be brought to this excellent lecture by Mr. Woods, and OP, I agree completely. So much propaganda is drilled into our minds in public brainwashing... I mean education centers, that it really speaks volumes about our government.