Thursday, December 22, 2011

When Liberals run America, does America become more Liberal?

You seem to have missed two important precepts. First, there is nothing un-American about "liberal." Second, Obama was elected by us, the people, to bring a more liberal view to our country and to repair the damage left behind by the failed conservative agenda. We knew about his planned health care reform and voted for it. You lost; now get used to it.
