First, the President never went on an apology tour. That's part of the Republican fiction. Further, how many of you have taught your children to apologize when they were wrong? What makes it any different for us as a country? You guys have your values all messed up. Since when did an apology become a bad thing? Was it the same time you decided compromise was a bad word?