When Obama won the election a few weeks ago, a number of my friends were upset by his victory because he supports legalized abortion and holds other positions that could hurt the traditional family. I share their concern: I too want to protect the family as the fundamental unit of society.
However, I pointed out that Obama's foreign policy would likely improve our international affairs, and since the President holds so much power in foreign affairs, I voted for him. Under his leadership we could shed some of our image in the world as an arrogant nation. (And although the president's skin color has no bearing on how good a president he will be, I am very happy that America has reached the point of being willing to elect a black man to the Presidency. We've come a long way from the colonial days.)
I was surprised by my friends' response. Some of them agreed, but several essentially said that our government's reputation in the world isn't really that important. I wondered where that idea came from. The United States is currently involved in multiple wars, and we have troops and military bases in many countries. We use our international trade laws to encourage certain countries to act the way we want them to. Our economy is thoroughly intertwined with the economies of the rest of the world through trade, oil reserves in other countries, outsourcing, and so forth. We have taken it upon ourselves to be a leader in international affairs for many years. We have started wars and we have ended wars. We have given billions upon billions in foreign aid. Everything our government does in foreign affairs affects millions of lives both at home and abroad.
What does our international reputation not affect?