I guess my question is: who gave the Americans the right? I say this as an American. But wouldn't the world be a better place if we just minded our own business and quit nation-building and stoking nonexistent fires?
After World War I, the United States returned to limited isolationism, primarily due to war disillusionment, economic concerns, and political opposition to international organizations like the League of Nations.
Two things can be true at the same time.