The transformation of the United States' foreign policy during and after World War II committed her to participation in future foreign affairs and marked a complete renunciation of her policy as an isolationist state. The United States broke through the barrier of isolationism and dedicated itself to preserving the welfare of the rest of the world. Largely because of the Truman Doctrine, the United States would no longer remain within the Western Hemisphere behind the Monroe Doctrine, but would now make it her business to guide the world down the "right" path of liberty and democracy. This responsibility which the United States placed upon herself would cause controversy and debate in the years to come. Does the United States have the right to intervene in foreign affairs, or should she tend to her own business? Whatever the correct answer, America made the decision to aid neglected and abused nations and to accept the criticism of which she would most certainly be the target.…