Now that we've covered most of United States history leading up to the end of the 20th century, I'd like to consider whether people from the start of the 20th century would feel that the United States has done a good or bad job since then. People back then wanted the United States to be isolationist, but since the end of World War Two, the United States has become much more involved in global politics, to the point of becoming probably the most powerful country in the world. The United States has also carried out many military interventions in other parts of the world, something that was widely opposed around the start of the 20th century. Many minorities now have rights that they didn't have before, which people 100 years ago might not have agreed with. On the other hand, the United States economy has grown greatly, with a much higher standard of living, lower poverty levels, shorter work weeks, and better conditions in many other aspects of life.
Do you think that somebody from 1917 who saw the United States today would view the overall change as positive or negative? Have our goals as a country changed a lot, or are they still similar to what they were 100 years ago?
The point you brought up about intervening in foreign affairs is troubling to me personally. I think the United States may have benefited throughout the 20th century from interfering less with other countries and focusing more on domestic issues, especially since there were many economic crises. While I don't think complete isolationism is a good policy, the United States should have been more cautious about intervening as much as it did.