How World War II Changed America
Overview
The Second World War had a dramatic impact on almost every aspect of American life: society, the economy, politics, entertainment, and perceptions of the world and of America's place in it. Until about 1940, most Americans believed that they could avoid direct involvement in the ongoing wars in Asia and Europe. While U.S. sympathies lay with China and the Western European states under German occupation, most people did not want to fight. Those attitudes began to change in late 1940; by the time of the attack on Pearl Harbor on December 7, 1941, Americans had concluded that to protect democracy, they would have to vanquish tyranny abroad.
In three years, the United States transformed itself into the world's largest industrial economy, capable of supplying air, naval, and ground forces that could fight simultaneously on two immense fronts on opposite sides of the world. This transformation had long-lasting consequences for American society, politics, and economic strength, as well as for the nation's perception of itself as a global power. In this lecture, Professor Hitchcock will examine some of the major changes, some welcome and some unwelcome, that came to America as it fought and won the world's greatest and most costly war.
Recommended Reading:
The American People in World War II, by David M. Kennedy
Why the Allies Won, by Richard Overy
Forgotten: The Untold Story of D-Day’s Black Heroes, at Home and at War, by Linda Hervieux