Outcomes of the Civil War
After the Civil War, Americans began to think of the United States as a single nation. Before the war, they had tended to identify themselves as citizens of their individual states rather than of the country as a whole. With this shift came the idea of a centralized government, which meant that the war debts of both the North and the South were combined into one national debt.

The war also left heavy damage in the areas where battles had been fought. Most of the South was devastated, which pushed the country even deeper into debt, and most Southerners lost their livelihoods. Once the slaves were freed, plantation owners could no longer produce as much as quickly as they had in the past, and many of them lost a great deal of money. Cotton grown in the South never regained its worldwide dominance, and as a result the South remained the poorest part of the United States for a long time.

The war changed attitudes as well. Because many Union soldiers were immigrants, the end of the war fostered better feelings among Americans toward immigrants. Women also found more opportunities opening up for them: having taken on jobs that men could not do while away at war, many women continued in those jobs after the war was over.

Most importantly, the war ended the slave system, and with it much of the open hostility toward African Americans. All slaves were freed and eventually granted nearly all of the rights that whites held.