The Civil War resulted in several important political, economic, and social changes for the United States. Politically, the Union victory preserved the United States as one nation and strengthened the authority of the federal government over the states. Economically, the war spurred industrialization and the growth of a national economy. Socially, it brought an end to slavery: the Thirteenth Amendment abolished it, and the Fourteenth and Fifteenth Amendments extended citizenship and voting rights to African American men, though racism and discrimination continued.