History

Was the Civil War or the New Deal more important in changing American attitudes toward the role of government in national life?