The American Civil War destroyed slavery in the South. At first, most white Americans denied what would eventually seem self-evident. But black Americans saw clearly that the …
The Civil War brought not only freedom to slaves but also military duty for many black Americans at the front lines of the warring armies. The drama of the military …