Sunday, February 28, 2010

The Treaty of Versailles (ended WWI)


The Treaty of Versailles was signed by Germany and the Allied powers on June 28, 1919. This was the peace treaty that officially ended World War I. The Treaty of Versailles had a very negative effect on Germany. It required that Germany accept sole responsibility for causing the war, and Germany also had to pay reparations to members of the Allied forces. Germany lost about 10% of its territory, and its overseas colonies were taken away and divided among the Allies. 12.5% of the German population found themselves living outside of the new German borders.

The German economy went downhill as a result of its produce and profit being sent to the Allies as reparation payments, so the economy was unable to recover. The Germans were also excluded from the League of Nations, and many of them were forced to live under other nations' rule. This caused widespread discontent. I don't think this treaty was fair to the Germans because the war wasn't completely their fault. Sure, they did some things that may have sparked the beginning of the war, but putting all the blame on Germany is just wrong. Other countries were involved in the war too.

It is possible that the Treaty of Versailles led to the rise of fascism and Adolf Hitler. Germany was a huge mess after the Treaty of Versailles, and Hitler offered to do something about the country's hurt pride: he promised a way to lead it out of shame and out of its bad living conditions. The way Hitler spoke impressed many Germans, and they let him lead them through the difficult time. After taking control of Germany, Hitler kept rising to the top. Maybe this is why historians say that the Treaty of Versailles led to the rise of Hitler.
