Over a century ago, Germany colonised what is now known as Namibia. In the years that followed, tens of thousands of Africans were killed and all opposition was crushed. After its defeat in World War I, Germany lost Namibia along with all its other African colonies. As the world marks the centenary of WWI, our reporters took a closer look at Namibia’s history.