"It is the God given duty of the white men to civilize and christianize the primitive, under privileged and uncivilized population of the rest of the world;" this is a very common phrase used in the history books to explain the European intervention into other continents and island nations. The African continent was also a victim of British conquerance but the British called it its" protectorate." Did they really perform their duty or were they there to exploit the resources? There have been a lot of explanations about how Africa is perceived by the rest of the world, let alone the western world. To account for the whole world will be unrealistic and unrelated to this assignment so I will just focus on the western world. Africa has always been associated with words such as "primitive", barbaric", "savage" and "uncivilized." The negative portrayal has largely been a result of how western media covers the African news. .
Africa has long been referred to as the "Dark Continent." In the Western world, the history of Africa and its people is depicted as nothing more than tribal land that held great wealth its supposedly primitive inhabitants had no idea how to utilize. The histories of other territories prior to European intervention, such as American Samoa, the Fiji Islands, New Zealand, and Australia, were similar; yet the image of these countries today is quite favorable. According to ABC's Ted Koppel, "half a million Ethiopians dying doesn't provoke the same response as would the deaths of half a million Italians" (Hultman). Perhaps Americans and Europeans are more concerned with countries where their economic interests lie. The colonists went to Africa, exploited its resources, took slaves, and returned to their own countries; but for Africans, the history of their continent is not so simple.