Imperialism is the practice of one country building an empire by conquering other lands, subjugating their inhabitants, and forcing them to pay tribute to the ruling power. During the colonization of Africa, various European nations competed to acquire the best land and to control the African people, whom they viewed as inferior. European powers came to dominate much of the world, most notably Africa, where they established empires throughout the continent. The Europeans then converted the people under their rule to Christianity, believing that Africans were uncivilized and practiced no religion or values of their own. The pros…