Imperialism
What is Imperialism?
Simple Definition of imperialism
: a policy or practice by which a country increases its power by gaining control over other areas of the world
: the effect that a powerful country or group of countries has in changing or influencing the way people live in other, poorer countries
Full Definition of imperialism
1: imperial government, authority, or system
2: the policy, practice, or advocacy of extending the power and dominion of a nation, especially by direct territorial acquisitions or by gaining indirect control over the political or economic life of other areas; broadly: the extension or imposition of power, authority, or influence <union imperialism>
Information taken from: https://www.merriam-webster.com/dictionary/imperialism