What is Colonialism?
Colonialism is defined as the "control by one power over a dependent area or people."
In other words, colonialism is the forceful invasion of another country, the seizure of control over it, and the claiming of its land as one's own. It refers to the practice of establishing colonies on foreign land, away from one's place of origin. In the case of British rule in India, colonialism meant the British settling in India with the aim of controlling its people and exploiting its natural resources.
Colonialism therefore tends to bring social, cultural, economic, and political changes to the colonized society.