I.
The United States is a post-colonial nation. It may not seem post-colonial. India, Kenya, South Africa: those are the nations that come to mind when most people think “post-colonial,” if they think about postcolonialism at all.
But the States are nonetheless post-colonial. We too bear the racial, social, and economic wounds left by imperial despots. We too grapple with the legacies of colonial divisions. It’s not as if racism existed here indigenously. It’s a colonial import.
That’s what colonialism does: it segments people by social constructions like skin color, religion, and gender, anything it can use to turn people into subjects and create an idolized “ideal” that dominates a dehumanized “other.”
In this invasive way, colonialism distorts every civil interaction, warping history and identities to make its self-serving philosophies seem sane. Right and wrong, deserving and undeserving, legitimate and illegitimate – these concepts are manipulated to suit the oppressor’s needs, and the stink lingers long after the overlords are ousted.
“Imperialism leaves behind germs of rot,” postcolonial theorist Frantz Fanon wrote.
The only way to cleanse this grotesque colonial canker is to “remove” it from our minds, hearts, and society. America never did that, and we’re still paying the price. The way forward is through Truth and Reconciliation.