Western colonialism did not die after the end of World War II, when the West gave up its colonies. It merely changed to a more subtle form, which may prove more harmful to non-Western cultures in the long run. The expansion of Western culture has continued at an accelerated rate along with the denigration and decline [...]