Feminism is a commitment to eradicating the ideology of domination that permeates Western culture on various levels (sex, race, and class, to name a few) and a commitment to reorganizing US society so that the self-development of people can take precedence over imperialism, economic expansion, and material desires.