Gallery: Let's Take A Look At The Evolution Of Feminism Throughout History
What feminism means to each individual woman (and man) may be nuanced, but at its core, feminism is about equality. Every country and society has put its own unique stamp on how women have taken control of the situation. In this article, we're looking back at the history of feminism here in the United States. Oh, it has been a wild ride, from being accused of witchcraft to still marching for rights in the 21st century. Feminists have taken the US by storm with wave after wave, so let's look at some of the highlights of the history of feminism.