Why are the words interpreted as pejorative these days? Stranger still, even some people who proudly proclaim themselves liberal won't also call themselves feminists. Now that much of the overt discrimination has faded, and a generation has passed since the women's rights and civil rights movements, there are some (many) women who enjoy certain rights without having to be called a man-hating lesbian to exercise them. But just as the Geneva Conventions aren't "quaint," I don't think feminism is either.

So why are cleaning products still marketed almost exclusively to women? Why are more and more women switching back to taking their husbands' names? Why are most corporations still governed by white men? These questions may elicit eye-rolling, but unfortunately they're still relevant.