I assume you're thinking of the "equal rights" brand of feminism that died out in the 80s. It's nothing but radicalized nonsense now.
Feminism in 2020 is nothing more than rewriting laws and influencing culture so that they only help women, which invariably turns into only helping white women if they're the ones with the influence. Did you know, without looking it up, that Affirmative Action was written to help black people get chosen for jobs and college, but it has put more white women in those spots than any other minority group? It's true. Look it up.