Here's the problem: while it's absolutely true that reproductive roles are written into our biology, the definition of traditional gender roles goes far, far beyond that. You can say that reproduction is fundamental to humankind (and even that has some workarounds nowadays), but you can't say that things like 'men work, women raise children' are the de facto norm.
Perhaps more importantly, the idea that we somehow always have to come back to 'biology' is flawed. Biologically speaking, we are susceptible to a great many natural illnesses that should shorten our lifespans. Yet we've developed technology, such as penicillin and vaccination, to counteract that. No one is saying, "Well, we should do away with hospitals and medicine because being purged by disease is our natural default state."
So do you think we'll become a gender-neutral society one day? I only see that being possible through scientific means.
I have no idea. I have no expertise on the subject, and honestly, I don't really care either. My gender is not so important to me that I feel the need to make it publicly known in order to feel secure. It's only important to me and my partner.