I'm interested in hearing what women/mujeres on this site think about what has lately been termed (at least in the US) the "war on women": the renewed fight among women for reproductive and sexual freedoms and equal pay in the workplace, and the generally misogynistic tone we've been hearing. Do you think these issues have resurfaced, or were they always there and simply never resolved? Where you live, do you find that the struggle never stopped, or have women's lives improved in these arenas?