Yes, I'd agree with that.
This conversation arose from the assertion that, to fundamentally change a culture that ignores or condones the abuse and sexual harassment of women, it is not enough to just tell men to be "nice" and treat women with respect.
That is true.
It was proposed that conceding positions of power to women would be a more effective way to promote a change in culture.
Writing off affirmative action and policies that promote the advancement of women in the workplace, when these policies are still in their infancy, is premature.
On an anecdotal level - I work in a company that until recently had very few women employed, let alone in senior positions - over the past 18 months there has been a big shift towards promoting diversity: the majority of new recruits are female, and several significant senior roles have been taken over by women.
As a woman, it has made me feel less of an outsider and generally happier in the work environment, and it has also demonstrably improved the performance of project teams.