Sherry Darling
New Yorker
Some feminist/peace scholars think that, if women held the highest positions of power, we could abolish war. (Gandhi, for one, thought this.) Few could dispute that history has been largely controlled by men, and that it has been violent. So I ask:
1. Is war inevitable? Just a part of human nature?
2. If women had been in charge throughout history, would there have been war?
SD