blueyedpoet
Refugee
Okay, I could come across here as the biggest arse in the world, and it is my goal not to. From my experiences, discussions, and readings, it seems as if many, if not most, girls love being physically dominated. PLEBAns, what are your thoughts? Is this true? Why or why not? If it is true, is it simply a psychological trait learned from a patriarchal society? Or is it something deep within the female genetic make-up?