I grew up in an impoverished, male-dominated country where sexism is rampant and completely accepted. The men didn't care, because that's all they knew (not that that's any excuse). And the women just accepted it, because that's what women were supposed to do. Rapists? No big deal. The belief there is "boys will be boys" and that the men just can't help themselves (a superstition in the country holds that if a man doesn't act on every sexual urge he has, he will become infertile and impotent). But too bad for the women--for the most part they're kicked out of the house and disowned. Because really...what whores. Prostitution? Perfectly okay. Domestic violence? Hey--she deserved it. (All sarcasm, obviously.)
Without saying too much about where I grew up or why my parents work overseas, the community I was raised in was very religious (Protestant). I was taught, from a very young age, that women should find a man to protect them, be subservient to him (arrgh, those damn Epistles >_<), and bear children. And stay at home. And fully accept this, because hey... "it wasn't women who ended WWII. It wasn't women who stopped apartheid" (paraphrased from John Eldredge's Wild At Heart: Discovering the Secret of a Man's Soul, a Christian devotional for men).
Thank goodness I grew up in a loving, equality-based home. My parents have gotten so much shit from the community they work with because 1) They're both feminists, and aren't scared to say it. 2) They voted for Obama (oh the horreur). 3) They believe in evolution, that great evil. And 4) They've raised three children who don't fit the "cookie cutter Christian children" norm and are true to themselves. I'm an atheist, and one of my brothers is transgender (born physically female). And my parents love us and accept us. But that's a "bad" thing to do, apparently. They're such "bad Christians", apparently.
But obviously, growing up among such social ills, and around ignorant, backward people who served as teachers, left its mark on me and on the other girls I knew. The vast majority of them just followed what they'd been taught their entire lives, because they "know" that their purpose in life is to serve men.
It's disgusting. All of it. I grew up in a country where human trafficking runs rampant, and is mostly ignored by authorities. And let's not forget that the U.S. is not free from something that despicable. But the ignorance of it...it's enough to make anyone vomit. Because, you know, that kind of thing doesn't happen here in the U.S. Because, you know, we abolished slavery in the 19th century. Because, you know, that stuff only happens in Thailand, or Albania, or Russia. Because, you know, if it happened here, it would be stopped right away.
But some authorities here are not so different from the ones in the country I was raised in. They don't care, or they're bribed, or they themselves are "customers"--knowingly raping the terrified teenager kept in that cheap motel or in someone's basement.
(Do note that I'm not saying all police officers are like this. Not at all. But some are.)
It makes me sick. All of this, all of it, needs to be stopped. I know I can't do everything, but I can do something to try to end it--and I am.