I told my friends it's always the woman's fault. What do you feel about that now?
Why did you tell them it's always the woman's fault? You make it sound as if you're joking.
Because women are always wrong.
That's an awful thing to say, and you shouldn't say it because it's sexist. It's not right to discriminate against women, or any other group of people, based on who they are. You need to treat people as individuals.
I don't care.
The comments about women always being at fault are indeed sexist. Where did you learn this? Was it from your upbringing? I really think you should see a therapist to sort those feelings out. I don't mean to sound judgemental, but I am concerned about you.