Is this how it’s always been? It seems like with everyone I talk to who’s going through a divorce, the woman is going crazy, thinks she deserves everything, and is willing to bury you to get it. I don’t get it. Why is that? Radical feminism? Societal norms gone wrong? Protect yourself.