This is complicated, but follow me on this.
Religion in the West was at some of its most popular when it allowed men to beat their wives, take multiple wives, act out pedophilia, and commit rape.
As Western religion embraced an increasingly humane relationship between men and women, and pushed a more sexually tame married life, men have drifted further and further away from it.
So what I’m kind of thinking is that maybe religion, like many elements of society, is valued by men significantly based on how much leeway it gives them to abuse women.