Idk. I feel like at some point we have to blame society as a whole for things, or things will never change. Social change in the past has been hard fought, precisely to shift public perception. When my grandparents were kids, for example, black people were not allowed to use the same facilities as white people, and a majority of people supported this. Sure, powerful interests in the media promoted this view, but ultimately the majority was wrong and had to change.
Blaming society seems like a pointless exercise to me. It doesn't help solve any problems, and it could even make some people give up on trying to do better. Society can change pretty quickly if people make an actual effort to do so and educate people about new ideas and realizations.