Commentary
According to mainstream media opinion pieces, women have been oppressed by men since the beginning of humanity, and the 1960s marked the first time, at least in Western civilization, that women began to inch out from under the patriarchal thumb. This narrative persists among leftists, but it disregards evidence to the contrary found in the art and literature of the past.