Greetings! To start off, I am not American and I don't have much familiarity with Playboy as a magazine or a company; the only reason I know of Playboy is from movies and TV shows.
I have always thought that Playboy was a chauvinistic magazine that objectified women and was owned by a not-so-nice man. So when I went to their site, I was a little surprised to find articles like "The Manosphere Isn't Even Having Fun" or "Losers Who Wear Smart Glasses Are Filming Women Without Consent".
Can someone explain to me if I've been wrong about Playboy, or if they are actually progressive?


The people who write for Playboy aren't the same people who take the photos. It has a pretty long history of platforming progressive writers and is comfortable with publishing controversial positions.
I see what you did there