The Boys is what always got me. It was "woke" from the first season, criticizing evangelicalism and corporatism, themes that were (and still are) major topics of debate. As the show goes on, it tackles more ongoing issues like rainbow capitalism, police brutality, and now, rising fascism in America, and it's only NOW that it's woke?
Granted, I'm a fucking idiot, but was anyone EVER rooting for Homelander? He was an obvious piece of shit from the outset, and in my opinion, people of almost any political bent would have acknowledged that.
Like, I see that "Homelander was making fun of you the whole time and you didn't know it" meme going around, but I feel like it attacks a strawman that never existed and still doesn't.
I've never watched an episode but have a general idea of the premise. Even from that basic understanding of the show and the promotional material, it seemed pretty obvious that Homelander wasn't the protagonist.
I had a friend go "yeah, I don't like how they just made Frenchie gay now." Like, dude, he is one of the fruitiest characters I've ever seen, and he has always been like that. I feel like anti-LGBTQ+ people just don't have people skills and can't tell if someone is queer. That would explain why they keep insisting that back in the day no one was bi and all that.