I grew up then, so I get the sentiment, though rose-colored glasses are certainly in play.
Superficially at least, society kept a lid on bigots. We grew up watching All in the Family and The Jeffersons, explicitly anti-bigot shows. "I hate Illinois Nazis" was a joke. Literal Nazis were a joke, unthinkable. No one would accuse a show featuring, or entirely about, black people of being "woke". It felt like the nation was finally turning a corner.
Now we've had a blatantly racist President, and may again. Nazis walk the streets of Nashville with impunity. The Confederate flag has been hijacked and is now an explicitly racist symbol. FFS, the attacks on Harris would have killed any campaign back then. People may have thought like that, but no one said it out loud lest they be made a pariah. We shamed racists, and now they're all up in our face.
I'm sure we're in a better place, all in all, but what she's talking about is "Who let these fuckers out to play in public?!"
Hasn’t the Confederate flag always been explicitly about racism? It represents the people who wanted to leave the USA so they could continue their traditions of enslaving Africans, and maintain their states’ rights to enslave Africans. How has it ever not been racist?
I can't speak for the '70s, '80s, and '90s, but I would go so far as to say that yes, America is more racist than it was in the '00s/early 2010s. Even beyond MAGA, it really does feel like the country is more segregated than it was when I was a kid (and I feel like mixed-race people are often the only ones who can see it).
I’m seeing this more from the right wing lately. I’m guessing it means “Trump turned out to be as bad as everyone said, but since we obviously can’t admit that or allow our ridiculously idiotic support of him to be challenged openly, we’re going to just remind everyone that we’re ‘all Americans’ — yeah, that’s the ticket.”