Primary education, on the other hand, is deep liberal indoctrination where you learn America is the first and greatest democracy, the Natives all mysteriously died for no reason, racism ended when we abolished slavery, and America is the hero of the free world.
If you were raised in the South, you also learn that slavery was never really a big deal anyway, and that the whole conflict with the North was mostly about them being really unfair and mean.
I grew up in western public education, and I definitely learned about the poor treatment of Indigenous peoples, the varying ideologies and ways of life that exist in the world, and the precursors to western democracy. I'm not unique in this.
In no way, shape, or form is what you're saying the reality in most of North America, and it hasn't been for quite a long time.
The caveat is that the southern states do have what you're talking about at a systemic level, but the ideas you're describing haven't been the norm in the majority of North America (the rest of the states, and Canada) for the past 40-50 years.
That doesn't mean there aren't deep systemic issues within our education system around the factors you bring up (Indigenous peoples, democracy, our "place" internationally, etc.); it's just far more nuanced than whatever bullshit you're trying to sell.
So tired of seeing your rhetoric on here, dude...what's the deal?
Public school US history textbooks do not tell you that America was founded on anti-democratic principles and was built from its foundation to suppress democracy. They do not tell the story of how the US government hunted the buffalo nearly to extinction in the wild to starve the Plains Indians. They do not tell of the systematic abduction of Native children to be raised by white families and erase Native heritage. They do not tell the story of the hero John Brown, who hunted down slave owners. They do not teach how the US turned on Ho Chi Minh when he reached out to America for help with his own country's war for independence. They do not tell how the US attacked Soviet Russia to aid the attempted overthrow of the Bolsheviks during the Russian Civil War. And they do in fact preach that systemic racism isn't real and that racism is merely an individual phenomenon where some people are racist, but the United States is not a racist white settler state. You literally believe racism is over in most of America! Yet you want to pretend you weren't taught that?
Some schools sometimes have some good teachers, but the majority are propaganda dispensaries. Structural racism exists, and one of the ways it expresses itself is through the public school system.
Read Lies My Teacher Told Me. It was written in 1995 and updated in 2007; there have been some improvements since then, but we have not solved the problem. The people who try to solve it are accused of teaching "Critical Race Theory" and are publicly and personally smeared by the media and by one of the only two ruling US parties.
I mean, I definitely learned some of this in high school. Yet I know people who went to the same school who believe completely different things than I do, despite learning the history of Native Americans, Black Americans, etc. It's almost like there are other things at play too: news media, social media, culture, family, lived experience, etc. Those things also shape what people believe. It's not like kids go to school and just believe everything they're told. Yes, many schools didn't teach this, but many schools did; it's not really the whole picture either way.
I grew up in a southern state and still learned about all that shit too, btw. Neither their assumptions nor yours are correct. What is this, 1962? I have a huge tip for you about next November 22nd if so lol.