Evidence from neuroscience and related fields suggests that language and thought processes operate in distinct networks in the human brain and that language is optimized for communication and not for complex thought.
Instead, language is a powerful tool for the transmission of cultural knowledge; it plausibly co-evolved with our thinking and reasoning capacities, and only reflects, rather than gives rise to, the signature sophistication of human cognition.
Thus, while the LLM approach to AI mimics the outward effects of human intelligence well, it is unlikely to recreate that intelligence on its own.