America seems to think that, just because it was the global hegemon, any rising power must also seek hegemony through military power.
Historically, this view had some support in the Cold War context: NATO and US enforcement of the Monroe Doctrine maintained American dominance in the West, set against the Warsaw Pact in the East.
But then, the USSR collapsed. It's a new world, old man.
It’s the projection of hegemonic capitalist imperialism, which can’t see the world through any lens other than capitalist imperialism. The framing is so hegemonic that even most of the non-wealthy see it this way, whether they call themselves “conservative,” “liberal,” or even “ultra-leftist.”