How would you react to the idea that some AI entity may wish to be declared something more? Can it be declared something more at all, and where does the border lie?
Was rewatching GitS and reading through some zines, and now I have a question I'm having trouble forming.
Media about AI tends to anthropomorphize, making any given AI out to be similar to a human.
One fundamental difference is that an AI can be copied. It can also be many places at once, and receive data from any number of senses/sensors.
So the idea of "individual" existence is tricky, even before asking about individual rights. Sure, an AI could be conscious, but it would be unimaginably different from any form of life that currently exists.
Depending on how far into the future you want to look, AI makes almost anything conceivable. AI theoretically has more power to fundamentally change the future of the Earth (and beyond) than any other technology.
That reality might morally outweigh the idea of granting a superintelligent AI full autonomy, if your morals include human survival.