The AI visually scans all those screenshots and tags them so they can be searched later. For example, an artist could open a file whose location they don't remember, buried somewhere in thousands of folders, just by typing a description of it. That's actually awesome. I imagine lots of people could come up with really useful ways to use something like that. I mean, if it weren't an Orwellian nightmare.
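(Conceptually the search part is just text-to-image retrieval, by the way. A rough sketch of that piece, using a local CLIP-style model via sentence-transformers; the model name and paths here are placeholders, not whatever Recall actually ships:)

```python
# Rough sketch: text-to-screenshot search with a local CLIP-style model.
# The model name, folder, and query are hypothetical stand-ins.
from pathlib import Path

from PIL import Image
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("clip-ViT-B-32")  # runs entirely locally

# Index step: embed every screenshot once.
paths = sorted(Path("screenshots").glob("*.png"))
image_embeddings = model.encode([Image.open(p) for p in paths],
                                convert_to_tensor=True)

def search(query: str, top_k: int = 5):
    """Return the screenshots that best match a text description."""
    query_embedding = model.encode(query, convert_to_tensor=True)
    scores = util.cos_sim(query_embedding, image_embeddings)[0]
    best = scores.topk(min(top_k, len(paths)))
    return [(paths[i], float(s)) for s, i in zip(best.values, best.indices)]

# e.g. search("concept art of the blue character with the red scarf")
```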
Features like this can almost never be privacy-friendly, because they're developed expressly to violate your privacy. The value it provides you, as cool as that could be, is just how it's sold.
I don't know about impossible. I could see this working on a Linux distro with a local model doing all the work and storing it encrypted locally. Buuuuuut, it still feels risky! That's a giant tranche of juicy, searchable data that just begs to be stolen.
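Just to make "encrypted locally" concrete, the shape of what I mean is something like the sketch below; Fernet is a stand-in for whatever key handling the OS keystore/TPM would actually provide, and the file names are made up:

```python
# Minimal sketch of storing the search index encrypted at rest and only
# decrypting it in memory at query time. Fernet and the file name are
# illustrative stand-ins, not a real product's scheme.
import pickle

from cryptography.fernet import Fernet

def save_index(index: dict, key: bytes, path: str = "recall_index.enc") -> None:
    """Encrypt the serialized index before it touches disk."""
    ciphertext = Fernet(key).encrypt(pickle.dumps(index))
    with open(path, "wb") as f:
        f.write(ciphertext)

def load_index(key: bytes, path: str = "recall_index.enc") -> dict:
    """Decrypt the index in memory for querying."""
    with open(path, "rb") as f:
        return pickle.loads(Fernet(key).decrypt(f.read()))

# In practice the key would come from the OS keystore, and it still has to
# be usable by the logged-in session for search to work at all, which is
# exactly why this only protects a powered-off or stolen disk.
key = Fernet.generate_key()
```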
To be fair to Microsoft, this was a local model too, with the data encrypted (via BitLocker). I just feel like the only way you could even try to secure it would be to lock the user out of the data with some kind of separate storage and processing, because anything the user can do can also be done by malware running as that user. Even then, DRM, and how it keeps getting cracked, has shown us that nothing like that is truly secure against motivated attackers. Since restricting a user's access like that won't happen and might not even be sufficient, it's just way too risky.
I can definitely see the utility in the feature; it's just that, conceptually, it's such a security risk that it's simply not worth it, even ignoring the data-harvesting/storage penalty.
You enter a discussion and need to refer to an article you know you've read but can't find? Now you can find it. You want a backpack and remember seeing one you liked but can't remember where you saw it? Ask it to show the backpacks you looked at, and now you've tracked it down in seconds rather than spending half an hour.
But yeah, the security and privacy implications of this are so bad that it's really not worth the tradeoff.