Signal isn't federated. Signal has centralized servers. Signal requires a phone number to sign up. Signal stores your encryption key on their servers, relying on SGX enclaves to "protect" it.
Signal can go down. Signal knows who you talk to, just from message timing. Signal knows how frequently you talk to someone. Signal can decrypt your traffic by attacking their own SGX enclaves and extracting your encryption key.
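To make the timing point concrete, here's a toy correlation over hypothetical delivery logs (Python, purely illustrative of what any centralized relay *could* infer; this is not a log Signal publishes or is known to keep):

```python
from datetime import datetime, timedelta

# Hypothetical server-side events, purely illustrative of what a centralized
# relay *could* infer from timing alone, even with encrypted contents and
# sealed sender. This is not a log Signal publishes or is known to keep.
events = [
    ("alice-device", "send",    datetime(2024, 1, 1, 12, 0, 0)),
    ("bob-device",   "deliver", datetime(2024, 1, 1, 12, 0, 1)),
    ("carol-device", "send",    datetime(2024, 1, 1, 12, 5, 0)),
    ("alice-device", "deliver", datetime(2024, 1, 1, 12, 5, 2)),
]

WINDOW = timedelta(seconds=5)  # pair each send with deliveries shortly after it
for sender, kind, t_send in events:
    if kind != "send":
        continue
    for receiver, kind2, t_recv in events:
        if kind2 == "deliver" and timedelta(0) < t_recv - t_send <= WINDOW:
            print(f"{sender} likely messaged {receiver} around {t_send:%H:%M:%S}")
```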
These are all possible threats and capabilities. You have to decide what tradeoff makes sense to you. FWIW, I still use Signal.
So my takeaways from this link and other critiques have been:
1. Signal doesn't upload your messages anywhere, but things like your contacts (e.g. people whose username/identifier you know, but not their phone number) can get backed up online.
> One challenge has been that if we added support for something like usernames in Signal, those usernames wouldn’t get saved in your phone’s address book. Thus if you reinstalled Signal or got a new device, you would lose your entire social graph, because it’s not saved anywhere else.
2. You can disable this backup and fully avert this issue. (You'll lose registration lock if you do this.)
3. Short PINs should be considered breakable, and if you're on this subreddit you should probably use a relatively long password like BIP39 or some similar randomly generated mnemonic (see the sketch after this list).
> If an attacker is able to dump the memory space of a running Signal SGX enclave, they’ll be able to expose secret seed values as well as user password hashes. With those values in hand, attackers can run a basic offline dictionary attack to recover the user’s backup keys and passphrase. The difficulty of completing this attack depends entirely on the strength of a user’s password. If it’s a BIP39 phrase, you’ll be fine. If it’s a 4-digit PIN, as strongly encouraged by the UI of the Signal app, you will not be.
4. SGX should probably also be considered breakable, although relying on it does appear to be a genuine effort to keep this data from leaking.
> The various attacks against SGX are many and varied, but largely have a common cause: SGX is designed to provide virtualized execution of programs on a complex general-purpose processor, and said processors have a lot of weird and unexplored behavior. If an attacker can get the processor to misbehave, this will in turn undermine the security of SGX.
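To make point 3 concrete, here's a minimal sketch of the offline dictionary attack described in the quote under point 3, assuming an attacker has already dumped a salt and stretched PIN hash from the enclave. scrypt stands in for the memory-hard KDF (the real design uses Argon2 and also mixes in an enclave-held secret), and the PIN value is made up; the point is that 10,000 candidates is nothing:

```python
import hashlib
import secrets
import time

# scrypt with deliberately light parameters so the demo finishes quickly;
# heavier settings slow a 10,000-candidate search but can't make it infeasible.
def stretch(pin: str, salt: bytes) -> bytes:
    return hashlib.scrypt(pin.encode(), salt=salt, n=2**12, r=8, p=1, dklen=32)

salt = secrets.token_bytes(16)
leaked_hash = stretch("4926", salt)  # pretend this pair was dumped from the enclave

start = time.time()
for candidate in (f"{i:04d}" for i in range(10_000)):
    if stretch(candidate, salt) == leaked_hash:
        print(f"PIN recovered: {candidate} after {time.time() - start:.1f}s")
        break
```

Swap the 4-digit PIN for a 12-word mnemonic and the candidate space jumps from 10^4 to 2048^12 (about 2^132), which is out of reach for this kind of search.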
Signal asks users to set a PIN/password which needs to be periodically re-entered. This discourages people from using high-entropy passphrases like BIP39 mnemonics.
People who really care can disable the PIN. I believe the client then generates a random BIP39-style passphrase and uses that for the data encrypted in SVR. But all the data is still uploaded to the cloud, so if there's a problem with the SVR encoding, the passphrase generation, etc., the data is still exposed.
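For scale, the arithmetic behind "let the client pick a random secret instead of a PIN" looks like this (illustrative numbers only, not Signal's code):

```python
import math
import secrets

# Illustrative arithmetic only, not Signal's code.
pin_bits = math.log2(10**4)         # 4-digit PIN: ~13.3 bits
phrase_bits = 12 * math.log2(2048)  # 12 BIP39 words: 132 bits
key_bits = 8 * 32                   # client-generated 32-byte random key: 256 bits
print(f"PIN: {pin_bits:.1f} bits, 12-word phrase: {phrase_bits:.0f} bits, "
      f"random key: {key_bits} bits")

# A random key is what the client can pick when no human has to remember it,
# but the encrypted record still sits on someone else's server either way.
recovery_key = secrets.token_bytes(32)
```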
Not only do you have to care; everyone you talk to has to do the same thing, because if your counterparty's key is in the cloud, the conversation is at risk.
Also, most of the points in the message you replied to are general enough that they don't need a citation. Do you want a source for Signal being centralized, or for Signal having the ability to track you?
Everything in that post makes perfect sense; the proof is in knowing how these systems work, Signal's source code, and details from Signal themselves. I can go into more detail on each point when I'm at a computer; my phone kills processes within a few seconds when I try to multitask, which makes it nearly impossible to write long posts on mobile if I have to go back and forth to copy and paste. Is there any particular claim you'd like me to justify, or shall I just go through the lot? Edit: Ah, OP got it, never mind!
Also, I should point out that I use Signal pretty much exclusively for messaging. This isn't hate; I'm just aware of its weaknesses.
They have your key in an SGX enclave. You only need to look at the rich history of side-channel attacks, the known critical SGX vulnerabilities, or just the fact that Intel can sign arbitrary code to run in an enclave, which means they could be compelled to do so with the cooperation of a government.
I'm not saying they do, but they have the capability, which needs to be accounted for in your threat model.
At the end of the day, people are entrusting their encryption keys to the Signal Foundation to be stored in the cloud. That needs to be part of the threat model.
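To show where the enclave sits in that trust chain, here's a loose sketch modeled on Signal's published Secure Value Recovery write-up; the labels, the use of scrypt in place of Argon2, and all the parameters are my own stand-ins for illustration, not their code:

```python
import hashlib
import hmac
import secrets

# Loose sketch of an SVR-style construction. Labels, the use of scrypt in
# place of Argon2, and all parameters are assumptions for illustration.
def stretch(pin: str, salt: bytes) -> bytes:
    return hashlib.scrypt(pin.encode(), salt=salt, n=2**14, r=8, p=1, dklen=32)

salt = secrets.token_bytes(16)
c2 = secrets.token_bytes(32)       # random secret that only ever lives in the enclave

stretched = stretch("1234", salt)  # weak, human-chosen PIN
c1 = hmac.new(stretched, b"Master Key Encryption", hashlib.sha256).digest()
master_key = hmac.new(c1, c2, hashlib.sha256).digest()
# master_key would then encrypt whatever gets backed up (contacts, profile, etc.).

# Normally the enclave rate-limits guesses, so c2 caps an attacker at a handful
# of tries. If the enclave is compromised and c2 plus the stored hash leak, the
# only unknown left is the PIN, and a short PIN falls to a dictionary attack.
```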
I read some of your other comments too. This is insane. I've always hated Signal, but this is another reason on top. No wonder the CIA funded them for 10 years.
Read the post by Signal. Note the use of the word "plaintext".
> we don’t have a plaintext record of your contacts, social graph, profile name, location, group memberships, groups titles, group avatars, group attributes, or who is messaging whom.
Whenever someone qualifies a statement like this without clarifying, it's clear they're trying to obfuscate something.
I don't need to dig into the technical details to know it's not as secure as they like to present it.
Thanks. I didn't realize they were so disingenuous. This also explains why they stopped supporting SMS - it didn't transit their servers (they'd have to add code to capture SMS, which people would notice).
Saying something has the capabilities of a honeypot is the correct thing to do when we're assessing our threat model.
Is it a honeypot? I don't know. It's unknowable. We have to acknowledge the actual capabilities of the software as written, the data flows, and the organizational realities.
My concern is that people will stay away from Signal in favor of unencrypted privacy nightmares. It happened with DDG a while back: I knew people who switched to Google because DDG had privacy issues. It sounds dumb, but it's a true story.
Sure. I still encourage people to use Signal. Most people don't have a threat model that makes the honeypot scenario a viable threat. In this thread we're talking about its downsides, which is healthy to do from time to time. Acknowledging capabilities is a good exercise.