As I've mentioned in the post above, Telegram simply does not offer "secret chats" on desktop.
So, if you use a secret chat on mobile and then go to desktop, you won't see it!
Switching back and forth between secret and non-secret chats is troublesome, especially when the other side doesn't know whether you're on desktop or mobile, and so can't tell if a secret chat is even available.
I have just installed the Mac client and it allows you to create new secret chats, but it does not show the secret chats created earlier on your phone. At least that is what happened to me.
We’ve designed the Signal service to minimize the data we retain about Signal users, so the only information we can produce in response to a request like this is the date and time a user registered with Signal and the last date of a user’s connectivity to the Signal service.
Notably, things we don’t have stored include anything about a user’s contacts (such as the contacts themselves, a hash of the contacts, any other derivative contact information), anything about a user’s groups (such as how many groups a user is in, which groups a user is in, the membership lists of a user’s groups), or any records of who a user has been communicating with.
All message contents are end to end encrypted, so we don’t have that information either.
Ok, that's good. But Signal automatically created conversations with people in my address book who I never want to communicate with via Signal. I didn't authorize it, the app did it automatically. I think that is what the grandparent was referring to.
I stand corrected on the reproducible builds. That's great. It would be greater if they worked with F-Droid so I could download a verified build from there, instead of making me choose between Play Services or APK sideloading.
I don't buy any of the arguments on that SMS page. In particular, all the hand-wringing about metadata rings hollow when OWS has simply arranged it so that they have access to it all, instead of a federated network of telcos, making themselves a single point of failure. The only argument that rings true is "iOS does not have APIs that allow us to programmatically send/receive SMS messages", but I don't feel I should have to suffer for Apple's failings.
I applaud the offering of APK downloads, which Moxie has historically resisted, calling them "insecure" (I can see his perspective, but when the alternative is giving Google root, I'm afraid we're going to have a difference of opinion on what constitutes "security"). Unfortunately, when using Signal without Play Services, as I understand it, battery life is much worse due to the implementation. I appreciate that "push notifications" without a central service is a difficult technical problem to solve efficiently - this is a large part of why I regard the dropping of SMS support, which solved it, as a screwup.
Lastly - if I seem like I'm giving OWS a hard time unfairly, it's because of their repeated insistence on centralised services (both Google's and their own) and resistance to any sort of federation or anything that costs them control or information. I'm glad that the situation is (slowly) improving, but these are indicative not just of technical difficulties, but a deep ideological difference. I am not afraid to say that I am mistrustful of this One True Encryption solution when its authors seem so willing to compromise on fundamental issues.
I am also mistrustful because whenever a "controversial" decision is made, the real reason is always accompanied by copious FUD. The SMS thing is one example. The F-Droid affair is another (Stated reason: F-Droid is insecure. Real reason: F-Droid strips analytics).
While I disagree with their rationale, the person you were replying to was displeased with how Signal depends on Google Play Services for push notifications. Last I checked that was a dependency that Moxie felt was essential to providing a good UX.
> People with security, budget and privacy concern go for flip phones.
No. That ensures you can't send encrypted messages or do encrypted calls.
Also see one of the reasons Signal moved to sending encrypted messages as data and stopped supporting encrypted messages sent as sms.
> SMS and MMS are a security disaster. They leak all possible metadata 100% of the time to thousands of cellular carriers worldwide. It's common to think of SMS/MMS as being "offline" or "peer to peer," but the truth is that SMS/MMS messages are still processed by servers--the servers are just controlled by the telcos. We don't want the state-run telcos in Saudi, Iran, Bahrain, Belarus, China, Egypt, Cuba, USA, etc... to have direct access to the metadata of TextSecure users in those countries or anywhere else.
Apps routinely ask for many more permissions than they have reason to, and users have been conditioned to just 'get it over with'. Technically you are right; in practice users hand over the keys to the kingdom without a moment's pause to think of the implications.
Now, you could of course argue that they only have themselves to blame.
I'd argue that if someone wants to get a flip phone for privacy reasons they should be able to not download shady apps and give them permissions without thinking.
Flip phones have some of the best protections available: the sensors aren't there. You can't leak your location if there is no GPS module in your phone, you can't have your camera hacked if there is no camera and so on.
I'd prefer all this stuff came with physical switches so it can be enabled/disabled in a hack-proof manner.
You can't leak your location if there is no GPS module in your phone
While not as precise, you can definitely leak your location by scanning for the surrounding cell towers, especially in a city, which usually has hundreds or thousands of them (Manhattan alone has eleven, for example). I used to run a Python script on my Nokia phone that logged the tower ID, and I could reliably tell when I got to work, home, etc.
And that's just for people who control your phone. Your operator has U-TDOA¹, which is typically accurate to 50m.
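The tower-logging approach described above can be sketched roughly like this. Everything here is hypothetical for illustration: the cell IDs and place labels are made up, and a real script would read the current cell ID from the phone's API rather than from a hard-coded list:

```python
# Toy sketch: map observed cell tower IDs to coarse place labels.
# KNOWN_CELLS and the sample observations are invented for illustration;
# a real script would query the phone for the currently attached cell ID.
KNOWN_CELLS = {
    12345: "home",
    67890: "work",
}

def label_location(cell_id, known=KNOWN_CELLS):
    """Return a coarse place label for an observed cell tower ID."""
    return known.get(cell_id, "unknown")

# Logging (timestamp, cell_id) pairs over a day reveals a routine:
observations = [("08:00", 12345), ("09:30", 67890), ("18:15", 12345)]
log = [(t, label_location(c)) for t, c in observations]
```

Even this crude cell-level granularity is enough to infer home, workplace, and commute times, which is the point being made about flip phones not being location-proof.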
Sure, but that's telcos and the local law enforcement. It's not google, facebook, 500 advertising networks and a whole pile of other parties.
It's also not accurate enough to start targeting advertising and other nuisance information at me, even if there were a way to present it to me (which there isn't).
I'm well aware of the power of triangulation, I used to go fox hunting.
Sorry, I didn't explain myself well. I was just talking about the main towers; for each of those there are many smaller ones. Check out http://opencellid.org/, it's amazing, actually.
Corporations merging their databases. This is happening in real time, right now.
I don't have any illusions about being able to stay private from the eyes of nation state level adversaries but commercial entities can still be kept out if you try.
Signal has that too: https://whispersystems.org/blog/disappearing-messages/ And using GCM is only a problem for people running a custom Android ROM without Google Play Services; they can use MicroG instead. For the vast majority of people, who do have Google Play on their phone, this is completely irrelevant. Using GCM doesn't make Signal less private.
> Google doesn't see any data via gcm, it's just a tickle. If you want push messages, you gotta use a push network.
> I've also seen first hand how difficult 3rd party clients can be on large networks with actual client logic, and unfortunately we simply don't have the resources to deal with that.
> I hope that everyone here who prioritizes federation above all else moves to federated products that support their goals, and I hope that those projects can demonstrate that I'm wrong about the inability to build competitive user experiences over the long term.
> If the only thing that the remaining people here want out of LibreSignal is a websocket-only solution and gmscore isn't an option for whatever reason, I would consider a clean, well written, and well tested PR for websocket-only support in Signal. I expect it to have high battery consumption and an unreliable user experience, but would be fine with it if it comes with a warning and only runs in the absence of play services. However, I also realize that still won't help people that are trying to build a Google-free experience on Google's platform, since we still don't have the things we need to be comfortable distributing software outside of Play.
> The thing is, Wire is developed by a for-profit company that has yet to discover a sustainable business model. They seem to be in a hurry to gain users, boasting about their own app's security and privacy before it has ever been independently audited.
> Using the Service to communicate by chat, our servers store your encrypted messages and other encrypted content and log other information such as the time and date of your conversations, and the other user or users with whom you are communicating. When using the Service to make or receive calls, our servers log and collect time and date of your calls, and the other user or users with whom you are communicating.
> Certain information (e.g. a recipient's identifier, an encrypted message body, etc.) is transmitted to us solely for the purpose of placing calls or transmitting messages. Unless otherwise stated below, this information is only kept as long as necessary to place each call or transmit each message, and is not used for any other purpose.
> This was put to the test in the "first half of 2016", when Signal's developers received their first subpoena. According to the documents that were published by the ACLU and OWS https://whispersystems.org/bigbrother/eastern-virginia-grand... , the Signal servers only store the number you register with (which can be anonymous https://yawnbox.com/index.php/2015/03/14/create-an-anonymous... ), the time you registered and the last time you connected to the Signal server (the precision of which is reduced to the day).
At the end of the day, Signal doesn't transfer messages reliably, which is also an often-repeated complaint in the Google Play reviews. So I can't put my confidence in a messenger that can't reliably deliver instant messages. Not to mention that using it on the desktop means submitting to Google (it's a Chrome "app"), and Google also happens to be an OWS customer.
You can see if your message was sent to the server and if the message was delivered to your friend's phone. I haven't really had any problems delivering messages, apart from one time when they had server problems.
No need to use Chrome if you don't want to, Chromium also works.
> "Encryption works best if it's ubiquitous and automatic. The two forms of encryption you use most often -- https URLs on your browser, and the handset-to-tower link for your cell phone calls -- work so well because you don't even know they're there.
>
> Encryption should be enabled for everything by default, not a feature you turn on only if you're doing something you consider worth protecting.
>
> This is important. If we only use encryption when we're working with important data, then encryption signals that data's importance. If only dissidents use encryption in a country, that country's authorities have an easy way of identifying them. But if everyone uses it all of the time, encryption ceases to be a signal. No one can distinguish simple chatting from deeply private conversation. The government can't tell the dissidents from the rest of the population. Every time you use encryption, you're protecting someone who needs to use it to stay alive."
It's the least of Telegram's problems, but let's not forget their home-made crypto, even though better alternatives exist. See the take-home message here:
> "We stress that this is a theoretical attack on the definition of security and we do not see any way of turning the attack into a full plaintext-recovery attack. At the same time, we see no reason why one should use a less secure encryption scheme when more secure (and at least as efficient) solutions exist.
>
> The take-home message (once again) is that well-studied, provably secure encryption schemes that achieve strong definitions of security (e.g., authenticated-encryption) are to be preferred to home-brewed encryption schemes."
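The preference for authenticated encryption can be illustrated with a toy encrypt-then-MAC construction. This is a sketch only: the XOR "cipher" below is deliberately trivial and insecure (a real system would use a vetted AEAD scheme such as AES-GCM); what it demonstrates is the property Telegram's scheme lacked, namely that the ciphertext carries a MAC which is verified before any decryption, so padding or ciphertext tampering is detected instead of silently processed:

```python
import hashlib
import hmac

def encrypt_then_mac(enc_key: bytes, mac_key: bytes, plaintext: bytes):
    # Keystream derived from the key alone -- an insecure toy stand-in for
    # a real cipher; only the MAC construction is the point here.
    stream = hashlib.sha256(enc_key).digest() * (len(plaintext) // 32 + 1)
    ciphertext = bytes(p ^ s for p, s in zip(plaintext, stream))
    # Encrypt-then-MAC: authenticate the ciphertext itself.
    tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    return ciphertext, tag

def verify_and_decrypt(enc_key: bytes, mac_key: bytes,
                       ciphertext: bytes, tag: bytes) -> bytes:
    expected = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    # Constant-time comparison; reject BEFORE touching the plaintext.
    if not hmac.compare_digest(tag, expected):
        raise ValueError("MAC check failed: ciphertext was modified")
    stream = hashlib.sha256(enc_key).digest() * (len(ciphertext) // 32 + 1)
    return bytes(c ^ s for c, s in zip(ciphertext, stream))
```

The unchecked-padding attacks quoted below work precisely because modified ciphertext is decrypted and acted on; with a verified tag, a flipped bit simply raises an error.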
> "Abstract: The number one rule for cryptography is never create your own crypto. Instant messaging application Telegram has disregarded this rule and decided to create an original message encryption protocol. In this work we have done a thorough cryptanalysis of the encryption protocol and its implementation. We look at the underlying cryptographic primitives and how they are combined to construct the protocol, and what vulnerabilities this has. We have found that Telegram does not check the integrity of the padding applied prior to encryption, which led us to come up with two novel attacks on Telegram. The first of these exploits the unchecked length of the padding, and the second exploits the unchecked padding contents. Both of these attacks break the basic notions of security, and are confirmed to work in practice. Lastly, a brief analysis of the similar application TextSecure is done, showing that by using well-known primitives and a proper construction, provable security is obtained. We conclude that Telegram should have opted for a more standard approach.
>
> Conclusion: TextSecure is based on strong primitives that have withstood cryptanalysis from the crypto community for years, and these are combined in a way that provably provides authenticated encryption. Telegram, on the other hand, has crafted its own encryption scheme and deployed it in an unproven state, prior to any scrutiny from other cryptographers. We have seen this done time and time again, and rarely with good results. Take for example the smart grid meters that were shown to use terrible crypto back in April this year. Furthermore, the DH Ratchet is a very nice way of providing forward secrecy on a per-message basis with little overhead, which is an improvement over Telegram's one key per 100 messages approach."
After seeing your comments in this thread and looking at your comment history, I have to ask: What is your affiliation with Signal/Open Whisper Systems, if any?
Just very enthusiastic about Signal then, I guess?
Often when someone is so outspoken about a product it's because they have a vested interest in its success (and they don't always disclose that fact)... that's why I asked. Thanks.
More enthusiastic about privacy and free software. I often see worse apps recommended for privacy reasons, so why not bring up the flaws in those and what's better about this. If something better comes along I'll switch to that and recommend that instead.
> What I cannot comprehend is how respectable people and experts like Snowden and others from the EFF can get behind a messenger whose authentication is based on cell phone numbers!
Authentication isn't based on cell phone numbers; that's just the identifier. See "verify security code" here: https://www.whatsapp.com/faq/en/general/28030015 The problem, which the EFF does mention, is that "if your contact changes keys, this fact is hidden away by default."
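The idea behind "verify security code" can be sketched roughly as follows. This derivation is hypothetical (Signal's and WhatsApp's actual safety-number schemes are specified differently), but the principle is the same: both parties derive a short digest from the identity keys and compare it out of band, so a hijacked phone number with a substituted key produces a mismatching code:

```python
import hashlib

def fingerprint(my_identity_key: bytes, their_identity_key: bytes) -> str:
    """Derive a short, human-comparable code from both identity keys.

    Sorting the keys makes the code symmetric, so both parties
    compute the same value regardless of who runs the check.
    """
    material = b"".join(sorted([my_identity_key, their_identity_key]))
    digest = hashlib.sha256(material).hexdigest()[:20]
    # Group into 5-character chunks for easier reading aloud.
    return " ".join(digest[i:i + 5] for i in range(0, 20, 5))
```

If an attacker registers your number with their own key, the code your contacts compute against that new key no longer matches the one verified earlier, which is why the key-change notification matters so much.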
> When an application sends all your contacts to its servers (whether they are hashed or not) and, more importantly, when your whole access depends on a non-encrypted code sent via SMS
Correct me if I'm wrong but it seems as if you think that someone who hijacks your number will get access to some account where all your contacts are. That's not the case. The problem here is the same as above.
> and worst of all, your identifier can be tied to your real identity extremely easy, how can they call it secure at all?
> It is not all about E2E or how the crypto is designed or implemented, its also about your anonymity, your social graph and other pieces of information which are arguably more important not to give away!
On the first point: account authentication (when you set up your account or when you add a new device) is done via a non-encrypted text message delivered to you by the telco. This method is extremely insecure, as it has been used by state and non-state sponsored hackers to hijack accounts. IMHO the only reason a messaging service uses and relies on phone numbers to identify (and of course authenticate) accounts is to steal (that's how I see it) their users' contacts and force those contacts onto the service in order to grow the user base. Such an unethical and disturbing practice cannot be endorsed by an organization like the EFF.
Regarding the second point, as mentioned above, my problem is with the support EFF shows for such applications/corporations. If you are looking to avoid mass surveillance, of course the ability to be anonymous is critical.
> On the first point: account authentication (when you set up your account or when you add a new device) is done via a non-encrypted text message delivered to you by the telco. This method is extremely insecure, as it has been used by state and non-state sponsored hackers to hijack accounts.
Again, the problem here is that "if your contact changes keys, this fact is hidden away by default." If WhatsApp did that by default, like Signal, then you would know that the key had changed.
> IMHO the only reason a messaging service uses and relies on phone numbers to identify (and of course authenticate) accounts is to steal (that's how I see it) their users' contacts and force them onto the service in order to grow the user base. Such an unethical and disturbing practice cannot be endorsed by an organization like the EFF.
The phone number is used for contact discovery. You're not forced to do anything. For most people when they download a messenger they want to use it to talk to other people and they don't find it disturbing or unethical when that's possible.
> If you are looking to avoid mass surveillance, of course the ability to be anonymous is critical.
Luckily it's possible to use more than one app. I'm ok with my friends knowing who I am. This app makes it easy to find your friends. If you want to talk to people you don't know without them knowing who you are, there are other apps. That's not the purpose of this one. It doesn't make it bad, it doesn't make it insecure, it just means it's not for you.
1) Again, I'm not talking about verification of whoever is on the other side of the conversation; it's about hijacking the account (whether by breaking into the telco's system or gaining access to it via a court order). There are other means to verify the person you are talking to (signing a message at the beginning of the conversation using another app or piece of software), but if all it takes for someone to gain access to my account is a copy of that text (containing the authentication code), I'm not sure anyone can call this secure. IMO this security flaw is far more important than having E2E. I hope I was able to differentiate between authentication and verification.
2) If you are using this app, you are forced to give up a copy of all your contacts, and the app also scans for new contacts several times every hour! If this were an opt-in option, I wouldn't have any issue with it. Some people might favor convenience over security, as is their right, but forcing a social graph of all your friends out of you (almost always without even a simple warning) simply because you want to use the service is frankly disturbing.
3) Unluckily, there are no apps that have such strong E2E standard while implementing the points I raised.
What I'm more concerned about is the EFF's bar for endorsing a platform with such big flaws.
> 1) Again I'm not talking about verification of whoever is on the other side of the conversation, its about hijacking the account (whether by breaking into the Tel-Co system or having access to it using a court order).
What do you imagine happens when someone hijacks "the account"? They don't get access to your past conversations, they don't get access to your contacts. All that happens is that they can impersonate you, which your friends will notice when they are notified that the key changed.
> If you are using this app, you are forced to give up a copy of all you contacts and also the app is scanning for new contacts several times every hour!
I'm pretty sure it asks you and you have to give it permission. And again, most people WANT to find their contacts. What's the point of having a messenger and no one to send your messages to?
> If this was an opt in option
It is opt-in, no one is forcing you to use WhatsApp. It's not like people don't know that they will be able to contact their friends through WhatsApp and are shocked and dismayed when they find out that's the case. You do realize not every app in existence has to follow your requirements right? You're free to use something that does, but the reason the majority use WhatsApp is that it doesn't. That's not a bug, it's a design choice that you happen to disagree with.