On the security of WhatsApp and Telegram
Many people, even among security experts and privacy advocates, hold the firm belief that WhatsApp is more secure and better for privacy than Telegram. Having thoroughly studied the issue, I do not believe this to be true. In this article I will try to highlight the facts necessary to enable the reader to form a more informed opinion on the matter.
Prologue
Most sources which praise WhatsApp and criticize Telegram make bold claims, presenting them as objective truth, without sufficiently motivating them or backing them up with facts. In many cases they rely on arguments from authority:
[Telegram] By default, it is less safe than @WhatsApp, which makes [it] dangerous for non-experts. — Ed. Snowden (Sep, 2016)
I realize that many of you will be sceptical now. After all, if Edward Snowden said so, it must be true. Why would you question it? And why, above all, should we believe you instead?
Well, the point is exactly this. I am not asking you to believe me, but to evaluate the facts, with an open mind, before forming an opinion. Moreover, even if it might be hard to accept, even heroes and geniuses like Edward Snowden can be wrong from time to time!
I made a good faith attempt to research all the facts and present them as objectively as possible, without including personal biases. However, please keep in mind that I do not consider myself infallible, and thus there might be some mistakes. Feel free to let me know what you think.
The fallacies
The vast majority of things I read on this topic can be reduced to a combination of the following factors:
- Not considering a threat model.
- The false premise that end-to-end encryption alone is a necessary and sufficient condition for good security and privacy.
- The erroneous conclusion that everything that is not end-to-end encrypted is inherently less secure and must be avoided at all cost.
- The huge respect for Moxie Marlinspike, one of the authors of the end-to-end encryption protocol used by WhatsApp and Signal, leading to arguments from authority.
- Outright lies about the insecurity of the cryptographic protocol used by Telegram.
The rest of the article is dedicated to clarifying each of the points above. In section 3 I will articulate the importance of considering threat models when trying to determine if a system is secure. In section 4 I will discuss end-to-end encryption, the challenges that it entails and how it is implemented in WhatsApp and Telegram. In section 5 I will address the most common criticisms of Telegram, and finally, in section 6, I will draw some conclusions.
Threat modeling
Threat modeling answers questions like “Where am I most vulnerable to attack?”, “What are the most relevant threats?”, and “What do I need to do to safeguard against these threats?”. — Wikipedia
Searching on Hacker News for “WhatsApp is more secure” yields some typical comments from people arguing that WhatsApp is more secure than Telegram:
Telegram is more fun (bots and stickers), but WhatsApp is more secure (no messages on server, no rolled-your-own-crypto). — hn
Or even that Telegram is the least secure:
How can anyone call “Telegram” secure? Those days are over. It’s the least secure messaging app of them all now. Even WhatsApp is more secure than Telegram (let alone Threema). — hn
But, what does it really mean that WhatsApp is more secure? Secure against the government? Secure against your friendly neighbourhood hacker snooping on the Wi-Fi at Starbucks? Secure against throwing rocks at your phone? Secure against your mom trying to read your messages?
Defending from an adversary with practically unlimited budget and computational resources, such as a powerful government, is going to be much harder than defending against a curious neighbour sniffing on your Wi-Fi.
Saying that WhatsApp is more secure than Telegram, without specifying against what kind of adversary or against what kind of threat, does not mean much.
Craving “more security” without considering these questions is not necessarily a smart thing either. Putting your possessions in a nuclear bunker might be more secure, in case of a nuclear war, than locking them in a small safe in your room, but it may not be the best choice if all you are trying to do is protect your Blu-ray collection from your flatmate! Moreover, increasing security is often not free: it increases complexity, and thus development cost, and can decrease usability, exactly like a nuclear bunker is going to be much more expensive than a small safe and is also not going to be as easy to use. Lastly, it is important to note that you are only as secure as the weakest link in your system. Therefore, if you build a nuclear bunker, with all the challenges and costs that it entails, but then forget to put a lock on its door, it might not protect you that much after all.
Consequently, let us now try to examine which of the components involved in using a messaging app can be potentially attacked by a malicious agent, compromising our security. For each of these, you need to establish if you blindly trust it to function correctly, or if you believe it could be compromised and thus need to find a way to defend it from the potential threat.
- The companies running the servers needed by the app to work. That is,
Facebook, Telegram and whatever other third party services they decide to
use or share your data with.
- Are you OK with them being able to read your messages?
- Do you trust them to keep your data safe, so that, for example, it does not get stolen?
- Are you OK with them selling your data or meta-data to advertisers?
- The app itself. That is, the WhatsApp or Telegram apps on your
phone.
- Do you trust that it does exactly what it says and is not malicious? For example that it is always encrypting your messages.
- Do you trust its developers to be good citizens and not insert backdoors?
- The communication medium. That is, the Internet.
- Do you trust the connection between you and the app’s servers to be secure?
- Do you trust your ISP?
- The distribution and update process. For example, the Play Store, App Store or F-Droid. Do you trust Google and Apple to give you the real app, and not a specially crafted one to spy on you?
- The other apps on your phone. Do you trust all the other apps on your phone, or do you think some of them might be malware?
- The OS. That is, Android or iOS. Do you trust your operating system and its developers (e.g. Google, Samsung, Apple)? For example, do you trust that no malicious actor can remotely install or uninstall software on your device?
- The firmware and the hardware. Do you trust that your phone is not running malware at the firmware or hardware level, as was discovered on Samsung phones in 2014?
Whew, that was quite a list!
But, do we really need to care about all of this stuff? Well, not many people do. However, if you want to evaluate which messaging app best suits your needs and you care a bit about security and privacy, then it is essential. This process will enable you to decide for yourself, without basing your decision on some tweet saying that one app is more secure than the other.
Nonetheless, it is important to reiterate that if you are worried that a powerful adversary, like a government, might want to directly target you (as opposed to compromise you as part of a mass surveillance program), you cannot defend only against one attack vector and ignore the others. You can have the most secure messaging app in the world, but if your phone can be hacked with an SMS, there is not much of a difference.
You might wonder why we could not just defend ourselves against all possible threats and attacks. The reason is that, as we said before and as we will show in more detail in the next section, increasing security is not free: it comes at the cost of increased complexity and, very often, decreased usability.
End-to-end encryption
Let us now address the most common reason that induces people to believe that WhatsApp is a better choice: end-to-end encryption.
WhatsApp describes end-to-end encryption (or E2EE for short) in the following way:
End-to-end encryption ensures only you and the person you’re communicating with can read or listen to what is sent, and nobody in between, not even WhatsApp. This is because with end-to-end encryption, your messages are secured with a lock, and only the recipient and you have the special key needed to unlock and read them. All of this happens automatically: no need to turn on any special settings to secure your messages. — WhatsApp FAQ
WhatsApp nowadays has end-to-end encryption enabled by default for all chats, while Telegram has not enabled it by default and does not support it on group chats.
This is undoubtedly a very nice property to have. If implemented correctly it allows us to communicate securely, even in the case in which the server (i.e. WhatsApp or Telegram) cannot be trusted and is considered malicious. However, if we decide to go down this road and not trust any more the companies running our app, we introduce a series of new complications.
However, one could argue that even in the case in which we decide to trust the Service, it is beneficial to have end-to-end encryption. This is a valid point because, while we might trust the Service, we certainly do not trust a potential malicious actor who compromises the Service’s infrastructure and gains access to our data. In this scenario our data should still be safe, because not even the Service has access to it, and therefore an attacker can neither steal the decryption key nor coerce the Service into disclosing it. This is the major strength of E2EE. Nonetheless, it is worth stressing that in this scenario we assumed the Service can be trusted to have implemented E2EE correctly, even though we cannot verify it ourselves, and if that assumption fails to hold in practice, we lose all our guarantees.
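To make the core idea concrete, here is a minimal sketch of end-to-end encryption using the PyNaCl library. It only illustrates the basic principle that the relaying server sees nothing but ciphertext; the actual protocols used by WhatsApp, Signal and Telegram’s secret chats add key agreement, ratcheting, forward secrecy and authentication on top of this.

```python
# Minimal E2EE sketch with PyNaCl (pip install pynacl). Illustrative only:
# real messaging protocols layer ratcheting and authentication on top of this.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; only the public halves ever leave the device.
alice_sk, bob_sk = PrivateKey.generate(), PrivateKey.generate()
alice_pk, bob_pk = alice_sk.public_key, bob_sk.public_key

# Alice encrypts for Bob: this ciphertext is all the relaying server ever sees.
ciphertext = Box(alice_sk, bob_pk).encrypt(b"meet at noon")

# Only Bob, who holds bob_sk, can decrypt it.
assert Box(bob_sk, alice_pk).decrypt(ciphertext) == b"meet at noon"
```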
In the rest of the section I will highlight some of the issues that need to be addressed when implementing end-to-end encryption correctly and the corresponding usability trade-offs. At the end it should be clear beyond any reasonable doubt that it is impossible to consider WhatsApp’s end-to-end encryption secure if we do not blindly trust Facebook first.
To see this, remember that we are assuming that the Service (i.e. WhatsApp or Telegram) is not to be trusted with access to our messages. For this reason we decide to use end-to-end encryption. We want to make sure that they really cannot read those messages, we cannot just take their word for it.
Now consider the following scenarios.
Trust in the app
How do we know that the app installed on our phone is really using end-to-end encryption? Of course WhatsApp and Telegram promise us that they really are good boys. But remember: this time we decided we would not just take their word for it. What if, every once in a while, the app sent all our chats to the Service unencrypted? How would we know?
This means that we need a way to independently audit the app. We can increase our confidence that the app is not malicious if it has been audited by a third party or, even better, if it is developed in the open as a free and open-source project, ideally with the active involvement of the community and using well-tested libraries as its backbone.
Another possible solution is simply not to use the official app, and to use instead a third-party client that we trust more.
Unfortunately WhatsApp is proprietary software and its terms of service forbid reverse engineering it. Moreover, Facebook actively fights against any third-party clients. Telegram instead has free and open-source clients and even supports reproducible builds, which can increase the confidence that the app you are running is really built from the published code and is not just some malware.
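The idea behind reproducible builds is simply that anyone can compile the published source code and check that the result matches the binary being distributed. Below is a minimal sketch of such a check; the file names are hypothetical, and in practice the comparison has to account for signing data, as described in Telegram’s reproducible builds documentation.

```python
# Hypothetical reproducibility check: hash the APK you installed and the APK
# you built yourself from the published source, then compare. Names are made up.
import hashlib

def sha256_of(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

installed = sha256_of("telegram-from-store.apk")
rebuilt = sha256_of("telegram-built-from-source.apk")
print("builds match" if installed == rebuilt else "builds differ: do not trust")
```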
Secure backups
As we already discussed previously, enhanced security often leads to usability challenges. Secure handling of backups is a perfect example of this principle.
We decided to use end-to-end encryption because we wanted to make sure that only the intended recipient can read our messages. However, end-to-end encryption only protects our messages in transit. What happens when we receive a message? What if some of our contacts are storing a copy of their messages unencrypted somewhere on the cloud? That is bad: we started from the assumption of not wanting to trust the Service with access to our messages, and now we are forced to trust whatever external backup service (e.g. Google Drive, iCloud) our contacts decide to use!
One of the benefits of having our messages stored encrypted with a key that is known only to the sender and the recipient is that, even if some attacker obtains those encrypted messages, there is no way they can read them. Not even by forcing the Service to disclose the encryption keys, because the Service does not have access to the keys. However, this guarantee is lost if an attacker can steal, or in the case of a government, force the disclosure of, our unencrypted backups from Google Drive or iCloud.
Unfortunately, backups made with WhatsApp are only partially encrypted. Messages are encrypted, while images, videos, documents and audio files are not. You can verify this yourself. Moreover, WhatsApp must have access to the decryption key, because it allows restoring backups without prompting for a password. You can opt out, but since backups are a very important feature most people have them enabled, for the simple reason that if you opt out and then lose your phone, your messages are lost forever.
Telegram, on the other hand, does not allow backups of secret chats (i.e. their name for end-to-end encrypted conversations).
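To make the distinction concrete, here is a minimal sketch of a password-protected backup: the key is derived from a passphrase that only the user knows, so neither the Service nor the cloud provider storing the backup can decrypt or restore it. This illustrates the general technique (a key-derivation function plus symmetric encryption), not the actual backup format of either app.

```python
# Illustrative password-protected backup: the Service never learns the key.
import base64, os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def backup_key(passphrase: bytes, salt: bytes) -> bytes:
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase))

salt = os.urandom(16)  # stored next to the backup; it is not a secret
encrypted = Fernet(backup_key(b"long passphrase", salt)).encrypt(b"chat history")

# Restoring requires the passphrase; without it the ciphertext is useless.
assert Fernet(backup_key(b"long passphrase", salt)).decrypt(encrypted) == b"chat history"
```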
Authentication
End-to-end encryption requires messages to be encrypted with the recipient’s public key. This ensures that only the corresponding private key, possessed by the recipient, can decrypt them.
However, how do we obtain the recipient’s public key? And how do we make sure it is the correct one? It is good to know that our messages are protected against potential eavesdroppers, however it is not much use if we are sending them to the wrong person!
The correct way to solve this problem is to verify each other’s public keys using a different secure communication channel. For example, you should meet in person with each and every contact that you care about and scan their QR code to authenticate them, or, perhaps more simply, do it through a normal phone call. If you are not doing this, you are simply trusting the Service to provide you with the correct public key, completely defeating our initial goal.
Another important detail is that public keys can change, for example when you change your phone, requiring you to repeat the manual authentication process all over again. Any application that implements end-to-end encryption seriously must notify you when one of your contacts’ public keys changes, as it means that you can no longer be sure who you are talking to. It could just mean that your contact bought a new phone, or maybe that your contact’s phone has been stolen and somebody is impersonating them! You have no way of knowing without first repeating the manual authentication process.
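A minimal sketch of how a client can handle this, assuming a trust-on-first-use policy: derive a short fingerprint from the contact’s public key, ask the user to compare it over a second channel, and warn loudly whenever a stored fingerprint changes. The function names and the fingerprint format are hypothetical; WhatsApp’s security codes and Telegram’s key visualizations are their own encodings of the same idea.

```python
# Hypothetical trust-on-first-use key check; not actual WhatsApp/Telegram code.
import hashlib

known_fingerprints: dict[str, str] = {}  # contact -> fingerprint seen so far

def fingerprint(public_key: bytes) -> str:
    return hashlib.sha256(public_key).hexdigest()[:16]  # short, human-comparable

def on_key_received(contact: str, public_key: bytes) -> None:
    fp = fingerprint(public_key)
    previous = known_fingerprints.get(contact)
    if previous is None:
        print(f"{contact}: new contact, compare {fp} over a call or in person")
    elif previous != fp:
        print(f"{contact}: KEY CHANGED ({previous} -> {fp}), re-verify before sending!")
    known_fingerprints[contact] = fp
```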
Unfortunately, WhatsApp does not show any notification on key changes, unless you manually turn on “Show Security Notifications” in the settings, and even then the message that is displayed does not give any hint of the possible security threat. This is in apparent contradiction with their FAQs, which claim there is “no need to turn on any special settings to secure your messages”. Moreover, even with that setting enabled, WhatsApp forces the client to automatically re-send undelivered messages and re-encrypt them with the new key.
Telegram does not handle this perfectly either. Secret chats are tied to a user’s public key. Therefore, if a user changes keys, a new secret chat needs to be manually started. The problem is that Telegram does not notify you that the old chat is now useless. If you do not know that the person you are chatting with changed keys, you are going to send messages encrypted with the wrong key and nobody is going to receive them. Although, unlike with WhatsApp, this behaviour does not let you unknowingly send messages to an attacker, it is a rather severe usability deficiency.
Device synchronization
Another trade-off between security and usability is evident in the case of device synchronization. If the messages are encrypted with a key that is unique for each device, it becomes difficult to let users access their conversations on multiple devices at the same time. How can you decrypt a message on your computer if the required private key is stored on your phone?
The solution that WhatsApp adopted to this problem in their WhatsApp Web client is to require the phone to be online. All the traffic is then routed through the phone, which in turn performs the encryption and decryption of messages. Therefore, WhatsApp Web is not a real standalone client, as it cannot work without the phone. The drawback is that you cannot use WhatsApp Web if, for example, your phone’s battery is dead. Even worse, your phone becomes a single point of failure, because if it is lost or stolen you will not be able to access your account until you manage to obtain a new copy of your SIM card. Moreover, being a JavaScript web application, WhatsApp Web requires that, on every use, you trust Facebook not to inject malicious JavaScript. There is no way of using it without trusting Facebook.
Not having end-to-end encryption by default, while requiring trust in the Service, makes life much easier on the usability side for Telegram. There are real, standalone clients available for every platform and device synchronization is seamless. In case you lose your phone, you can still access your account as long as you were logged in on a second device, such as your computer.
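For completeness, one way to support multiple devices without sharing a single private key is to register a separate key pair per device and encrypt each message once per device, which is roughly the approach Signal later took with its multi-device support. Below is a minimal sketch of that fan-out idea, with hypothetical names and PyNaCl sealed boxes standing in for the real protocol.

```python
# Hypothetical per-device fan-out; illustrative, not any app's actual protocol.
from nacl.public import PrivateKey, SealedBox

bob_devices = {"phone": PrivateKey.generate(), "laptop": PrivateKey.generate()}

# The sender encrypts a copy of the message for every registered device key.
envelopes = {name: SealedBox(key.public_key).encrypt(b"hello")
             for name, key in bob_devices.items()}

# Each device decrypts only its own copy, with its own private key.
assert SealedBox(bob_devices["laptop"]).decrypt(envelopes["laptop"]) == b"hello"
```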
Criticism of Telegram
We talked about the challenges introduced by E2EE and how WhatsApp mainly fails to address them properly. Let us now talk about the main issues that people have with Telegram’s security.
No end-to-end encryption by default
A popular criticism of Telegram, and the main reason it is considered less secure than WhatsApp, is that it does not support end-to-end encryption by default.
We already talked, in section 3, about how arguing that something is “more secure” without considering threat models is not very meaningful. We also discussed, in section 4, the non-trivial challenges that arise when implementing end-to-end encryption, and how WhatsApp fails to address them.
However, we could wonder why Telegram didn’t come up with a solution to those challenges that would allow them to enable end-to-end encryption by default without sacrificing any usability.
The answer is that nobody knows how to accomplish this, yet.
There are messaging apps that correctly implement end-to-end encryption, without requiring the user to trust the Service, and also try to address the mentioned usability challenges, such as Signal, Keybase, Element, etc. However, it is not controversial to state that their usability is not yet comparable to that of Telegram (even though they are constantly improving and hopefully will get there soon). For example, Signal lacks, at the moment, standalone desktop clients.
Telegram decided not to make any compromises on usability, requiring users to trust them to keep their data safe. Advanced users, who need end-to-end encryption and are willing to trade a bit of usability for improved security guarantees, can do so by using secret chats. This choice is very often heavily criticized by tech-savvy users, who prioritize security well above everything else, including usability. However, what is right for one does not have to be right for everyone. In fact, the popularity of Telegram shows that many are happy with this choice.
Pavel Durov, one of the founders of Telegram, further defends his choice of not enabling end-to-end encryption by default, by describing their “distributed cross-jurisdictional encrypted cloud storage” infrastructure and claiming that, to this day, they “have disclosed 0 bytes of user data to third parties, including governments”. These are all positive things, however, it is important to remind the reader that there is no way, to my knowledge, of verifying these claims, without trusting Telegram.
They rolled their own crypto
As soon as Telegram was released, it was criticized for rolling its own crypto and for making some non-standard choices in its protocol design.
[..] they’ve made some extremely unusual protocol choices that they need to publicly justify rather than simply describing in an API doc. [..] — Moxie Marlinspike (Dec, 2013)
Developing a new cryptographic protocol, even more so by someone that is not a cryptographer, is generally considered a bad idea.
Anyone can design a cipher that he himself cannot break. This is why you should uniformly distrust amateur cryptography, and why you should only use published algorithms that have withstood broad cryptanalysis. All cryptographers know this, but non-cryptographers do not. And this is why we repeatedly see bad amateur cryptography in fielded systems. — Bruce Schneier (May 2015)
This is something that makes sense, especially if there is already an established protocol that suits your needs. However, it does not mean there cannot be exceptions to this rule, otherwise we would never have new protocols.
The Telegram team believed they had valid reasons to create their own protocol, as you can read in one of the replies they gave to the critiques above:
Still, there are sometimes valid reasons for not re-using existing solutions. In our case, we needed something that is both secure and competitive in comparison to mass market solutions in terms of speed, working on weak connections and usability. — Telegram (Dec, 2013)
Moreover, Pavel Durov himself explicitly voiced his scepticism towards “best practices” introduced by the crypto community, which can be interpreted as another of Telegram’s motivations for implementing a new protocol:
[..] I don’t see anything wrong with different teams trying different approaches. [..] What if some of the common “best practices” are intentionally promoted in the crypto-community as the best ones exactly because they contain flaws and backdoors? — Pavel Durov (Dec, 2013)
This may seem a bit extreme, however keep in mind that this was 2013, the year of Snowden’s revelations, which confirmed that this had happened before (for example with the NIST-standardized Dual_EC_DRBG random number generator), and it is not unreasonable to believe that it could happen again.
Finally, Durov also tried to reassure the public about the skills of their team, defending its members from the accusation of being amateurs or a bunch of randoms:
The team behind Telegram, led by Nikolai Durov, consists of six ACM champions, half of them Ph.Ds in math. It took them about two years to roll out the current version of MTProto. Names and degrees may indeed not mean as much in some fields as they do in others, but this protocol is the result of thoughtful and prolonged work of professionals.
Ultimately, everyone has a different opinion on whether it would have been better for Telegram to use an already existing protocol, or at least to design their own making more conventional choices. It is impossible to know now what would have happened had they made a different choice. The fact remains that they did roll their own unconventional protocol, despite the crypto community ridiculing them as amateurs and labelling the protocol as insecure.
However, Telegram was launched 8 years ago. Enough time has passed to give us a chance to stop an endless debate based on opinions, and evaluate the facts. Was this protocol secure, as Telegram stated, or was it something completely insecure that should never be used, as many experts claimed? And what about its present-day security?
Let us find out.
History of Telegram vulnerabilities
This section mentions a few vulnerabilities found in old versions of Telegram’s protocol. However, it is worth noting that the second version of the protocol has been recently audited and its correctness formally proven.
(Dec 21, 2013) Telegram protocol defeated
Telegram was initially launched between August (iOS) and October (Android) 2013, and the first (and only) serious vulnerability was found just two months later.
A Russian IT-community user discovered that the modified version of the Diffie-Hellman key exchange used by Telegram’s secret chats, which added a nonce generated by the server, allowed the server to perform a Man-in-the-middle attack.
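The gist of the problem, as reported at the time, was that the final secret-chat key mixed in a nonce chosen by the server. The toy sketch below illustrates why that class of design is dangerous: a malicious server running a separate key exchange with each party can pick the nonces so that both sides end up with a key the server already knows. The numbers and the key derivation are deliberately simplified and do not reproduce MTProto itself.

```python
# Toy illustration only: tiny numbers and a simplified derivation (dh XOR nonce).
p, g = 2**61 - 1, 5                     # toy group; the real protocol uses 2048-bit groups

a, b, s = 123456789, 987654321, 424242  # Alice's, Bob's and the malicious server's secrets
A, B, S = pow(g, a, p), pow(g, b, p), pow(g, s, p)

# The server intercepts A and B and hands each party its own value S instead.
alice_dh = pow(S, a, p)                 # what Alice computes; the server knows it too
bob_dh = pow(S, b, p)                   # what Bob computes; the server knows it too
assert alice_dh == pow(A, s, p) and bob_dh == pow(B, s, p)

# If the final key is "DH result XOR server-chosen nonce", the server can force
# both parties onto the same key K of its choosing and read every message.
K = 0xC0FFEE
nonce_for_alice, nonce_for_bob = alice_dh ^ K, bob_dh ^ K
assert (alice_dh ^ nonce_for_alice) == (bob_dh ^ nonce_for_bob) == K
```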
Telegram awarded the developer that found the vulnerability $100,000 and quickly fixed the issue. They also motivated their changes to the standard DH algorithm as follows:
These nonce numbers were introduced to add more randomness to the secret chat keys, mostly because of possible undiscovered vulnerabilities of the random generators on mobile devices (for example, one such vulnerability was found this August in android phones). — The Telegram Team
This was, as far as I am aware, the only real vulnerability ever found in Telegram. The rest of the section reports less severe, impractical vulnerabilities, or non-vulnerabilities that are often used to support the claim that Telegram is insecure, and can thus be safely skipped if one does not care about such details.
(Jan 9, 2015) A 2⁶⁴ Attack On Telegram
It was reported that someone with access to Telegram’s servers could perform a Man-in-the-middle attack requiring 2⁶⁴ computations.
The original authors estimated the cost of the attack in the tens of millions of USD. Telegram estimated an even higher cost:
Our calculations show that for this kind of attack to succeed in one month — even with all possible optimizations using cycle finding, etc. — the attacker would need equipment worth around a trillion dollars. And the attack would consume approximately 50 billion kW.h of electricity in the process, worth hundred millions of dollars. All of this for just one secret chat. Deployment of FPGA and ASICs might lower this estimate by an order of magnitude, but the attack would still remain infeasible. — The Telegram Team
Recent versions of Telegram use 288-bit key fingerprints, rendering this attack completely infeasible.
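For context, the cost of this kind of attack is governed by a birthday bound: the man in the middle needs the key fingerprints shown to the two users to collide, which takes roughly 2^(n/2) operations for an n-bit fingerprint. The back-of-the-envelope figures below are an assumption about the attack’s structure, not an analysis of MTProto internals.

```python
# Rough birthday-bound estimate for a fingerprint-collision man-in-the-middle.
for bits in (128, 288):
    print(f"{bits}-bit fingerprint -> about 2^{bits // 2} operations")
# 128-bit fingerprint -> about 2^64 operations   (consistent with the reported attack)
# 288-bit fingerprint -> about 2^144 operations  (far beyond any conceivable computation)
```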
(Feb 23, 2015) Telegram App Store Secret-Chat Messages in Plain-Text Database
Zimperium reported that if a user’s phone is compromised and then a kernel exploit is used to gain root privileges, then Telegram’s secret chats can be read from memory or from its private storage.
This seems totally obvious. If your kernel is compromised, there is no secure messaging app that is able to protect you.
(Dec 8, 2015) On the CCA (in)security of MTProto
It was reported that MTProto 1.0 was not IND-CCA secure. The authors state that there is no practical attack:
We stress that this is a theoretical attack on the definition of security and we do not see any way of turning the attack into a full plaintext-recovery attack.
Telegram clarified that this is not a practical vulnerability in their FAQs. Moreover, “MTProto 2.0 satisfies the conditions for indistinguishability under chosen ciphertext attack (IND-CCA)”.
(Aug, 2016) Iranian hackers
Reuters reported that Iranian hackers had compromised some Telegram accounts by intercepting SMS verification codes, and had managed to confirm the existence of Telegram accounts for 15 million Iranian phone numbers.
Attacks relying on intercepting SMS-verification codes are a known threat to all messaging apps which rely on phone numbers. On Telegram they can be avoided by enabling 2-Step Verification.
Regarding the identification of accounts, Telegram stated that it is known that any party can check whether a phone number is registered in the system. They also introduced a limitation in their APIs to avoid such mass checks.
(Mar, 2020) 42 million Iranian user IDs and phone numbers leaked
The official Telegram app is blocked in Iran, so millions of users installed an unofficial client. This is unfortunate. However, it is not a Telegram vulnerability.
Defamation
Defamation: the act of communicating false statements about a person that injure the reputation of that person. — Merriam-Webster
A big chunk of the criticism of Telegram amounts to defamation, lies and arguments from authority. Unfortunately this is not an opinion, but a verifiable fact. Even more unfortunate is the fact that many of these come from respected figures of the computer security community. In this section I provide three examples, but you can easily find many more if you search a bit.
Disclaimer. I firmly believe one should assume good faith whenever possible. If someone says something wrong, they are probably just wrong, they didn’t necessarily mean any harm. However, one should also be careful not to be fooled. The quotes reported in this section are from experts of the field whose job relies on knowing these things, or from journalists whose job is to verify the truth of what they publish. It cannot be assumed they just did not know what they were talking about.
(Dec, 2015) Ptacek and Marlinspike
Take, for example, this tweet:
By default Telegram stores the PLAINTEXT of EVERY MESSAGE every user has ever sent or received on THEIR SERVER. — Thomas H. Ptacek (Dec, 2015)
Ptacek shows no proof for his statement. So, either he has proof to back his claim, but, for some reason, decided not to disclose it, or he is just lying.
Moreover, the intent is clearly sensationalist. They store EVERY MESSAGE on THEIR SERVER! Where else would they store them? On YOUR server?
The reply from Durov didn’t take long:
This is false: @telegram never stores plaintext of messages, and deleted messages are erased forever. Do you get paid for posting BS? — Pavel Durov
This is reasonable. What reason would Telegram have to store the plaintext of the messages? It does not take that much effort to encrypt them at rest. Of course, they can still access them, but that is a very different thing than storing them without any encryption.
But wait for it, here is the icing on the cake:
@durov If you can’t admit you have plaintext access to everyone’s default msgs, I know for sure you’re being intentionally deceptive. — Moxie Marlinspike
Well done! First they lured Durov into replying to a bogus claim, and once he fell for it, Moxie delivered the final blow. 👏
Let me explain, in case this is not clear. Here Moxie is pretending the discussion was about having plaintext access, which obviously Telegram has for non-secret chats, instead of plaintext storage, which is what Ptacek was talking about. Then, he accused Durov of being deceptive if he cannot admit this secret truth.
To summarize, the original tweet could have been stated as: “By default Telegram does not use end-to-end encryption, and thus can potentially read your messages”. This could still have been interesting for users who do not know about the benefits of E2EE. However, without the capslock and the unfounded claim that they store the PLAINTEXT of EVERY MESSAGE, it probably gets a little less interesting.
(Nov, 2015) Daily Dot
Let’s see another example. In November 2015, the Daily Dot published an article titled “Cryptography expert casts doubt on encryption in ISIS’ favorite messaging app”.
Telegram, the encrypted messaging app of choice for terrorist groups like ISIS, may not be as secure as the company wants people to think.
Telegram makes some pretty bold claims about the security of its application, but a cryptography expert said that the algorithm and methods that it uses to encrypt messages between users are “made up.”
“They basically made up a protocol,” Matthew Green, a professor of cryptography at Johns Hopkins University, told the Daily Dot. “According to their blog post, they have a couple of really brilliant mathematicians who aren’t really cryptographers but were smart so they came up with their own protocol. It’s pretty crazy. It’s not something that a cryptographer would use. That said, I don’t know if it’s broken. But, it’s just weird.”
Let’s see what we have here:
- Trying to damage Telegram's reputation by associating it with ISIS
- Argument from authority (an expert said this, it must be true)
- Ad hominem attack on Telegram's team
- Admitting, in the end, that they don't really know if it's broken
There are even more unfounded claims in the article, if you are interested.
(Jun, 2016) Gizmodo
Lastly, I will show some highlights from a 2016 article by Gizmodo, titled “Why You Should Stop Using Telegram Right Now”. This article became quite popular and the Telegram Team even posted a reply titled “Should you stop reading Gizmodo right now?”. I encourage you to check them both out for a better understanding.
Here we go:
- “One major problem Telegram has is that it doesn’t encrypt chats by default”
They are confusing encryption and end-to-end encryption.
- “Contrary to the opinions of almost every encryption and security expert, Telegram’s FAQ touts itself as more secure as WhatsApp. But in reality, WhatsApp uses the most highly praised encryption protocol on the market and encrypts every text message and call by default.”
We already discussed why it is not sufficient to implement end-to-end encryption to have a secure app, and the flaws in WhatsApp’s implementation.
- “experts also indicate that the actual encryption technology is flawed”
If it is that flawed, it should be easy to back these claims with some evidence.
- “That’s the trouble with security by obscurity. It’s usual for cryptographers to reveal the algorithms completely, but here we are in the dark. [..]”
That’s just false. The protocol is open. The client is open source.
- “Unless you have considerable experience, you shouldn’t write your own crypto. No one really understands why they did that.”
If you phrase it like that, it seems they were totally out of their minds. The truth is that they stated their reasons for going down that road, but it is clearly easier to say that “no one really understands”, instead of doing some research.
It is worth noting that in 2019, Gizmodo realized their article was wrong and amended it clarifying that “client-server communication is encrypted by default”.
Conclusion
Let us finally draw some conclusions from all this.
- Not everybody has the same security and privacy requirements. Nobody that I know of routinely compares public key fingerprints before starting every E2EE chat. Most of them care about device synchronization and not losing their chat history, though. These people are happily trusting the Service with their data and that is OK. It may be sad, but it is what it is. On the other hand, a whistleblower, or someone concerned about being targeted by a government, for example, is going to be much more careful and will probably take many extra steps to safeguard their security.
- Even if it is a very nice property to have, E2EE is hard to get right and introduces a series of usability challenges. We showed how WhatsApp’s implementation is not solid, because by default it totally ignores key changes and it encourages users to make unencrypted backups. Although it would be nice if there were a perfectly secure and privacy-respecting messaging app that also had great usability, we are simply not there yet.
- Preventing the community from verifying the correctness of an E2EE implementation severely limits its usefulness. Having to completely trust the Service to implement it correctly almost defeats its purpose. Unfortunately that is the case with WhatsApp, which has closed-source apps, forbids reverse engineering and routinely blocks third-party clients. Of course, if you equally trust two Services and one claims to have better security, it is probably better to choose that one, even if you cannot verify their claims.
- Be aware of who you are trusting. Using some apps might require a considerable level of trust in the Service, as is the case with both Telegram and WhatsApp. It is, thus, paramount to do some research and understand if that trust is totally misplaced or well deserved. This is more subjective than an exact since. However, some examples are to review the vulnerability history of the Service, to assess how seriously they take the security of their product, and to avoid trusting companies whose commercial interests consist of selling your private data.
I included the history of vulnerabilities of Telegram in this article, and planned to do the same for WhatsApp. However, WhatsApp’s history is much longer and deserves an article on its own, which I hope to write in the future.
- Finally, denigrating others is not nice. Attacking and undermining them just because they are not part of the cool kids’ club is despicable and may take a toll on your reputation. Please stop.
Comments
Except when they want to chat with more than one person at once. Telegram does not have any support for encrypted group conversations.
Otherwise a good read. Telegram is not a bad app, but it does not suit my threat model. I'm willing to forgo cloud backups and some usability to have default encryption for all my conversations, which I think is something Signal provides. None of these apps are perfect, it comes down to what combination of trade-offs works best for you.
> I'm willing to forgo cloud backups and some usability to have default encryption for all my conversations, which I think is something Signal provides.
Indeed, it does.
> None of these apps are perfect, it comes down to what combination of trade-offs works best for you.
That is exactly the take-home message :)
Anyway, I mentioned that Telegram does not support e2ee for group chats here:
> WhatsApp nowadays has end-to-end encryption enabled by default for all chats, while Telegram has not enabled it by default and does not support it on group chats.
Note however that group chats are even more difficult to handle securely, because in theory you are supposed to verify the identity of every participant.
That said, I liked this article a lot since it puts things in a manner that focuses on how to approach these comparisons and backs them up with relevant information. Elsewhere, there’s too much appeal to authority that ignores other points (mainly nuances that are important).
I completely agree with this part:
> A big chunk of the criticism of Telegram amounts to defamation, lies and arguments from authority. Unfortunately this is not an opinion, but a verifiable fact. Even more unfortunate is the fact that many of these come from respected figures of the computer security community.
A few corrections and additions are required in the article:
* The part about Signal not having a standalone desktop client is not true. This was already pointed out in another comment here. Signal has had this for a few years now.
* “This is more subjective than an exact since.” — there’s a typo here for “science”.
* I didn’t see mention of metadata collection by WhatsApp. That’s as important as the content of messages.
The author claims that this is "defamation" because Telegram uses FDE or a similar solution.
With the deliberate misunderstandings apparent in this article I don't see why it would be inappropriate to call the author out for being a Telegram shill.
>Here Moxie is pretending the discussion was about having plaintext access, which obviously Telegram has for non-secret chats, instead of plaintext storage, which is what Ptacek was talking about
The whole idea of "plaintext storage" is something that the author came up with themselves, tptacek claimed that Telegram "stores the PLAINTEXT of EVERY MESSAGE". These mean entirely different things. Plaintexts are still stored even if they are encrypted on disk with keys controlled by Telegram.
You even discuss this issue in the "History of Telegram vulnerabilities", but don't bother to mention the fact that this was almost certainly a deliberate backdoor.
You also seem to suggest that DUAL_EC_DRBG was promoted as a best practice by the crypto-community, what an utterly bizarre claim.
Of course, the mental gymnastics in the "Defamation" section make it clear that this was never intended to be an honest analysis.
Can a government or a company mass-harvest chats to classify users into buckets, and use this data to manipulate people? We have seen this happen with Cambridge Analytica. Think of a military having a list of all pro-democracy people before staging a coup.
In my opinion this is partially addressed in the threat modelling section, where I mention the need to trust "The companies running the servers needed by the app to work".
Anyway, I believe the threat you mention is a very difficult one to defend against, because probably even metadata alone is sufficient to construct a graph of relations. So I may be wrong, but if you do not want to trust any company at all, then even Signal may not be enough for you in this scenario. Regarding the choice of WhatsApp vs Telegram for this scenario, you simply have to decide whether you trust Facebook (which we already know supplies this kind of mass data to the US government) more than the Telegram team. Or you can trust neither.
My opinion on Signal is that it should definitely be preferred if one cares about security more than usability. I really cannot wait for it to have a "standalone" client (that is, one that does not require the phone to be online as well).
There are other messaging apps, like Element (and the now defunct Keybase) which try to solve the same problems. So, I decided to keep that discussion for another future article (maybe).
Signal does not require the phone to be online as well. Source: just switched my phone off and still able to send and receive messages on the desktop app. WhatsApp, however, still very much requires the phone to be online for its web/desktop clients to work.
That said, Signal is still not a "standalone" app on desktop because it needs me to have installed and set up the app on my phone to link it to desktop. After this though, they are very much independent clients.
What happened to me is that I lost my phone, so I did not have an Android device to re-install Signal. I later managed to get back the SIM card and I assumed that I could use the Desktop client, but if I remember correctly it did not work. However, I will check all of this again.
Cryptography is a very complex field, and Telegram has made many bizarre design decisions which make it difficult to trust them, despite the fact that their encryption has not been publicly broken recently.