Low Chance Telegram Actively Helps the FSB — Higher Chance the FSB Can Spy Anyway
This post was generated entirely by Grok. I was curious what it thinks about the state of Telegram in 2025.
The KGB (Committee for State Security) was the Soviet-era intelligence agency dissolved in 1991. Its primary successor is Russia's Federal Security Service (FSB), which handles domestic intelligence and surveillance; any reference to "KGB" in this context effectively means the FSB or the broader Russian state security apparatus. Telegram, founded by Russian expatriate Pavel Durov in 2013, has a complex history with Russian authorities: Durov fled Russia in 2014 after refusing FSB demands for user data from his earlier platform, VKontakte. However, recent investigations and policy shifts have raised questions about potential indirect or future data access by Russian entities.

Based on available evidence as of December 2025, the chance of Telegram directly sharing user data (e.g., IP addresses, phone numbers, or message content) with the FSB is low but not zero: estimated at 10-20%. This assessment factors in Telegram's public denials, its operational independence from Russia (it is headquartered in Dubai), and Durov's ongoing legal battles with Western governments, which make overt cooperation with Russia politically and legally risky. The risk of indirect access or surveillance is higher (40-60%), owing to documented ties between Telegram's server infrastructure and FSB-linked entities; these could enable traffic monitoring or metadata extraction without Telegram's explicit consent. Telegram's post-2024 policy shift toward broader data sharing with any government (initially limited to terrorism cases) also increases the theoretical risk, since Russia could issue a "valid legal request" that Telegram deems legitimate.

This is not a probabilistic model but an informed estimate drawn from investigative reports, transparency data, and expert analyses. Key factors influencing the odds:
- Low direct risk drivers: No confirmed instances of FSB data handovers in Telegram's reports; Durov's anti-Russian stance (e.g., enabling uncensored Ukraine war coverage); potential backlash from Telegram's 1 billion+ global userbase if exposed.
- Elevated indirect risk drivers: Server operators with FSB contracts; Russia's SORM (System for Operative Investigative Activities) laws mandating telecom surveillance; FSB's history of intercepting Telegram traffic in occupied regions.
- Wildcards: Escalating global pressures (e.g., Russia's 2025 restrictions on Telegram for non-compliance) could force concessions, or Durov's French legal issues might indirectly benefit Russian narratives.
Key Evidence from Investigations (2025 Focus)

Recent reports highlight infrastructure vulnerabilities rather than proven data dumps. Here's a summary:
| Source | Date | Key Findings | Implications for FSB Access |
|---|---|---|---|
IStories / OCCRP ("Telegram, the FSB, and the Man in the Middle") | June 2025 | Telegram's servers (including in Miami) are managed by Global Network Management (GNM), owned by Vladimir Vedeneev. His other firms (e.g., Globalnet) have FSB contracts for traffic monitoring via Deep Packet Inspection (DPI) and SORM compliance. Vedeneev has exclusive access to Telegram hardware. | Enables potential "man-in-the-middle" surveillance of metadata (e.g., auth_key_ids for tracking users globally). No proof of content decryption, but FSB could monitor connections without Telegram's involvement. |
Moscow Times / Kyiv Post | June 2025 | Vedeneev's companies serviced Russian defense/intel agencies; 2022 "compromise" allegedly allowed FSB-monitored infrastructure for terrorism probes. | Suggests backdoor-like access for targeted users (e.g., in Ukraine/Russia). Experts like Zair-Bek Vepkhvashvili warn of "undisclosed cyber espionage tools." |
Meduza / Department One ("Chats and Punishment") | June 2025 | FSB charged Russians with treason based on intercepted bot interactions with Ukrainian channels; auth_key_ids exposed via SORM or server access. | Demonstrates real-time FSB monitoring of private messages in specific cases, possibly via infrastructure ties rather than direct shares. |
The Guardian / Various | Aug 2025 | Russia throttled Telegram/WhatsApp for refusing to localize data and cooperate with Roskomnadzor/FSB. | Indicates non-cooperation now, but ongoing pressure could lead to future deals. No data shared yet per reports. |
WIRED (historical context) | 2023 (relevant to patterns) | Russian firm TGStat (using Telegram API) archives public data and must hand it to FSB by law. | API enables mass surveillance of public channels without Telegram's direct role, but scales to private if combined with infrastructure access. |
No 2025 reports confirm direct FSB handovers, but the pattern echoes Durov's 2013 cooperation with authorities at VKontakte (pre-exile), fueling skepticism.

Evidence from Telegram to Counter Doubts

Telegram has provided some assurances, but they are largely declarative (e.g., blog posts, denials) rather than forensic proof such as server audits or zero-knowledge verifications. Critics argue this leaves room for doubt, especially given the infrastructure issues. Durov and Telegram emphasize resistance to authoritarian overreach, but the evidence is more about what they claim not to do than verifiable non-sharing.

Primary evidence provided:
- Official Denials and Rebuttals:
- In June 2025, Telegram rejected the IStories/OCCRP investigation as "misleading," stating that Vedeneev's role is "nominal" (technical support only) and that GNM complies with standard telecom laws but doesn't enable surveillance (occrp.org). They accused reporters of "technological incompetence" and threatened lawsuits.
- Durov's public statements (e.g., on X/Twitter) highlight anti-Russian actions: Telegram defied 2018-2020 Russian blocks, hosts opposition channels, and shares no data with Moscow. In May 2025, he tweeted about resisting French "geopolitical" requests targeting conservatives, positioning Telegram as neutral (moonlock.com).
- Transparency Reports:
- Quarterly reports (via t.me/transparency) detail global requests: Q1 2025 showed 0 disclosures to Russia (vs. 9,941 users to India and 3,777 to Germany) (moonlock.com). After the post-2024 policy update, disclosures surged (e.g., 900+ to the US in late 2024), but Russia remains at zero, framed as evidence of selective compliance (only "valid" requests for TOS violations like terrorism or cybercrime).
- Reports claimed "0 bytes" shared pre-2024; disclosures now cover metadata only (no content). However, the raw GitHub data lacks full verifiability, drawing criticism for opacity (moonlock.com).
- Technical and Historical Assurances:
- Encryption Claims: Secret chats are end-to-end encrypted (E2EE); cloud chats use server-client encryption. Telegram states governments can't access content without device compromise (tandfonline.com). No backdoors admitted, unlike WhatsApp.
- Independence Proof: Servers are distributed globally (not Russia-based); Durov's exile and French citizenship (despite his 2024 arrest) show distance from the FSB. In 2022, Telegram resisted Indian requests beyond the legal minimum, and in 2025 it ignored Russian localization demands (kremlingram.org).
- Policy Limits: Data is shared only for confirmed criminals violating the TOS; private chats are "not processed" for requests (bitdefender.com). Durov argues this balances privacy with anti-abuse (e.g., moderating CSAM).
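To make the per-country figures above concrete, the counts quoted in this post can be tabulated. The numbers below are reproduced from the text for illustration only, not pulled from the raw transparency data:

```python
# Q1 2025 user-disclosure counts as quoted in this post (illustrative only).
disclosures = {"India": 9941, "Germany": 3777, "Russia": 0}

total = sum(disclosures.values())
zero_countries = [c for c, n in disclosures.items() if n == 0]

print(f"total disclosed users: {total}")    # 13718
print(f"no disclosures: {zero_countries}")  # ['Russia']
```

The point the post makes is visible in the tally: substantial disclosures elsewhere, zero to Russia — though without request logs, "zero" alone cannot distinguish rejected requests from requests never made.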
Limitations of Telegram's Evidence
- No Independent Audits: Unlike Signal, no third-party code reviews or cryptographic proofs of non-sharing.
- Metadata Risks Unaddressed: Reports cover IP/phone but not how infrastructure ties expose auth_key_ids or traffic patterns.
- Selective Transparency: Zero for Russia could mean non-requests or rejections—but without request logs, it's unverifiable.
- Post-Arrest Shifts: The 2024 policy change (after Durov's French detention) broadened sharing, eroding "never complied" claims and inviting speculation about future Russian leverage (darkreading.com).
In summary, Telegram's evidence reduces some doubt for direct sharing but does little for indirect risks. For high-stakes users (e.g., activists), alternatives like Signal (full E2EE by default) are safer. If new reports emerge (e.g., via Durov's ongoing French case), this assessment could shift.
What Makes Signal Better Than Telegram?
I prompted it further to ask who funds Signal, how its security is better, who audits the software, how we know the right software is being run, and so on.
Signal is widely regarded as superior to Telegram for users prioritizing privacy and security, particularly in high-risk scenarios like activism, journalism, or evading surveillance. While Telegram excels in features (e.g., large channels, bots, and multi-device sync without extra setup), it trades some security for convenience: default chats are not end-to-end encrypted (E2EE), and it collects more metadata. Signal's design philosophy is "privacy by default," making it the gold standard recommended by experts like the Electronic Frontier Foundation (EFF) and even integrated into apps like WhatsApp.

Here's a quick comparison of key aspects:
| Aspect | Signal | Telegram |
|---|---|---|
Encryption | E2EE by default for all messages, calls, and groups using the open-source Signal Protocol (audited and adopted by others like WhatsApp). Forward secrecy ensures past messages stay safe if keys are compromised. | E2EE only in optional "Secret Chats" (MTProto protocol, not fully open-source or audited). Default cloud chats use server-client encryption, allowing Telegram access. |
Data Collection | Minimal: Only your phone number (now hideable via usernames) and last connection timestamp. No IP logging, contact lists, or interaction history. | Collects IP addresses, device info, username changes, and metadata for "spam prevention." Stores messages on servers for easy sync. |
Open-Source | Fully open-source clients and most servers (AGPL-3.0 license). Reproducible builds let you verify the app matches the code. | Clients open-source, but servers and MTProto protocol are closed-source, limiting audits. |
Metadata Protection | Strong: No linking of who talks to whom; sealed sender hides identities. | Weaker: Can track interactions via metadata; public channels expose more. |
Additional Security Features | Disappearing messages, view-once media, screen security (blocks screenshots), relay calls to hide IP. | Self-destructing media (in Secret Chats), two-step verification, but requires manual activation for most protections. |
Transparency & Audits | Regular reports; code reviewed by cryptographers (e.g., Matthew Green). E2EE limits what even a compromised server could expose. | Some audits, but closed elements raise doubts; criticized for non-standard crypto. |
Trade-offs | Fewer features (no massive groups or channels); requires a phone number for signup (though usernames mitigate this). | More fun (stickers, polls, groups of up to 200,000 members), but less secure out of the box. |
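The forward-secrecy property in the table can be illustrated with a toy symmetric hash ratchet. This is a deliberate simplification for intuition only: the real Signal Protocol uses HKDF-based key derivation plus a Diffie-Hellman ratchet, and every name below is made up for illustration.

```python
import hashlib

def ratchet(chain_key: bytes) -> tuple[bytes, bytes]:
    """Derive a one-time message key, then advance the chain key.
    SHA-256 is one-way, so an attacker who steals the *current*
    chain key cannot run the ratchet backwards to earlier keys."""
    message_key = hashlib.sha256(chain_key + b"\x01").digest()
    next_chain = hashlib.sha256(chain_key + b"\x02").digest()
    return message_key, next_chain

chain = b"shared-secret-from-key-exchange"  # placeholder initial secret
keys = []
for _ in range(3):
    mk, chain = ratchet(chain)
    keys.append(mk)

# Three messages, three distinct keys; old keys are discarded after use.
assert len(set(keys)) == 3
```

Because each step discards the previous chain key, compromising a device today reveals nothing about messages encrypted under earlier keys — which is exactly the guarantee Telegram's default cloud chats cannot offer, since the server retains message content.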
In short, if security is your top concern, Signal is "better" because it minimizes trust in the company: servers can't read messages, and there's less data to leak or hand over. Telegram is fine for casual use but needs tweaks (e.g., always using Secret Chats) to approach Signal's level.

How Does Signal Fund Itself?

Signal operates as a non-profit through the Signal Foundation (a 501(c)(3) organization founded in 2018 by Moxie Marlinspike and Brian Acton). It doesn't sell ads, user data, or premium features; its entire model is donation-driven to avoid incentives for surveillance or profit-chasing. Operating costs have grown to roughly $50 million annually (as of 2023, covering servers for 70M+ monthly users), funded by:
- Donations: Primarily small voluntary contributions from users (e.g., $3–$5 via in-app prompts). These now cover ~25% of costs, up from 18% in 2022. Larger one-time gifts from privacy advocates also help.
- Grants and Initial Funding: Early support from the Open Technology Fund (~$3M, U.S. government-backed for anti-censorship tech) and foundations like Knight and Shuttleworth. In 2018, Acton provided a $50M zero-interest loan (now ~$105M, repayable in 2068), which jumpstarted growth.
- No Ads or Tracking: Unlike for-profits, Signal reinvests everything into development. It's exploring optional paid backups (e.g., full media storage) but keeps core features free.
This model ensures independence: Signal can't be pressured by investors or sold off the way WhatsApp was to Meta.

Who Runs Signal's Servers, and How Do We Know the Software Is Untampered?

Signal's servers are operated by the Signal Foundation itself (via its subsidiary, Signal Messenger LLC), hosted on cloud infrastructure (likely AWS or similar, though not publicly specified for security reasons). A small team of roughly 30–40 engineers maintains them, with no third-party outsourcing that could introduce risks. The servers relay encrypted messages only; they can't decrypt or tamper with content due to E2EE, so even a compromised server wouldn't expose your chats.

As for verifying that the open-source software is actually running on those servers (and hasn't been tampered with): this is a valid "trust but verify" concern for any open-source service. Signal addresses it transparently, but it's not foolproof; ultimate trust relies on the foundation's integrity, backed by cryptographic design. Here's how we know:
- Fully Open-Source Code (Mostly): All client apps (Android, iOS, desktop) and server code are public on GitHub under AGPL-3.0. Anyone can download, build, and audit it for backdoors. The sole exception is the anti-spam module (closed-source to prevent abusers from reverse-engineering it), but this doesn't touch encryption or message routing.
- Reproducible Builds for Clients: You can compile the exact app from source and generate a binary matching the official download (via SHA-256 hashes). Instructions are on GitHub—security researchers do this routinely to confirm no hidden changes.
- Server Verification Challenges and Mitigations:
- No Direct Remote Proof: You can't "ping" a server to prove it's running exact code (a hard problem Moxie Marlinspike explored in 2017 using DRM-like tech, but it's unsolved at scale). However, Signal's E2EE means servers are "dumb relays"—tampering would break delivery or trigger client-side alerts (e.g., safety numbers mismatch for contacts).
- Community Audits and Transparency: Independent experts (e.g., via EFF or academic reviews) regularly audit the code. Signal publishes transparency reports on legal requests; its published subpoena responses show only account-creation and last-connection timestamps handed over, never content. A record of 10+ years without scandals or leaks builds credibility.
- Design Minimizes Server Trust: Even if servers ran altered code, they couldn't read messages (only metadata like timestamps, which is minimal). Clients verify each other's identities via safety numbers, ensuring end-to-end integrity.
- Real-World Proof: Events like the 2024 xz-utils backdoor (a supply-chain attack on a core open-source Linux utility) highlight the risks, but Signal's isolated, audited repo and non-profit status reduce them. Tampering would likely be detected in code reviews or by users noticing anomalies.
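The reproducible-build check described above ultimately reduces to comparing cryptographic digests of two artifacts: the binary you compiled from the public source and the one you downloaded. A minimal sketch follows; the file handling is simplified, and Signal's actual published instructions use an apkdiff-style comparison that ignores signing metadata rather than a raw whole-file hash:

```python
import hashlib
import tempfile

def sha256_file(path: str) -> str:
    """Stream a file through SHA-256 in chunks and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Two placeholder files stand in for the store-downloaded app and the
# one you compiled yourself; a reproducible build makes them byte-identical.
with tempfile.NamedTemporaryFile(delete=False) as official, \
     tempfile.NamedTemporaryFile(delete=False) as rebuilt:
    official.write(b"identical build output")
    rebuilt.write(b"identical build output")

assert sha256_file(official.name) == sha256_file(rebuilt.name)
```

If the digests match, the distributed binary contains exactly what the public source produces; if they differ, either the build is not reproducible or something was inserted along the way.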
For paranoid users: Run your own Signal server (possible via open-source code, though not officially supported for production). Overall, Signal's track record—used by journalists, activists, and even governments—makes it far more verifiable than closed-source alternatives. If doubts persist, tools like reproducible builds and protocol audits provide strong assurance.