Privacy

3505 readers
460 users here now

Welcome! This is a community for all those who are interested in protecting their privacy.

Rules

PS: Don't be a smartass and try to game the system, we'll know if you're breaking the rules when we see it!

  1. Be civil; no prejudice
  2. Don't promote big-tech software
  3. No apathy and defeatism for privacy (i.e. "They already have my data, why bother?")
  4. No reposting of news that was already posted
  5. No crypto, blockchain, NFTs
  6. No Xitter links (if absolutely necessary, use xcancel)

Related communities:

Some of these are only vaguely related, but great communities.

founded 9 months ago
MODERATORS

cross-posted from: https://lemm.ee/post/56591279

Swedish government wants a backdoor in Signal for police and Säpo (the Swedish Security Service, which handles counterespionage)

Let's say this becomes law and Signal withdraws from Sweden, as they clearly state they won't implement a backdoor. Would a citizen within the country still be able to use and access Signal's services? I assume Google Play Services would remove the Signal app within Sweden (not that I use it anyway).

I just want the government to go f*ck themselves, y'know?

FBI Warns iPhone, Android Users—We Want ‘Lawful Access’ To All Your Encrypted Data

By Zak Doffman, Contributor (writes about security, surveillance and privacy). Feb 24, 2025

The furor after Apple removed full iCloud security for U.K. users may feel a long way from American users this weekend. But it’s not — far from it. What has just shocked the U.K. is exactly what the FBI told me it also wants in the U.S. “Lawful access” to any encrypted user data. The bureau’s quiet warning was confirmed just a few weeks ago.

The U.K. news cannot be seen in isolation and follows years of battling between big tech and governments over warranted, legal access to encrypted messages and content to fuel investigations into serious crimes such as terrorism and child abuse.

As I reported in 2020, “it is looking ever more likely that proponents of end-to-end security, the likes of Facebook and Apple, will lose their campaign to maintain user security as a priority.” It has taken five years, but here we now are.

The last few weeks may have seemed to signal a unique fork in the road between the U.S. and its primary Five Eyes ally, the U.K. But it isn’t. In December, the FBI and CISA warned Americans to stop sending texts and use encrypted platforms instead. And now the U.K. has forced open iCloud by threatening to mandate a backdoor. But the devil’s in the detail — and we’re fast approaching a dangerous pivot.

While CISA — America’s cyber defense agency — appears to advocate for fully secure messaging platforms, such as Signal, the FBI’s view appears to be different. When December’s encryption warnings hit in the wake of Salt Typhoon, the bureau told me while it wants to see encrypted messaging, it wants that encryption to be “responsible.”

What that means in practice, the FBI said, is that while “law enforcement supports strong, responsibly managed encryption, this encryption should be designed to protect people’s privacy and also managed so U.S. tech companies can provide readable content in response to a lawful court order.” That’s what has just happened in the U.K. Apple’s iCloud remains encrypted, but Apple holds the keys and can facilitate “readable content in response to a lawful court order.”
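The distinction turns entirely on who holds the decryption key. A toy contrast between end-to-end encryption and the FBI's "responsibly managed" model — this is an illustrative sketch only, using a throwaway XOR cipher as a stand-in for real authenticated encryption:

```python
import hashlib, os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR stream cipher; a stand-in for real authenticated encryption.
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))

message = b"meet at noon"

# End-to-end: the key exists only on the users' devices.
user_key = os.urandom(32)
e2e_blob = xor_cipher(message, user_key)
# The provider stores e2e_blob but holds no key: a court order yields only ciphertext.

# "Responsibly managed": the provider retains a copy of the key.
provider_key = os.urandom(32)
managed_blob = xor_cipher(message, provider_key)
# Served with a lawful order, the provider itself can produce readable content:
recovered = xor_cipher(managed_blob, provider_key)
print(recovered)  # b'meet at noon'
```

The data is "encrypted" in both cases; only in the second can the provider comply with the order — which is exactly the iCloud arrangement the article describes.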

There are three primary providers of end-to-end encrypted messaging in the U.S. and U.K.: Apple, Google and Meta. The U.K. has just pushed Apple to compromise iMessage, and it is more than likely that “secret” discussions are also ongoing with the other two. It makes no sense to single out Apple, as that would simply push bad actors to other platforms — which will happen anyway, as is obvious to any security professional.

In doing this, the U.K. has changed the art of the possible, bringing new optionality to security agencies across the world. And it has done this against the backdrop of that U.S. push for responsible encryption and Europe’s push for “chat control.” The U.K. has suddenly given America’s security agencies a precedent to do the same.

“The FBI and our partners often can’t obtain digital evidence, which makes it even harder for us to stop the bad guys,” warned former director Christopher Wray, in comments the bureau directed me towards. “The reality is we have an entirely unfettered space that’s completely beyond fully lawful access — a place where child predators, terrorists, and spies can conceal their communications and operate with impunity — and we’ve got to find a way to deal with that problem.”

The U.K. has just found that way. It was first, but unless a public backlash sees Apple’s move reversed, it will not be the last. In December, the FBI’s “responsible encryption” caveat was lost in the noise of Salt Typhoon, but it shouldn’t be lost now. The tech world can act shocked and dispirited at the U.K. news, but it has been coming for years. While the legalities are different in the U.S., the targeted outcome would be the same.

Ironically, because the U.S. and U.K. share intelligence information, some American lawmakers have petitioned the Trump administration to threaten the U.K. with sanctions unless it backtracks on the Apple encryption mandate. But that’s a political view not a security view. It’s more likely this will go the other way now. As EFF has warned, the U.K. news is an “emergency warning for us all,” and that’s exactly right.

“The public should not have to choose between safe data and safe communities, we should be able to have both — and we can have both,” Wray said. “Collecting the stuff — the evidence — is getting harder, because so much of that evidence now lives in the digital realm. Terrorists, hackers, child predators, and more are taking advantage of end-to-end encryption to conceal their communications and illegal activities from us.”

The FBI’s formal position is that it is “a strong advocate for the wide and consistent use of responsibly managed encryption — encryption that providers can decrypt and provide to law enforcement when served with a legal order.”

The challenge is that while the bureau says it “does not want encryption to be weakened or compromised so that it can be defeated by malicious actors,” it does want “providers who manage encrypted data to be able to decrypt that data and provide it to law enforcement only in response to U.S. legal process.”

That’s exactly the argument the U.K. has just run.

Somewhat cynically, the media backlash that Apple’s move has triggered is likely to have an impact, and right now it seems more likely we will see some sort of reversal of Apple’s move, rather than more of the same. The UK government is now exposed as the only western democracy compromising the security of tens of millions of its citizens.

Per The Daily Telegraph, “the [UK] Home Office has increasingly found itself at odds with Apple, which has made privacy and security major parts of its marketing. In 2023, the company suggested that it would prefer to shut down services such as iMessage and FaceTime in Britain than weaken their protections. It later accused the Government of seeking powers to 'secretly veto’ security features.”

But now this quiet battle is front page news around the world. The UK either needs to dig in and ignore the negative response to Apple’s forced move, or enable a compromise in the background that recognizes the interests of the many.

As The Telegraph points out, the U.S. will likely be the deciding factor in what happens next. “The Trump administration is yet to comment. But [Tim] Cook, who met the president on Thursday, will be urging him to intervene,” and perhaps more interestingly, “Elon Musk, a close adviser to Trump, criticised the UK on Friday, claiming in a post on X that the same thing would have happened in America if last November’s presidential election had ended differently.”

Former UK cybersecurity chief Ciaran Martin thinks the same. “If there’s no momentum in the U.S. political elite and US society to take on big tech over encryption, which there isn’t right now, it seems highly unlikely in the current climate that they’re going to stand for another country, however friendly, doing it.”

Meanwhile the security industry continues to rally en masse against the change.

“Apple’s decision,” an ExpressVPN spokesperson told me, “is deeply concerning. By removing end-to-end encryption from iCloud, Apple is stripping away its UK customers’ privacy protections. This will have serious consequences for Brits — making their personal data more vulnerable to cyberattacks, data breaches, and identity theft.”

It seems inconceivable the UK will force all encrypted platforms to remove that security wrap, absent which the current move becomes pointless. The reality is that the end-to-end encryption ship has sailed. It has become ubiquitous. New measures need to be found that will rely on metadata — already provided — instead of content.

Given the FBI’s stated position, what the Trump administration does in response to the UK is critical. Conceivably, the U.S. could use this as an opportunity to revisit its own encryption debate. That was certainly on the cards under a Trump administration pre-Salt Typhoon. But the furor triggered by Apple now makes that unlikely. However the original secret/not secret news leaked, it has changed the dynamic completely.


cross-posted from: https://lemmy.world/post/26006683

This really hit home for me:

What now? Companies need to do a better job of only collecting the information they need to operate, and properly securing what they store. Also, the U.S. needs to pass comprehensive privacy protections. At the very least, we need to be able to sue companies when these sorts of breaches happen (and while we’re at it, it’d be nice if we got more than $5.21 checks in the mail). EFF has long advocated for a strong federal privacy law that includes a private right of action.


cross-posted from: https://fosstodon.org/users/notesnook/statuses/114059550980301173

Choose your warrior:

All of these are open source, private and encrypted. Of course, Notesnook is still the best 😉

#notetaking, #privacy, #security, #notesnook, #opensource

submitted 5 months ago* (last edited 5 months ago) by fxomt to c/privacy
A lot of people seem to be confused, so to clear things up: they haven't broken encryption. They are phishing using malicious QR codes.

Russia-backed hacking groups have developed techniques to compromise encrypted messaging services, including Signal, WhatsApp and Telegram, placing journalists, politicians and activists of interest to the Russian intelligence service at potential risk.

Google Threat Intelligence Group disclosed today that Russia-backed hackers had stepped up attacks on Signal Messenger accounts to access sensitive government and military communications relating to the war in Ukraine.

Analysts predict it is only a matter of time before Russia starts deploying hacking techniques against non-military Signal users and users of other encrypted messaging services, including WhatsApp and Telegram.
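The attack works because a QR code is just an encoded string. Signal's device-linking codes are URIs that, once scanned and confirmed, add another device to the account — so a malicious code can silently link an attacker's device. A minimal sketch of the kind of client-side triage that mitigates this (the `sgnl://linkdevice` scheme is Signal's; the field names in the example payload are illustrative):

```python
from urllib.parse import urlparse

def classify_qr_payload(payload: str) -> str:
    """Classify a scanned QR payload before acting on it.
    A device-link URI grants another device access to the account,
    so it must never be followed without explicit user confirmation."""
    uri = urlparse(payload)
    if uri.scheme == "sgnl" and uri.netloc == "linkdevice":
        return "DEVICE-LINK: require explicit user confirmation"
    if uri.scheme in ("http", "https"):
        return "web link"
    return "other"

# A phishing code disguised as, say, a group invite still parses as a link request:
print(classify_qr_payload("sgnl://linkdevice?uuid=abc&pub_key=xyz"))
print(classify_qr_payload("https://example.com/group-invite"))
```

The takeaway for users is the same as the sketch's logic: treat any "link device" prompt after scanning a QR code as hostile unless you initiated the linking yourself.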


So, I want to encrypt my files with Cryptomator before they go to my cloud-based backup service. Let's say I use Dropbox.

I know I create a Cryptomator vault and give its location as a folder in Dropbox.

I can't see into that vault until I open it in Cryptomator, right? This means I can't add anything to the vault unless it's open on my machine. While it's open, I'm assuming the data I'm adding is unencrypted until I close the vault?

Let's say I add a plain text file to an open vault.

So, at what point does Dropbox upload that file? Is it the minute it's added to the Dropbox folder? Because that would mean it's uploaded unencrypted.

Or is it not uploaded until the moment the Cryptomator vault is closed? Because that would mean I'd either have to leave the vault open the entire time I'm on my device and possibly do one (potentially) big upload at the end of the day, or keep opening and closing the vault every time I wanted to work with it (edit an existing document, add a new one, delete one, etc.).

Or have I misunderstood the process? I hope so, because it either sounds not very secure or not very usable.
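For what it's worth, Cryptomator-style vaults encrypt transparently on write: the opened vault is a virtual view, and the folder Dropbox actually syncs only ever contains ciphertext, so it doesn't matter when the upload happens. A toy model of that encrypt-on-write behaviour — the XOR keystream here is a deliberately weak stand-in for Cryptomator's real AES scheme, purely to show the data flow:

```python
import os, hashlib

class ToyVault:
    """Toy model of encrypt-on-write. Files written through the vault land
    in `storage_dir` (the folder Dropbox syncs) as ciphertext immediately;
    plaintext never touches that folder. The XOR keystream is a stand-in
    for real authenticated encryption (it even reuses the keystream across
    files, which a real vault would never do)."""

    def __init__(self, storage_dir: str, password: str):
        self.dir = storage_dir
        self.key = hashlib.sha256(password.encode()).digest()
        os.makedirs(storage_dir, exist_ok=True)

    def _keystream(self, n: int) -> bytes:
        out, counter = b"", 0
        while len(out) < n:
            out += hashlib.sha256(self.key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return out[:n]

    def write(self, name: str, plaintext: bytes) -> None:
        ct = bytes(a ^ b for a, b in zip(plaintext, self._keystream(len(plaintext))))
        with open(os.path.join(self.dir, name + ".enc"), "wb") as f:
            f.write(ct)  # Dropbox may upload this at any moment; it is already ciphertext

    def read(self, name: str) -> bytes:
        with open(os.path.join(self.dir, name + ".enc"), "rb") as f:
            ct = f.read()
        return bytes(a ^ b for a, b in zip(ct, self._keystream(len(ct))))

vault = ToyVault("dropbox_folder", "hunter2")
vault.write("notes.txt", b"secret plan")
on_disk = open("dropbox_folder/notes.txt.enc", "rb").read()
print(vault.read("notes.txt"))  # b'secret plan'
```

So the answer to the question is: neither of the two scenarios. Dropbox can upload the instant the file appears, because what appears in the synced folder is already encrypted.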

Removing Jeff Bezos From My Bed (trufflesecurity.com)
submitted 5 months ago by [email protected] to c/privacy

Alternative article: 'Silicon Valley’s Favorite Mattress, Eight Sleep, had a backdoor to enable company engineers to SSH into any bed'


Apple withdraws cloud encryption service from UK after government order

Tech group says it can no longer offer advanced protection to British users after demand for ‘back door’ to user data. https://archive.is/NI01z

Apple said current UK users of the security feature will eventually need to disable it. © Reuters

Apple is withdrawing its most secure cloud storage service from the UK after the British government ordered the iPhone maker to grant secret access to customer data.

“Apple can no longer offer Advanced Data Protection (ADP) in the United Kingdom to new users and current UK users will eventually need to disable this security feature,” the US Big Tech company said on Friday.

Last month, Apple received a “technical capability notice” under the UK Investigatory Powers Act, people familiar with the matter told the FT at the time.

The request for a so-called “backdoor” to user data would have enabled law enforcement and security services to tap iPhone back-ups and other cloud data that is otherwise inaccessible, even to Apple itself.

The law, dubbed a “Snooper’s Charter” by its critics, has extraterritorial powers, meaning UK law enforcement could access the encrypted data of Apple customers anywhere in the world, including in the US.

This is a developing story


This simple guide explains how to identify and remove common spyware apps from your Android phone.


cross-posted from: https://lemmy.ml/post/26220818

I am shocked by this — the quote below is very concerning:

"However, in 2024, the situation changed: balenaEtcher started sharing the file name of the image and the model of the USB stick with the Balena company and possibly with third parties."

Can't see myself using this software anymore...

submitted 5 months ago* (last edited 5 months ago) by shaytan to c/privacy

https://soatok.blog/2025/02/18/reviewing-the-cryptography-used-by-signal/

A very good, extensive and interesting read on cryptography, centered around Signal (my daily driver), from the same guy who has previously analyzed Telegram and Session.


cross-posted from: https://feddit.org/post/8126174

“Today the Sheriff acknowledged that dystopian program violated the Constitution and agreed never to bring it back.”

I dunno about you guys but this case was the proverbial "straw that broke the camel's back" that made me start taking privacy seriously.

tl;dr Pasco County, FL was running a "predictive policing" program where they would use "a glorified Excel spreadsheet" to predict crimes, and an algorithm would spit out "potential criminals" in the area. Most of them ended up being children. After that they would harass them and their families day and night until they either committed a crime and went to jail or moved out of the county (which was the intention all along).

God bless the IJ (Institute for Justice) for taking up this cause and shutting it down, because it is honestly terrifying. It's a rare W for privacy. However, I'm sure we haven't seen the last of "predictive policing," and we should remain vigilant.

and here's the video they made about it in 2022

submitted 5 months ago by fxomt to c/privacy

Does anyone else here use https://cryptpad.fr/ ?

I'm loving it so far — it's probably the best privacy-focused G Suite alternative I've found, and it's easy enough that even the non-technical among us can use it.


by Lars Wilderang, 2025-02-11

Translated from the Swedish original

In a new instruction for fully encrypted applications, the Swedish Armed Forces have introduced a mandatory requirement that the Signal app be used for messages and calls with counterparts both within and outside the Armed Forces, provided they also use Signal.

The instruction, FM2025-61:1, specifies that Signal should be used to defend against interception of calls and messages via the telephone network and to make phone number spoofing more difficult.

It states, among other things:

“The intelligence threat to the Armed Forces is high, and interception of phone calls and messages is a known tactic used by hostile actors. […] Use a fully encrypted application for all calls and messages to counterparts both within and outside the Armed Forces who are capable of using such an application. Designated application: The Armed Forces use Signal as the fully encrypted application.”

The choice of Signal is also justified:

“The main reason for selecting Signal is that the application has widespread use among government agencies, industry, partners, allies, and other societal actors. Contributing factors include that Signal has undergone several independent external security reviews, with significant findings addressed. The security of Signal is therefore assumed to be sufficient to complicate the interception of calls and messages.

Signal is free and open-source software, which means no investments or licensing costs for the Armed Forces.”

Signal supports both audio and video calls, group chats, direct messages, and group calls, as well as a simple, event-based social media feature.

The app is available for iPhone, iPad, and Android, as well as for desktop operating systems including macOS, Windows, and Linux.

Since Signal can be used for phone calls, the instruction is essentially an order for the Armed Forces to stop using regular telephony and instead make calls via the Signal app whenever possible (e.g., not to various companies and agencies that don’t have Signal), and no SMS or other inferior messaging services should be used.

Note that classified security-protected information should not be sent via Signal; this is about regular communication, including confidential data that is not classified as security-sensitive, as stated in the instruction. The same applies to files.

The instruction is a public document and not classified.

Signal is already used by many government agencies, including the Government Offices of Sweden and the Ministry for Foreign Affairs. However, the EU, through the so-called Chat Control (2.0), aims to ban the app, and the Swedish government is also mulling a potential ban, even though the Armed Forces now consider Signal a requirement for all phone calls and direct messaging where possible.

Furthermore, it should be noted that all individuals, including family and relationships, should already use Signal for all phone-to-phone communication to ensure privacy, security, verified, and authentic communication. For example, spoofing a phone number is trivial, particularly for foreign powers with a state-run telecom operator, which can, with just a few clicks, reroute all mobile calls to your phone through a foreign country’s network or even to a phone under the control of a foreign intelligence service. There is zero security in how a phone call is routed or identified via caller ID. For instance, if a foreign power knows the phone number of the Swedish Chief of Defence’s mobile, all calls to that number could be rerouted through a Russian telecom operator. This cannot happen via Signal, which cannot be intercepted.
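The reason Signal resists this is that it binds identity to cryptographic keys rather than to phone numbers, and lets both parties compare a "safety number" derived from their identity keys. A toy sketch of the idea — the real derivation uses thousands of iterated SHA-512 rounds over the identity keys and produces a 60-digit number, so this is an illustration of the principle, not the actual format:

```python
import hashlib

def toy_safety_number(my_identity_key: bytes, their_identity_key: bytes) -> str:
    """Toy fingerprint in the spirit of Signal's safety numbers: each side
    derives digits from one identity key, and the halves are concatenated
    in a canonical order, so both users compute the identical string."""
    def half(key: bytes) -> str:
        digest = key
        for _ in range(5200):  # iterated hashing slows brute-force fingerprint collisions
            digest = hashlib.sha512(digest + key).digest()
        return str(int.from_bytes(digest[:8], "big"))[:10]
    a, b = sorted([my_identity_key, their_identity_key])
    return half(a) + " " + half(b)

# Both parties derive the same number, regardless of argument order:
alice_view = toy_safety_number(b"alice-identity-key", b"bob-identity-key")
bob_view = toy_safety_number(b"bob-identity-key", b"alice-identity-key")
print(alice_view == bob_view)  # True
```

A rerouted phone call carries no such verifiable identity; a Signal session whose safety number has been compared out of band does, which is why caller-ID spoofing simply does not translate to the Signal setting.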

Signal is, by the way, blocked in a number of countries with questionable views on democracy, such as Qatar (Doha), which one can discover when trying to change flights there. This might serve as a wake-up call.

https://cornucopia.se/2025/02/forsvarsmakten-infor-krav-pa-signal-for-samtal-och-meddelanden/
