Special Counsel Robert Mueller ran into a snag while investigating possible conspiracy between the Trump campaign and Russia during the 2016 election: encrypted messaging apps.

Some of the individuals he investigated deleted relevant communications or used encrypted messaging apps that don’t retain them, so Mueller couldn’t corroborate some witnesses’ statements, according to his report released last week.

“The Office learned that some of the individuals we interviewed or whose conduct we investigated, including some associated with the Trump Campaign, deleted relevant communications or communicated during the relevant period using applications that feature encryption or that do not provide for long-term retention of data or communications records,” Mueller wrote in the report. “In such cases, the Office was not able to corroborate witness statements through comparison to contemporaneous communications or fully question witnesses about statements that appeared inconsistent with other known facts.”

While the missing communications didn’t derail Mueller’s investigation or affect his conclusions, they do revive the old debate over individual privacy versus law enforcement and national security concerns.

When a married couple carried out a mass shooting in San Bernardino, California in 2015, the Federal Bureau of Investigation (FBI) tried to force Apple to unlock an iPhone belonging to one of the perpetrators in order to access data relevant to its investigation.

Apple refused. The FBI sued. A day before they went to court, the feds figured out how to extract data from the iPhone without Apple’s help and dropped the case.

Stuart Madnick, a professor of information technology and engineering systems at MIT’s Sloan School of Management, told InsideSources that the difficulty with these cases is that once law enforcement compromises a tech product’s security, customers are unlikely to trust the manufacturer again, because they know their privacy is at risk. It becomes a slippery slope into broader concerns about government surveillance and warrantless searches of one’s property.

“The general concern people in the technology field have is, once you provide a mechanism for law enforcement to get access, the trouble is, how do you know how limited that use will be?” Madnick said. “Most people, not everybody, but most people in general sided with Apple in resisting the government’s attempt to break into the iPhone.”

Madnick also brought up the Lavabit case. Lavabit was an encrypted email startup that offered exceptional cybersecurity for sensitive emails, but when the feds demanded founder and CEO Ladar Levison turn over the encryption keys so they could read Edward Snowden’s emails, Levison had to shut down his business.

Had he turned over the encryption keys, he would have nullified the very thing he was selling, encryption and security, and torpedoed the company, so he didn’t have much choice either way.

“He didn’t have the press coverage and the resources to handle the legal fees, so he went out of business because he couldn’t afford to fight the FBI,” Madnick said. “The only thing that made Lavabit a useful service was that it was secure. If it’s not secure, you don’t have customers.”

In other words, Apple got lucky in the San Bernardino case.

“If Apple was faced with the decision of fighting the FBI or going out of business, would they still fight the FBI? The ethical issues are interwoven with business issues,” Madnick said.

Now, end-to-end encrypted messaging apps could face similar scrutiny if lawmakers conclude they substantially interfered with the Mueller investigation: Signal, which is used by Edward Snowden; Telegram, which lets you permanently delete messages after you’ve read them; and Wickr, which doesn’t store any user data or messages.

At a recent privacy forum hosted by the American Enterprise Institute (AEI), the Justice Department’s Chief Privacy Officer Peter Winn pushed back on the idea of consumer control and argued tech companies should share user data with law enforcement when necessary for public safety. He described this as the “trust model” approach to privacy, and it’s popular among some privacy experts and tech companies.

Roslyn Layton, a visiting tech scholar with AEI, thinks the U.S. should take a trust-based approach to privacy, arguing there should be a balance between an “individual’s right to privacy and the public’s right to know,” as the Mueller investigation illustrates.

“Essentially the trust model is one in which the responsibility for privacy is shared between the user and the platform with the regulator on hand,” she told InsideSources in an email. “You can liken it to the schoolyard where kids self-police. If it gets out of hand, the teacher can intervene. However, if the teacher micromanages the yard, it’s no fun. It works best when kids self-regulate. We need command and control for very specific things, e.g. medical data breach notification.”

Madnick offered another complication in the tradeoff between privacy and security: terrorists also use end-to-end encrypted messaging apps to communicate and make plans. So should it be left to the federal government’s discretion to decide when to demand access to, or hack into, Americans’ encrypted messages?

It’s a slippery slope, with surveillance-state tactics at one end and the prevention of potential terrorist attacks at the other. Madnick teaches a cybersecurity and ethics course at MIT, and at the beginning of the first class, he polls his students on three questions.

“One is, how important to you is it that we have privacy? The second is, how important to you is it that we have good security? And then, if in order to have your privacy you have to give up a lot of security, on a five-point scale, where are you? And not surprisingly, people are pretty well distributed across the scale,” he said. “When there’s a tradeoff, everyone picks a different point where they’re comfortable.”
