That, however, is hardly perfect protection.
In October, Apple filed a response to a New York court's order that asked about the feasibility of gaining access to private data on an encrypted iPhone. It repeated numerous previous statements from Apple executives:
"For devices running iOS 8 or higher, Apple would not have the technical ability to do what the government requests take possession of a password protected device from the government and extract unencrypted user data from that device for the government."
The Federal Bureau of Investigation, clearly, wasn't satisfied with this answer. In the Farook case, presumably after trying all the other ways of getting at an iPhone user's data, it actually provided the judge with a technical description of what it wanted Apple to do. Here it is, cited in Judge Sheri Pym's order:
"Apple's reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File ("SIF") that can be loaded onto the subject device.
"The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device's flash memory.
"The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE. The SIF will be loaded via Device Firmware Upgrade ("DFU") mode, recovery mode, or other applicable mode available to the FBI."
As Apple explained in less technical language in its message to customers, this amounts to designing a special version of the iOS operating system that could be loaded onto Farook's iPhone to give the FBI access to the data stored on it.
The FBI and the court are not actually asking the company to decrypt the phone: They just want the custom iOS version to disable the feature that erases the data on the phone after 10 unsuccessful password attempts.
Disabling it would allow officials to just break the password by "brute force," bombarding the phone with tens of millions of possible character combinations.
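To make that arithmetic concrete, here is a minimal Python sketch of the kind of exhaustive search the FBI has in mind once the attempt limit is gone. It assumes a four-digit numeric passcode for illustration; `SECRET_PASSCODE` and `try_passcode` are hypothetical placeholders standing in for electronic passcode submission, not anything drawn from Apple's software or the court filings.

```python
from itertools import product

# Minimal sketch, assuming a 4-digit numeric passcode. SECRET_PASSCODE and
# try_passcode() are hypothetical stand-ins for submitting a passcode to the
# device electronically; nothing here is Apple's or the FBI's actual code.
SECRET_PASSCODE = "4921"

def try_passcode(guess):
    """Pretend to submit a passcode to the phone and report success."""
    return guess == SECRET_PASSCODE

def brute_force(length=4):
    # With the 10-attempt erase limit and the delays between attempts removed,
    # a 4-digit PIN has only 10,000 possibilities; longer alphanumeric
    # passwords push the count into the millions, but the space stays finite.
    for combo in product("0123456789", repeat=length):
        guess = "".join(combo)
        if try_passcode(guess):
            return guess
    return None

print(brute_force())  # -> "4921", after at most 10,000 attempts
```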
Apple can no longer say such a thing is technically impossible, because it isn't. Instead, in its message to customers, the company points to the absence of any guarantee that an iOS version allowing unlimited electronic input of passwords would be used only once.
The iPhone maker also accuses the government of asking it to hack its own customers, though technically, the FBI intends to do the hacking itself; it just needs an opening to do it.
Those who think encryption protects their personal data from the government (or, for that matter, from anyone determined enough to invest the effort in a brute-force attack) are naive. Any encryption can be broken.
Customer protection is entirely in the hands of the software companies that make household-name products, and they will pursue it only as long as that's in their business interests.
For now, Apple is in its customers' corner. But I'm not sure it will stay there forever: Creating an iOS version with a backdoor is not its last line of defense.
If Farook had used a device with the Google-designed Android operating system, the FBI might not even be asking for court orders. Although user content is encrypted on Android devices, too, Android is open-source software.
Theoretically, the government could produce its own version of the system that would make it possible to hack the encryption. By choosing a product from a company that fiercely guards its proprietary code, Apple customers have made their data somewhat safer but still not completely safe.
If Apple argues that compliance with the court order is "unreasonably burdensome," it soon may be asked, and ordered, to produce the iOS source code so that the government may attempt to modify it independently.
Courts have ordered source code handovers in the past. One series of such cases involved breathalyzer test software that defense lawyers in drunk driving cases wished to inspect for errors.
In 2009, Facebook was ordered to release its entire source code to Leader Technologies, which had sued it for patent infringement (Facebook fought this and succeeded in having the Leader patent invalidated three years later).
China is already asking U.S. companies to show their source code to government experts to prove it doesn't present security threats. IBM complied with such a request last year.
Apple's stand is now an election issue. "Who do they think they are?" asked Donald Trump. "They have to open it up."
According to Trump, this is only common sense; Apple executives may end up feeling that way, too, if the alternative is handing over the company's fiercely protected code to the FBI. What if it leaks out?
For now, Apple's righteous fight for its customers is generating enough favorable publicity to eliminate the sour taste of the Error 53 "right to repair" scandal.
At some point, however, the company might have to decide whether further resistance poses a danger for Apple's closed ecosystem. Then, privacy-minded iPhone owners may find themselves on their own.
Leonid Bershidsky, a Bloomberg View contributor, is a Berlin-based writer.