This is a continuation of my Security in Our Modern Times series, which can be found here and here. The story of the San Bernardino iPhone has gotten to the point where you just cannot make this stuff up. Let me give you a Reader’s Digest–type review of the story and then offer my opinion on the latest twist.

On December 2, 2015, fourteen people were killed and twenty-two seriously injured in a terrorist attack at the Inland Regional Center in San Bernardino, California. This was a terrible act of violence committed against the citizens of the United States. Because it was such a high-profile case, an extensive investigation was launched to answer the five Ws: who, what, when, where, and why. That is expected in such situations, but here is where things get interesting. The people who committed this heinous crime appear to have used the type of disposable, pay-as-you-go cell phones that can be bought at just about any convenience store. As far as I know, none of those burner phones were recovered, and data was deleted and wiped from multiple devices the day before the attack. However, one attacker’s work-issued iPhone was recovered. The FBI and the US Justice Department obtained a court order compelling Apple to “develop and create” software to bypass the iPhone’s security and give the Justice Department complete and unfettered access to the device.

In my opinion, Apple should not be forced to develop a “hack” or any method to bypass the security of its own device. It is unethical for the Justice Department to force Apple to undermine its own security. Apple has spent countless hours making its devices secure, and it is simply not right to try to force a company—any company—to undo that work. Why? It’s not so much an issue of misusing Apple’s technical expertise as it is one of setting a precedent in the courts—of allowing the Justice Department to push for a universal back door to any and all security.

By the time my second post was published, the US government had withdrawn its court order against Apple, because the Justice Department had achieved its goal of opening the device without Apple’s assistance. I am not sure whether it was the government or a third party that opened the device, but since Apple was not involved, the issue is now moot. Nevertheless, I fear it is just a matter of time before the Justice Department needs assistance again, and I truly hope it doesn’t attempt to force a company to break its own product’s security to get it. From there, it’s only a few steps to the mandated creation of back doors that would make everything less secure for everyone.

Rumor has it that it was an Israeli forensic data company that helped the Justice Department access the iPhone’s data. The Justice Department released a statement that it had been able to bypass the security of the device, had gained access to the data on it, and had withdrawn the court order. You would think Apple would be thrilled that this legal matter was over, but no, and—you cannot make this up—it is now asking the Justice Department to share the methods used to bypass its security. That brings me to the point of this article.

I believe that Apple had the moral high ground in the beginning. However, since one of its primary arguments was that it should not have to develop a back door, it has no grounds now to demand information on how the hack was achieved. Apple does not get to have its cake and eat it too. The Justice Department does not want to release its methods precisely because Apple would then patch the security flaw that gave it access, and that is exactly why Apple wants the information.

From personal experience, I can say that there has long been a large subculture of iPhone users who jailbreak, or root, their devices to give themselves super-user access, allowing them to install a large catalog of software for rooted devices that is not available through iTunes. This subculture has been around for quite a while and has been playing a cat-and-mouse game with Apple from the beginning. Jailbreak developers find a flaw they can exploit to build their jailbreak software; once it is released, Apple reverse-engineers it, identifies the flaw, and patches it in the next software update. Really, it is as though Apple has a free penetration-testing team finding its flaws for it. Unlike the jailbreak developers, however, the Justice Department won’t share its methods.

My final question is “Why should it matter?” The iPhone in question is an older device, and Apple should simply continue making the platform as secure as possible. If Apple really needs to find the flaw in older devices, it can start with the jailbreak community. Apple has no grounds to demand information on how its security was bypassed, and it should focus its attention on preparing for the inevitable Justice Department court order the next time an encrypted device is recovered. That is the fight Apple needs to get ready for if it wants to keep the illusion of security and privacy intact. Those are my final thoughts on the matter. I ask: what are yours?