Throughout all the years I have been working in information technology, security has been an area that engineers have striven to improve. As a result, we have made our environments as secure as possible, always looking for ways to strengthen the security of our systems. Security has evolved over time. One example of this evolution is password management: IT professionals have helped drive the change from simple passwords to more secure passphrases, with two-factor authentication added as yet another layer of security.

Another area of security that has evolved substantially over time is the use of encryption. What was once something you would see only when entering financial or personal information has grown into a technology that is now encouraged and used for just about everything in the digital world. We are now at a point where almost all web pages are served securely. Take Google as an example: when you connect to google.com, your connection is automatically redirected from HTTP to HTTPS. Personal computers, smartphones and just about anything else with a processor now come with an out-of-the-box ability to encrypt all of the data on the device, and enabling that encryption is encouraged by default.
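If you want to see that HTTP-to-HTTPS redirect for yourself, here is a minimal Python sketch. It is purely my own illustration (it assumes only that you have outbound network access) and simply asks for the plain-HTTP address, then reports where the request finally lands:

    import urllib.request

    # Request the plain-HTTP address; urllib follows the Location headers
    # in the redirect chain automatically.
    request = urllib.request.Request("http://google.com", method="HEAD")
    with urllib.request.urlopen(request) as response:
        # The final URL is typically an https:// address these days,
        # although the exact redirect chain can vary by region and time.
        print("Final URL:", response.geturl())
        print("Status:", response.status)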

Throughout my twenty years in the industry, we have looked for weaknesses in our security setups and worked to plug those holes as they were discovered, but I have never been encouraged or asked to add any kind of back door through security or to purposely make systems less secure. This brings me to the point of this article.

On December 2, 2015, fourteen people were killed and twenty-two seriously injured in a terrorist attack at the Inland Regional Center in San Bernardino, California. It was a terrible act of violence committed against the citizens of this country. Because it was such a high-profile case, an extensive investigation was launched to answer the five Ws: who, what, when, where and why. That is expected after any situation like this, but here is where things get interesting. It appears that the people involved in this heinous crime were using pay-as-you-go cellphones, also known as throw-away or burner phones, which can be bought at just about any local convenience store. As far as I know, none of those burner phones were recovered, and data was deleted and wiped from multiple devices the day before the attack. However, one of the attackers left behind his work-issued iPhone, and it was recovered. Now the FBI and the US Justice Department have obtained a court order requiring Apple to “develop and create” software to bypass the iPhone’s security and give the Justice Department complete and unfettered access to the device.

Here lies the conflict being publicly debated in the media: should Apple be forced to create and develop software to “hack” its own product? That is the question, but the answer, in my opinion, is much more complex than the simple situation the media and the Justice Department would like to portray.

As I mentioned, never in my professional career have I been asked to develop or create any kind of security flaw or back door in the systems I have been tasked with building or maintaining. That said, I have taken advantage of discovered security flaws and weaknesses to jailbreak or root the Apple iPhones I have owned over the years, and there has long been an active community working to find a flaw through which a jailbreak can be developed. Other software has then been built to run on those jailbroken or rooted devices, so there certainly have been methods and ways to bypass Apple’s security, but I am fairly sure that none of Apple’s developers had anything to do with writing the software used to jailbreak those systems. On the contrary, Apple has gone out of its way to reach out to and recruit some of the jailbreak developers to help find and fix security flaws and bugs, in an attempt to make Apple products as secure as they can possibly be. The point is, there have been ways to bypass Apple’s software in the past, and none of them involved Apple doing any development to defeat its own security.

A company should not be forced to develop solutions to bypass its own security or to build back doors into its own products. There will always be unintended consequences from such actions. Any hole in the security or integrity of a system, intentional or otherwise, will be found and exploited; that is not an “if” but a “when”. In the second part of this article, I am going to lay out the thoughts I gathered while watching the back-and-forth of the congressional hearing, as the Justice Department and Apple laid the groundwork for their arguments. Join me for part two, and let’s talk about precedent.