- Tim Cook is right. Privacy is very important. The problem is the world we live in.
- Safety is critical but Apple is bringing a logical fight to an emotional battlefield.
- Is encryption in a terrorist-threatened society a good thing? Apple has more to lose.
Apple (NASDAQ:AAPL) is now in the spotlight because it is not complying with a judge's order to disable the auto-erase function and allow the FBI to attempt an unlimited number of passcodes to unlock the phone.
Encryption is needed because it protects people's privacy. Consumers store an incredible amount of data on their phones: financial, personal and health information. This data needs to be protected.
According to Tim Cook, Apple has so far been able to strike a balance between helping authorities and protecting consumer information. But the FBI is now asking Apple to essentially make a "safe system less safe."
"The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control." - Tim Cook, February 16, 2016, "A Message to Our Customers."
But we want to make clear what exactly a "back door" means in this case. The FBI is not asking Apple to "open" the iPhone for them. Apple does not have users' passcodes stored somewhere in the cloud, so Apple cannot open anyone's iPhone. What the FBI wants is for Apple to disable the self-destruct capability of the iPhone operating system, the protection that erases the device's data when someone repeatedly tries to access your information illegally.
Once the auto-erase function is disabled, the FBI could open the iPhone by "brute force," trying out thousands or millions of passcode combinations until one unlocks it.
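To see why disabling auto-erase matters so much, consider how small the passcode space actually is. The sketch below is purely illustrative (the function names and the example passcode are our own, not anything from the case): a numeric passcode has so few combinations that, once the erase-after-failed-attempts safeguard is gone, exhaustive guessing is trivial for a machine.

```python
from itertools import product

def passcode_space(digits: int) -> int:
    """Number of possible numeric passcodes of the given length."""
    return 10 ** digits

def brute_force_demo(secret: str) -> str:
    """Try every numeric combination, in order, until the secret matches.
    This is the essence of a brute-force attack once lockout/erase is off."""
    for combo in product("0123456789", repeat=len(secret)):
        guess = "".join(combo)
        if guess == secret:
            return guess
    raise ValueError("passcode not found")

# A 4-digit passcode has only 10,000 possibilities; a 6-digit one, 1,000,000.
print(passcode_space(4))          # 10000
print(passcode_space(6))          # 1000000
print(brute_force_demo("7294"))   # hypothetical passcode, found by exhaustion
```

This is exactly why the auto-erase feature exists: it caps an attacker at a handful of guesses out of a keyspace that a computer could otherwise sweep in seconds.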
Apple Has A Logical Point
This is not just a case about one iPhone. The FBI is not just asking Apple to open one iPhone, it is asking the company to circumvent several security features and create a new version of the iPhone operating system - a backdoor.
This backdoor has broader implications. The problem is that it introduces a weakness into the system itself. Creating a master key, or a version of the iPhone operating system that allows the use of a master key, means that government agencies, terrorists and cyber criminals could all potentially unlock any iPhone in the world.
Looking forward, we believe the backdoor technique could create a new black market product. Simply put, once created, the technique becomes a reality and a valuable black market commodity. The best hackers will race to develop something similar and sell it to the highest bidder. The problem is that most people interested in your personal data are not pursuing it for the right reasons; the bidders might be terrorists. In other words, you sacrifice the security of many innocent people to catch one terrorist.
But This Can Ruin Apple's Brand Name Down The Line
- Logic vs. Emotion.
Do not be misled: issues of this nature will exist as long as terrorism remains a threat. Maybe this time Apple's brand can emerge unscathed, but what happens when a similar situation occurs in the future? This sends a message to different people, terrorists included, that the iPhone is secure. If any large-scale terrorist activity happens, or cannot be prevented, because of Apple's encryption, Apple risks being associated with aiding terrorism.
Imagine the reaction if this case had transpired soon after 9/11. Or suppose a similar issue were raised in France right now. France recently experienced the worst carnage in recent memory when at least 128 people were killed in the Paris and Saint-Denis shootings and bombings.
But let's look at it another way: what if something similar happened in China and the Chinese government wanted the information?
Would Apple risk its fastest-growing market over encryption again? This could happen in a country that is less democratic. The legal repercussions and brand damage that could be incurred might not be worth it.
Apple launched its smartphone-based payment system in China, where the electronic payments market is dominated by AliPay. China is well known for being invasive in terms of individual privacy. At some point, the Chinese government will want to monitor its people's activities, and Apple might need to cooperate.
- Risks: Encryption vs. Brand Name.
Apple's core competence is not encryption. Defending encryption is admirable, but it exposes the brand name and the company's mission to public scrutiny. The narrative here will always be framed the same way: Apple refusing to help the FBI in a terrorist investigation.
If such issues persist, imagine the morale among employees seeing their great innovation used for the wrong reasons while they can do nothing about it.
Tim Cook's original strategy would work here. His original position on product development was to be market oriented rather than product oriented. We covered this philosophy in depth eight months ago in the article, "Overlooked Upside Potential In Apple's Acquisition Of Beats Electronics." Soon after our coverage, it started to manifest: for example, Apple changed its phone size for the first time to prevent losing customers to Samsung. The company needs to revisit that philosophy, because many people might not value privacy over safety very much.
- Lessons from PRISM: The Misconception That Privacy Triumphs Over Safety.
In early June of 2013, The Guardian reported that, "The National Security Agency is currently collecting the telephone records of millions of US customers of Verizon, one of America's largest telecoms providers, under a top secret court order issued in April."
But this was just the first of many documents leaked by Edward Snowden, a former contractor for the U.S. government. The Washington Post and The Guardian later reported that "the NSA tapped directly into the servers of nine internet firms, including Facebook, Google, Microsoft and Yahoo, to track online communication in a surveillance programme known as Prism."
Everything is being digitized, from medical records to financial data. People give up information more often than they should, from their social networking profiles to their professional profiles.
Edward Snowden proved it: if the FBI really wanted the personal data of Apple users, it could simply ask the NSA.
Conclusion - The Future of Encryption
We are in unfortunate times where innocent people are being relentlessly butchered and consistently threatened by terrorists. People have acknowledged and internalized this rising threat. They have felt its consequences through 9/11 and mourned alongside the families of victims of the worst Paris carnage in recent memory on the evening of November 13th, 2015. If encryption continues to stand in the way of matters of national security, governments might end up banning encrypted phones.
This is why people choose safety over privacy. We are in a world where privacy is an idea. People will not stop buying iPhones because Apple created a backdoor. As a matter of fact, because this is an emotionally driven situation, people would understand if Apple helped. How Apple helps will matter: if the company trusts itself more than it trusts the FBI, it should find a way to do the unlocking in-house without giving the FBI any access.
Snowden made it clear: Big Brother is watching! People didn't flinch. If you understood the extent to which the NSA violates individual privacy, it would send shivers down your spine. Compared to the NSA, this is a walk in the park.
Resisting will only hurt the company's brand name. As long as terrorism remains a huge global threat, the encryption business will always face the wrath of impatient government agencies and grieving citizens. And in the end, individual safety will triumph.