Tag Archives: iPhone

Scary iPhone malware that steals your data is a reminder no platform is ever safe.


If you haven’t done so already, go and update your iPhone, iPad or iPod touch to iOS 9.3.5 right now. To update, go to Settings > General > Software Update.

It may not seem urgent because it’s only a “point release,” but the update is crucial or you risk having all of your data secretly stolen by invisible malware that can install itself on your device and even uninstall itself without leaving any traces behind.

Two reports from the New York Times and Motherboard published on Thursday detail how three major security holes, patched via the update, could be exploited by hackers to track and steal practically all of the private data on your iOS device.

According to both reports, Ahmed Mansoor, a human rights activist from the United Arab Emirates, discovered the vulnerabilities when he received a suspicious text message with a link that would have provided “new secrets about torture of Emiratis in state prisons.”

Had Mansoor clicked on the link, he would have been directed to a website that would have exploited all three security holes and installed malware onto his iPhone, giving remote hackers full access to his device.

Thankfully, Mansoor didn’t click the link. Instead, he alerted Citizen Lab, an interdisciplinary lab based at the Munk School of Global Affairs at the University of Toronto that focuses its research on the intersection of human rights and security.

Citizen Lab identified the link as belonging to NSO Group, an Israel-based “cyberwar” company reportedly owned by American venture capital firm Francisco Partners Management, which sells spyware solutions to government agencies.

Along with additional research from cybersecurity firm Lookout, it has been revealed that the three exploits (dubbed “Trident”) are “zero-days,” meaning they were unknown to Apple and unpatched at the time they were used (in this case, once the link is opened, the malware automatically installs itself and starts tracking everything).

“Once infected, Mansoor’s phone would have become a digital spy in his pocket, capable of employing his iPhone’s camera and microphone to snoop on activity in the vicinity of the device, recording his WhatsApp and Viber calls, logging messages sent in mobile chat apps, and tracking his movements,” write Bill Marczak and John Scott-Railton, two Citizen Lab senior researchers.

According to Lookout, the software is highly flexible and can be configured in a number of ways to target different countries and apps:

The spyware capabilities include accessing messages, calls, emails, logs, and more from apps including Gmail, Facebook, Skype, WhatsApp, Viber, FaceTime, Calendar, Line, Mail.Ru, WeChat, SS, Tango, and others. The kit appears to persist even when the device software is updated and can update itself to easily replace exploits if they become obsolete.

Upon discovery, the two organizations immediately notified Apple, and the iPhone maker got to work on iOS 9.3.5, which was released on Thursday.

Though Trident and the type of malware NSO sells (called “Pegasus”) are mainly used by governments to target dissidents, activists and journalists in volatile countries like the United Arab Emirates, Mexico, Kenya, Mozambique, Yemen and Turkey, they can be used to target any iOS device.

The very idea of having all your data stolen without any real effort should scare everyone into updating their iOS devices.

As we’ve entrusted our smartphones and tablets with more and more of our personal data, it’s more important than ever to always be running the latest software with the most up-to-date security patches to prevent digital spying and theft.

Quicker to protect iOS than Android

It took 10 days for Apple to release an update to close the holes after Citizen Lab and Lookout alerted the company.

Ten days may seem like a long time, but when you compare it to how long it would take for Android devices to get updated for such a critical patch, it’s like hyper speed.

One of the benefits of iOS is its tightly-integrated software and hardware. Because there are fewer devices and they all run the same core software, Apple can test and deploy security updates quickly and easily with fewer chances of something going wrong.

Android, on the other hand, is fragmented across tens of thousands of distinct devices, customized into more versions than even the most diehard Android fan could keep track of. This makes it extremely challenging for phone makers to quickly test and release updates that plug dangerous security holes.

Google’s Nexus devices are quicker to get software updates because they all run stock Android and Google can push them out in a similar way to Apple. Same goes for Samsung and its Galaxy phones.

But there’s often little incentive for Android phone makers to update their devices. Software maintenance is costly, which is why many Android devices from lesser-known brands get updates months or even years late, or never at all.

No platform is ever truly secure

The disclosure of these security flaws, and how serious the consequences could be for anyone who fell victim, invites another conversation: media portrayal.

Android bears the brunt when it comes to being portrayed as the less secure platform, but as this episode shows, no matter which platform is nominally more secure, all platforms are susceptible to hackers.

Security is an ongoing and never-ending battle between phone makers like Apple and Google and hackers. It’s a constant cat-and-mouse game where each side is always one step ahead or behind the other.

Had Mansoor not alerted Citizen Lab, the Trident exploits would have continued to exist without anyone knowing. Lookout believes the malware has existed since iOS 7. NSO Group’s Pegasus malware can also be used to target Android and BlackBerry devices.

While no platform will ever be truly secure, updating to the latest version of your phone’s software is the best way to remain safe.

 

Beware! Your iPhone Can Be Hacked Remotely With Just A Message

In Brief
Do you own an iPhone? Mac? Or any Apple device?
Just one specially-crafted message can expose your personal information, including your authentication credentials stored in your device’s memory, to a hacker.
The vulnerability is quite similar to the Stagefright vulnerabilities, discovered a year ago in Android, that allowed hackers to silently spy on almost a billion phones with just one specially-crafted text message.

Cisco Talos senior researcher Tyler Bohan, who discovered this critical Stagefright-type bug in iOS, described the flaw as “an extremely critical bug, comparable to the Android Stagefright as far as exposure goes.”

The critical bug (CVE-2016-4631) actually resides in ImageIO – the API used to handle image data – and works across all widely-used Apple operating systems, including Mac OS X, tvOS, and watchOS.

All an attacker needs to do is create an exploit for the bug and send it via a multimedia message (MMS) or iMessage inside a Tagged Image File Format (TIFF).

Once the message is received on the victim’s device, the exploit launches.

“The receiver of an MMS cannot prevent exploitation and MMS is a store and deliver mechanism, so I can send the exploit today and you will receive it whenever your phone is online,” Bohan was quoted as saying by Forbes.

The attack could also be delivered through Safari web browser. For this, the attacker needs to trick the victim into visiting a website that contains the malicious payload.

In both cases, no explicit user interaction would be required to launch the attack, since many applications (like iMessage) automatically attempt to render images when they are received in their default configurations.

It is quite difficult for the victim to detect the attack, which, if executed, could leak the victim’s authentication credentials stored in memory, such as Wi-Fi passwords, website credentials, and email logins, to the attacker.

Since iOS includes sandbox protection to prevent hackers from exploiting one part of the OS to control the whole thing, a hacker would require an additional iOS jailbreak or root exploit to take total control of the iPhone.

Mac OS X, however, lacks equivalent sandbox protection, which could allow an attacker to remotely access the Mac along with the victim’s passwords, potentially leaving users of Apple’s computers completely exposed to the attack.

Apple has patched this critical issue in iOS version 9.3.3, along with patches for 42 other vulnerabilities, including memory-corruption bugs in CoreGraphics, which renders 2D graphics across those OSes, according to Apple’s advisory.

Apple also addressed serious security vulnerabilities in FaceTime on both the iOS and OS X platforms that allowed anyone on the same Wi-Fi network as a user to eavesdrop on the audio transmission from FaceTime calls even after the user had ended the call.

“An attacker in a privileged network position [could] cause a relayed call to continue transmitting audio while appearing as if the call terminated,” reads Apple’s description.

The FaceTime vulnerability (CVE-2016-4635) was discovered and reported by Martin Vigo, a security engineer at Salesforce.

So users are advised to patch their devices, as it won’t take long for bad actors to take advantage of the vulnerabilities now that they are public.

Apple hires Encryption Expert to Beef Up Security on its Devices

 

The FBI and other law enforcement agencies have waged legal war on encryption and privacy technologies.

You may have heard many news stories about the legal battle between Apple and the FBI over unlocking an iPhone that belonged to the San Bernardino shooter. However, that was just one battle in a much larger fight.

Now, in an effort to make the iPhone surveillance- and hack-proof, Apple has rehired security expert and cryptographer Jon Callas, who co-founded the widely-used email encryption software PGP and the secure-communications firm Silent Circle, which sells the Blackphone.

This is not Apple’s first move to shore up iPhone security.

Just a few months back, the company hired Frederic Jacobs, one of the key developers of Signal, the open-source encrypted messaging application widely regarded as among the world’s most secure.

Now Apple has rehired Callas, who has previously worked for Apple twice, first from 1995 to 1997 and then from 2009 to 2011.

During his second stint, Callas designed a full-disk encryption system to protect data stored on Macintosh computers.

Apple’s decision to rehire Callas comes after rumors that the company is working on improving the security of its iOS devices to the point where even Apple itself cannot break into them.

“Callas has said he is against companies being compelled by law enforcement to break into their own encrypted products,” the report reads.

“But he has also said he supports a compromise proposal under which law enforcement officials with a court order can take advantage of undisclosed software vulnerabilities to hack into tech systems, as long as they disclose the vulnerabilities afterward so they can be patched.”

Earlier this year, Apple was engaged in a battle with the US Department of Justice (DoJ) over a court order asking the company to help the FBI unlock the iPhone 5C of San Bernardino shooter Syed Farook.

Basically, the company was being forced to create a special, backdoored version of its iOS so that the FBI would be able to brute-force the passcode on Farook’s iPhone without losing the data stored on it.

Apple refused to do so, and the company now wants to remove its own ability to break iPhone security in future iPhone models, thereby eliminating any basis for government and intelligence agencies to demand backdoors.

 

Apple v. FBI: How to Sound Smart about Encryption


Apple v. FBI has started a serious debate about the line between security and privacy. The FBI says this is a case about the contents of one specific iPhone 5c. Apple says this is a case about securing data for everyone.

No one seems to want to have a civil, Socratic discussion about what it means to evolve the governance of a digital democracy. Instead, most people want to voice their opinions about terrorism, the law, and Apple. People also want to know if this particular iPhone 5c (or any iPhone) can be hacked, and if offers to hack it from white hat hackers, such as John McAfee, are real.

The Apple v. FBI subject device, an iPhone 5c, can be hacked. This is true because of iOS 8 (the operating system running on the subject device) and the way all iPhone 5c’s were manufactured. Current vintage iPhones (5s, 6, 6s) could not be hacked the same way, so we should not be talking about this particular phone; we should be talking about encryption writ large, and how it is used in our daily lives.

What Is Encryption?

Encryption is the process of using algorithms to encode information with the specific goal of preventing unauthorized parties from accessing it. For digital communication, there are two popular methods of encryption: symmetric key and public key.

  • Symmetric key encryption requires both the sending and receiving parties to have the same key – hence the term “symmetric.”
  • Public key encryption is far more popular because the encryption key is publicly available, but only the receiving party has access to the decryption key.
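To make the distinction concrete, here is a minimal Python sketch of both approaches, using the third-party “cryptography” package (an assumed dependency; any comparable library would do). It is purely illustrative, not the scheme any particular product uses:

# pip install cryptography  (assumed third-party dependency)
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Symmetric: one shared key both encrypts and decrypts.
shared_key = Fernet.generate_key()        # both parties must hold this same key
box = Fernet(shared_key)
token = box.encrypt(b"meet at noon")
assert box.decrypt(token) == b"meet at noon"

# Public key: anyone can encrypt with the public key,
# but only the holder of the private key can decrypt.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = public_key.encrypt(b"meet at noon", oaep)
assert private_key.decrypt(ciphertext, oaep) == b"meet at noon"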

How Can There Be Such a Thing as a “Public” Encryption Key?

One of the most popular ways to create public encryption keys is to use a mathematical problem known as prime factorization (aka integer factorization). You start with two relatively large prime numbers. (Quick 6th Grade Math Refresher: A prime number is only divisible by 1 and itself.) Let’s call them P and Q. When you multiply them, the product is a composite number we’ll call “C.”

(P x Q = C)

C is a very special number with very special properties. It’s called a semiprime number. Semiprime numbers are only divisible by 1, themselves and the two prime factors that made them. This special property enables the number to be used for public key encryption.

You use C for the public key and you keep P and Q as the private key pair. While it is very easy to generate C, if the number is large enough and thoughtfully generated, it can take thousands, millions or even billions or trillions of tries to factor. (There are mathematical strategies to speed up the process, but in practice, prime factoring must be done by trial and error.)
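A toy example in Python makes the asymmetry concrete. The numbers below are far too small to be secure, but multiplying P and Q takes one step, while recovering them from C takes a search:

# Toy illustration of prime factorization; real keys use primes
# hundreds of digits long, not two-digit ones.
P, Q = 61, 53            # the private prime pair
C = P * Q                # 3233, the public semiprime

def factor_by_trial(c):
    # Recover the prime pair by brute force; trivial here,
    # but computationally infeasible when c has 600+ digits.
    n = 2
    while n * n <= c:
        if c % n == 0:
            return n, c // n
        n += 1
    return None

print(C)                   # 3233 is safe to publish
print(factor_by_trial(C))  # (53, 61), found only because C is tiny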

Pretty Good Privacy, the Encryption We Mostly Use

The OpenPGP standard, aka Pretty Good Privacy or PGP, is one of the most popular implementations of public key encryption. There is a very good chance that your corporate IT department uses some version of PGP to encrypt your files – after all, it’s pretty good.

How good? Using current computer technology, a 2048-bit OpenPGP-encrypted file cannot be decrypted without the key. Someday it might be possible with a fully functional quantum computer, but these are still, for all practical purposes, theoretical devices.

Now, you’re going to push back with an argument that goes something like this: “Hey Michael, you may think that a file encoded with 2048-bit OpenPGP encryption is unbreakable, but you don’t know that for sure. You have no idea what the NSA can or cannot do! How do you know that quantum computers don’t exist? Nothing is impossible!”

Yeah … no. 2048-bit OpenPGP encryption can’t be decrypted without a key because of the way computers work today. In the future, with new hardware and processor and bus speeds that are currently undreamt of, the computation may become doable in a reasonable time – but not today. Without your private key, breaking a 2048-bit key like the one in a secure SSL certificate is estimated to take over 6.4 quadrillion years of computation.

How Can the “Now Famous” iPhone 5c Be Hacked?

For the iPhone 5c in question, you don’t need to hack the encryption key; you need to “make” the encryption key. It is generated from a combination of the user-created PIN or password and a unique key that Apple embeds in each iPhone 5c when it is manufactured. The FBI is asking Apple to create a new operating system with the ability to disable certain security protocols – specifically to defeat the limit on failed passcode attempts and to remove the delay caused by failed attempts. With this new weaker security protocol and forensic software written to try every possible PIN or password combination, the FBI hopes to regenerate the unique key required to open the phone.
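To see why removing those two protections matters, here is a rough Python sketch of the brute-force step. The key-derivation function, device key and sample PIN are placeholders of my own (Apple’s real derivation is hardware-entangled and more involved); the point is only that a 4-digit PIN space is tiny once the attempt limit and delays are gone:

import hashlib, itertools, os

# Hypothetical stand-ins; Apple's real derivation is not PBKDF2.
DEVICE_KEY = os.urandom(32)   # stands in for the unique per-device key set at manufacture

def derive_unlock_key(pin: str) -> bytes:
    # Mix the user PIN with the device-unique key (illustrative KDF only).
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_KEY, 10_000)

# Pretend we captured the key derived from the real (unknown) PIN.
target = derive_unlock_key("4951")

# With no attempt limit and no inter-attempt delay, all 10,000 four-digit
# PINs can be tried in well under a minute on commodity hardware.
for guess in ("".join(digits) for digits in itertools.product("0123456789", repeat=4)):
    if derive_unlock_key(guess) == target:
        print("PIN recovered:", guess)
        break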

It is important to note that this whole approach is only possible on the iPhone 5c and older models running iOS 8 or earlier. iPhones with fingerprint scanners, such as the 5s, 6 and 6s, use a second processor called the “secure enclave.” Even Apple can’t hack an iPhone that includes a secure enclave processor – not without creating a “backdoor.”

This is what Apple is worried about. You should be too. If the government served Apple with a lawful writ or subpoena to deliver the key to an iPhone 6s, it would not be able to comply. This case asks the question: should the government be allowed to compel any company that creates a digital security product to create a “backdoor” and make it available for any reason (lawful or otherwise)?

The important thing about an iOS 9 “backdoor” in Apple’s case is that it could not be guessed or randomly generated; it would have to be an actual file – a metaphorical “skeleton key.” There’s a problem with skeleton keys, even digital ones: they can be copied. Importantly, they can be copied or stolen without the owner’s knowledge. Creating a “skeleton key” defeats the purpose of encrypting anything in the first place. If such a key exists, it will be copied by both good and bad actors – that’s just a fact of digital life.

So again, I find myself begging you to engage in a civil, Socratic discussion about what kind of future we want to live in. Encryption enables banking (commercial and consumer) and commerce. Without it, our digital lives would be very, very different. How do you want to evolve the governance of our digital democracy? Where is the line between security and privacy? What do we want to ask our lawmakers to do? Hopefully this short story will inspire you to learn more about encryption so you can draw your own conclusions and join this techno-political debate.

John McAfee Reveals To FBI, On National TV, How To Crack The iPhone (RT Interview)

Yes, it has gotten this bad. In language simple enough for even a child to understand, John McAfee explains for the world and for the FBI how to hack…

Not as easy as John says, but it can be done !!!

Actually, about that encryption: what’s the key? The salt of the key depends on the unique device ID. Another part of the key must depend either on the fingerprint ID (which is easy enough; you don’t need the person alive to get their fingerprints, and people leave fingerprints everywhere) or on a 4-digit PIN. Once you have code injection, can hack out the try counter, and have a more direct path for injecting PIN guesses into the key-generation algorithm, you can brute-force them in a matter of minutes.

Former NSA Chief Michael Hayden Sides With Apple, Though Admits ‘No Encryption Is Unbreakable’

An attendee demonstrates the new Apple Inc. iPhone 6 Plus after a product announcement at Flint Center in Cupertino, California, U.S., on Tuesday, Sept. 9, 2014. Apple Inc. unveiled redesigned iPhones with bigger screens, overhauling its top-selling product in an event that gives the clearest sign yet of the company’s product direction under Chief Executive Officer Tim Cook.
David Paul Morris/Bloomberg via Getty Images

Tim Cook’s opinion that Apple should not develop a way to hack into the encrypted phone belonging to one of the San Bernardino shooters has earned an endorsement from an unlikely source, though it comes with a big “but.” Michael Hayden, the former NSA director and CIA chief (so, a bona fide spy guy), told the Wall Street Journal that America is “more secure with unbreakable end-to-end encryption,” calling it a “slam dunk” if you view it in the scope of the “broad health” of the United States.

Hayden said FBI director James Comey‘s demand for Apple to give them a tool to break into Syed Farook’s iPhone is “based on the belief that he remains the main body, and that you should accommodate your movements to the movements of him, which is the main body. I’m telling you, with regards to the cyber domain, he’s not — you are.”

Now for that “but,” which will surely disappoint all the (temporarily pleased) civil libertarians out there. Hayden said that following a setback in the mid-nineties, when the NSA failed to convince manufacturers to adopt a cryptographic device called the Clipper chip, “we then began the greatest 15 years in electronic surveillance.” The controversial chipset was an encryption device that had a built-in backdoor in case the government needed to take a lookie-loo. But, as Hayden notes, “we figured out ways to get around the quote-unquote unbreakable encryption. Number one, no encryption is unbreakable. It just takes more computing power. Number two, the way we worked around encryption is bulk collection and metadata.”


Since 2014, Apple’s iPhones have had built-in encryption that makes it so the contents of a device can only be accessed via a phone’s passcode. The FBI’s order stipulates that Apple provide software to work only on the San Bernardino shooter’s iPhone. Cook said in an open letter that the U.S. government order would undermine encryption and potentially create a “master key, capable of opening hundreds of millions of locks” on private devices.

Cook wrote that “in the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession… The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a back door. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”

On Wednesday, Cook’s position received support from a high-profile colleague in tech.

“Forcing companies to enable hacking could compromise users’ privacy,” wrote Google CEO Sundar Pichai in a series of Twitter posts. “We know that law enforcement and intelligence agencies face significant challenges in protecting the public against crime and terrorism. We build secure products to keep your information safe and we give law enforcement access to data based on valid legal orders. But that’s wholly different than requiring companies to enable hacking of customer devices & data. Could be a troubling precedent. Looking forward to a thoughtful and open discussion on this important issue.”

 

Apple iOS Forensic Primer

The operating system that Apple licenses to its users is iOS. It is resident on and runs their mobile devices (the iPod touch, iPhone and iPad). Legally, Apple specifically states that it retains ownership of iOS. There is a legal precedent being argued (by the US DOJ) that would hold Apple to its continued ownership interest in iOS. This means the company could potentially be compelled to assist law enforcement in exploiting the software on a target phone (which runs iOS) in the execution of a search warrant.

While authorities wait for a decision on this particular legal argument, iOS forensics is necessary whenever the Apple device in question has been used in, or is found to be evidence of, a crime. While the DOJ argues the precedent that “a company’s continued ownership interest in a product after it is sold obliges the company to act as an agent of the state,” an administrator needs to be able to pull data off that device immediately during the course of an investigation. Even if an administrator is just trying to see whether a user is violating (or has violated) company policy, there is a need to be able to access the data on the device.

There is a lot of data stored on iPhones; some people have more data on their iPhone than on their computers. If you browse the phone’s storage (typically with a phone-disk tool) you will not be able to see the full file system, but if you could see it, it bears a strong resemblance to Mac OS X. Mac OS X is built on a core called “Darwin,” and the iPhone has much the same directory structure as the Mac operating system.

For example, the maximum number of allocation blocks per volume that the File Manager can access on a Mac OS system is 65,535. iOS is basically a Mac OS X system that has been tuned and tailored to operate on smaller mobile devices with different processors in them.

As we examine the directories and analyze their subdirectories, we see what is available as we dig down inside the device. The “DCIM” directory holds the “100APPLE” directory, which shows the administrator where all of the pictures are. We also have a Downloads directory (which holds all downloads), an iTunes directory (which holds music files), etc. The significance is that all of these directories give you the ability to see user data on a particular system.
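As a rough illustration, a short Python script like the one below could tally what lives in those directories once the phone’s media has been mounted or copied to a local folder; the mount path here is a placeholder of my own, not a standard location:

import os
from collections import Counter

# Placeholder path: wherever the phone's media has been mounted or copied.
ROOT = "/mnt/iphone_media"

counts = Counter()
for dirpath, dirnames, filenames in os.walk(ROOT):
    top = os.path.relpath(dirpath, ROOT).split(os.sep)[0]   # e.g. "DCIM", "Downloads"
    for name in filenames:
        ext = os.path.splitext(name)[1].lower() or "(none)"
        counts[(top, ext)] += 1

# Prints lines such as: DCIM .jpg 412 / Downloads .pdf 7 ...
for (folder, ext), n in counts.most_common():
    print(folder, ext, n)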

Another place you can go looking for system information is a terminal window. The terminal window gives an administrator the ability to use the command line interface to examine the device and its data. Complete access can be obtained when the “sudo” super-user command is invoked. You will type:

$ sudo su - clyde (use super-user privileges to switch to “clyde,” the account the device syncs with)

$ cd (change to that user’s home directory)

$ pwd (print the working directory)

/Users/clyde (this is our current directory)

The terminal window gives us the ability to examine the data from the device with super-user privileges, which gives us complete access to the system. When we look at the data this way, we know we will have the ability to access all additional files on the system. Instead of looking at the phone itself with different tools, you can analyze the system through a terminal command line.

$ cd Library/ (change directory to the Library)
$ cd "Application Support/" (change directory to the Application Support directory; note the space in the name)

$ ls (list the contents of the directory, looking for the MobileSync directory)

An administrator can examine and analyze the device’s MobileSync data in relation to the computer the device has been syncing with.

$ cd MobileSync (change directory to the MobileSync directory)

$ ls (list the contents of the directory)

Backup (this is the contents of the directory)

$ cd Backup (change directory to the Backup directory)

$ ls (this will list all of the backups in the Backup directory)

This is significant because, in addition to examining the device data, I can pull up all of the backups and select one of them. There is a lot of data stored in the backups; these files are the backup information that has been stored on the computer’s hard drive. When the connected device (whether an iPad or an iPhone) has its data copied onto the computer, an administrator can analyze the data in the backup in addition to looking at the directory on the phone itself with a utility like “Phone Disk.” If you don’t have the phone but you have the computer, you may have almost as good a set of information as if you did have the phone, because the backup has to store everything needed to restore the phone.

If you have a user’s computer and you find the iPhone backups, you have the information that was stored on the phone. There are utilities that can analyze these iPhone backups and extract information from them, giving an administrator the ability to examine all of the data captured in the scheduled backups.
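As a sketch of what such a utility does at its simplest, the Python snippet below reads the Info.plist inside each backup folder. The path and the plist key names are the ones typically seen with iTunes/MobileSync backups on a Mac, but treat them as assumptions rather than a documented interface:

import os, plistlib

# Typical (assumed) location of MobileSync backups on a Mac.
BACKUP_ROOT = os.path.expanduser("~/Library/Application Support/MobileSync/Backup")

for entry in sorted(os.listdir(BACKUP_ROOT)):
    info_path = os.path.join(BACKUP_ROOT, entry, "Info.plist")
    if not os.path.isfile(info_path):
        continue
    with open(info_path, "rb") as f:
        info = plistlib.load(f)
    # Key names as commonly found in backup Info.plist files.
    print(entry,
          info.get("Device Name"),
          info.get("Product Type"),
          info.get("Last Backup Date"))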

When you are performing iOS forensics, it is not only a question of looking at the phone’s data, because sometimes an administrator won’t be able to obtain access to that data at all if the phone has a passcode. However, if you have access to the backup directory on the computer the phone syncs with, you may have a better chance of getting the data and performing your forensic analysis of the phone while actually working on the computer where the backups are stored. This is what undercuts iOS’s ability to thwart administrators and law enforcement from performing a forensic analysis.

Read more: Apple IOS Forensic Primer http://www.sooperarticles.com/technology-articles/mobile-computing-articles/apple-ios-forensic-primer-1453263.html#ixzz40dsmaebc

Google CEO Sides With Apple And Tim Cook, Opposes FBI’s Demand For iPhone Backdoor

Google’s CEO Sundar Pichai has joined a number of other high-profile individuals in expressing his opinion on the FBI’s request for Apple to provide backdoor access to an iPhone 5c that forms part of the San Bernardino shooting case. A federal judge has ruled that Apple must indeed assist law enforcement in granting access to a seized iPhone 5c that belonged to one of the shooters accused of killing 14 people in California. Commenting on the situation on social media, Sundar Pichai called it a “troubling precedent.”

If you weren’t privy to the whole situation, then it’s probably worth noting that Apple’s CEO Tim Cook almost instantly responded to the ruling with a public and open message to Apple’s customers. In addition to providing a little insight into the ruling and how it came about, Cook also took the opportunity to inform the customers that Apple would be contesting the ruling, claiming that the FBI essentially wants Apple’s engineers to create a new version of iOS that comes with the ability to circumvent very specific security features (read: backdoor access). Cook clearly doesn’t want to have to build in a backdoor to the iPhone or iPad.



Google’s CEO didn’t instantly get involved in the situation, but has since posted a series of tweets which show that he sides with Tim Cook and Apple as a whole. Most notably, Pichai’s five tweets on the predicament argued that Apple’s acceptance of the ruling, if that was indeed the company’s stance, “could compromise users’ privacy.” He also stated publicly that providing access to data on the basis of a valid legal order is “wholly different than requiring companies to enable hacking of customer devices & data.” It’s difficult to disagree with those views.

Of course, not everyone weighing in with an opinion on the San Bernardino iPhone situation is fully accepting of Apple’s stance on the ruling. Republican candidate, and general worldwide laughing stock, Donald Trump predictably doesn’t agree with Tim Cook’s decision to resist the order, stating that he agrees “100 percent with the courts” and asking of Apple, “Who do they think they are?”

We’re pretty sure that the public backing of a fellow CEO in the position of Pichai carries a whole lot more importance than the negativity of Mr. Trump.


The FBI vs. Apple – thoughts and comments

Not trying to provide the full story here, just a few thoughts and directions as to security, privacy and civil rights. (For the backdrop, Tim Cook’s letter explains it best: https://www.apple.com/customer-letter/)

From a technical perspective, Apple is fully capable of alleviating a lot of the barriers the FBI is currently facing with unlocking the phone (evidence) in question. It is an iPhone 5C, which does not have the enhanced security features implemented in iPhones from the 5S onward (the secure enclave – see Dan Guido’s technical writeup here: http://blog.trailofbits.com/2016/02/17/apple-can-comply-with-the-fbi-court-order/).

Additionally, when dealing with more modern versions, it is also feasible for Apple to provide updates to the secure enclave firmware without erasing the contents of the phone.

But from a legal perspective we are facing not only a slippery slope but a cliff, as someone eloquently noted on Twitter. Abiding by a legal claim based on an archaic law (the All Writs Act, originally part of the Judiciary Act of 1789), coupled with a just-as-shaky probable cause claim, basically opens the door for further requests that will build on the precedent set here if Apple complies with the court’s order.
One can easily imagine how “national security” (see how well that worked out with the PATRIOT Act) will be used to trump civil rights and provide access to anyone’s private information.

We have finally reached a time where technology, which was an easy crutch for law enforcement to rely on, is no longer there to enable spying (legal or otherwise) on citizens. We are back to a time when actual hard work needs to be done in order to act on suspicions, and real investigations have to take place. Where HUMINT is back on the table, and law enforcement (and non-LE forces) have to step up their game and, again, do proper investigative work.

Security is obviously a passion for me, and supporting (and sometimes helping) its advance in order to provide everyone with privacy and comfort has been my ethic for as long as I can remember dealing with technology, security, and privacy. So is national security and the pursuit of anything that threatens it, and I don’t need to show any credentials for either.

This is an interesting case, where these two allegedly face each other. But it’s a clear cut from where I’m standing. I’ve said it before, and I’ll say it again: Tim Cook and Apple drew a line in the sand. A very clear line. It is a critical time now to understand which side of the line everybody stands on. Smaller companies that lack Apple’s legal and market forces, and which have so far bent over to similar “requests” from the government, can find solace in a market leader drawing such a clear line. Large companies (I’m looking at you, Google!) should also make their stand very clear – to support that line. Crossing that line means taking a step further towards being one of the regimes we protect ourselves from. Dark and dangerous ones, which do not value life, and which treat people differently based on their social, financial, or racial standing, their gender, or their beliefs. That’s not where or who we want to be.

Or at least I’d like to think so.