Tag Archives: Apple

People are talking about hackers ‘ransoming’ Apple — here’s what’s actually going on

If you don’t want to be hacked, don’t use the same password across different services.

And if you’re an Apple user, it’s a good idea to check your Apple ID and iCloud account today to make sure it’s secured with a long, unique password.

On Wednesday, a hacking group calling itself the Turkish Crime Family told Business Insider that it had about 600 million iCloud passwords it would use to reset users’ accounts on April 7.

Apple told Business Insider in a statement that if the hackers had passwords, they did not come from a breach of Apple systems:

“There have not been any breaches in any of Apple’s systems including iCloud and Apple ID. The alleged list of email addresses and passwords appears to have been obtained from previously compromised third-party services.

“We’re actively monitoring to prevent unauthorized access to user accounts and are working with law enforcement to identify the criminals involved. To protect against these type of attacks, we always recommend that users always use strong passwords, not use those same passwords across sites and turn on two-factor authentication.”

It is still possible that the group has some users’ passwords. Information from several large breaches, including those of Yahoo and LinkedIn, has spread across the internet in recent years. If an Apple user has the same password and email for, say, LinkedIn and iCloud, there’s a good chance that iCloud password is already publicly available.

Here’s what you can do to protect yourself:

Turn on two-factor authentication. That means when you log in to your iCloud account, you’ll be asked to enter a six-digit code sent to your phone. It’s annoying, but it’s the best way to ensure that your account remains your own.
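Apple delivers those codes through your trusted devices rather than a shared secret, but the general mechanism behind rotating six-digit codes can be sketched with the time-based one-time password (TOTP) algorithm most authenticator apps use. A minimal Python sketch, with a made-up secret; this illustrates the concept, not Apple’s implementation:

import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    # RFC 6238: HMAC the current 30-second time step with a shared secret,
    # then truncate the digest down to a short numeric code.
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # made-up secret; prints a fresh code every 30 seconds

Because the code changes constantly, a stolen password alone is not enough to log in.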

Don’t use the same password for multiple services. If one of your accounts is hacked or breached, hackers can essentially access all your accounts that used the same password. Make sure to use a different password for your Apple ID and your email account — here’s how to change your Apple ID password and how to check if your password may already be public.
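That last check is commonly done against the Have I Been Pwned password API, which uses k-anonymity: only the first five characters of your password’s SHA-1 hash ever leave your machine. A small sketch, assuming the third-party requests package:

import hashlib
import requests  # pip install requests

def times_seen_in_breaches(password: str) -> int:
    sha1 = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    # The API returns every leaked hash starting with the 5-character prefix,
    # so the service never learns which password you were checking.
    resp = requests.get("https://api.pwnedpasswords.com/range/" + prefix, timeout=10)
    resp.raise_for_status()
    for line in resp.text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

print(times_seen_in_breaches("password123"))  # a reused password shows a large count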

Make sure your password is long, random, and unique. Don’t use your name, birthday, or other common words.
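If you’d rather not invent one yourself, a few lines of Python using the standard library’s cryptographically secure secrets module will do it (a generic sketch; any reputable password manager does the same thing):

import secrets
import string

def generate_password(length: int = 20) -> str:
    # Each character is drawn independently from a large alphabet using a
    # cryptographically secure source, so the result contains no names,
    # birthdays or dictionary words.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())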

Why this matters now

Over the past few days, the Turkish Crime Family has given media outlets shifting figures, saying it has 200 million, 250 million, 519 million, or as many as 750 million Apple ID credentials culled from breaches of other services.

The hacking group also said it had been in contact with Apple and was demanding $75,000 in cryptocurrency such as bitcoin, or $100,000 in Apple gift cards.

If Apple did nothing, it would “face really serious server issues and customer complaints” in an attack on April 7, a member of the hacking group told Business Insider in an email. They said they were carrying out the attack in support of the Yahoo hacking suspect.

A report from Motherboard said the group had shown the outlet an email from one of the hackers to an Apple product-security specialist that discussed the ransom demands. That email is fake, a person with knowledge of Apple’s security operations told Business Insider.

Apple is in contact with law enforcement about the ransom demand, the person said. Apple is unsure whether the group’s claims are true, but people at the company say they doubt they are.

There are other reasons to doubt the hackers’ claims, such as their thirst for publicity and their fluid story.

But even if the hackers are telling the truth, Apple users can protect themselves by making sure their Apple ID password is unique and hasn’t been revealed in a previous breach.

“A breach means nothing in 2017 when you can just pull the exact same user information in smaller scales through companies that aren’t as secure,” the group purportedly said in a post on Pastebin in response to Apple’s statement.

The best thing you can do to ensure this does not happen to you is change your passwords.


Scary iPhone malware that steals your data is a reminder no platform is ever safe


If you haven’t done so already, go and update your iPhone, iPad or iPod touch to iOS 9.3.5 right now. To update, go to Settings > General > Software Update.

It may not seem urgent because it’s only a “point release,” but the update is crucial: without it, you risk having all of your data secretly stolen by invisible malware that can install itself on your device and later remove itself without leaving any trace.

Two reports from the New York Times and Motherboard published on Thursday detail how three major security holes, patched via the update, could be exploited by hackers to track and steal practically all of the private data on your iOS device.

According to both reports, Ahmed Mansoor, a human rights activist from the United Arab Emirates, discovered the vulnerabilities when he received a suspicious text message with a link that would have provided “new secrets about torture of Emiratis in state prisons.”

Had Mansoor clicked on the link, he would have been directed to a website that would have exploited all three security holes and installed malware onto his iPhone, giving remote hackers full access to his device.

Thankfully, Mansoor didn’t click the link. Instead, he alerted Citizen Lab, an interdisciplinary lab based at the Munk School of Global Affairs at the University of Toronto that focuses its research on the intersection of human rights and security.

Citizen Lab identified the link as belonging to NSO Group, an Israel-based “cyberwar” company that sells spyware to government agencies and is reportedly owned by the American venture capital firm Francisco Partners Management.

Along with additional research from cybersecurity firm Lookout, it has been revealed that the three exploits (dubbed “Trident”) are “zero-day” vulnerabilities, meaning they were unknown to Apple and unpatched when used. In this case, once the link is opened, the malware automatically installs itself and starts tracking everything.

“Once infected, Mansoor’s phone would have become a digital spy in his pocket, capable of employing his iPhone’s camera and microphone to snoop on activity in the vicinity of the device, recording his WhatsApp and Viber calls, logging messages sent in mobile chat apps, and tracking his movements,” write Bill Marczak and John Scott-Railton, two Citizen Lab senior researchers.

According to Lookout, the software is highly flexible and can be configured in a number of ways to target different countries and apps:

The spyware capabilities include accessing messages, calls, emails, logs, and more from apps including Gmail, Facebook, Skype, WhatsApp, Viber, FaceTime, Calendar, Line, Mail.Ru, WeChat, SS, Tango, and others. The kit appears to persist even when the device software is updated and can update itself to easily replace exploits if they become obsolete.

Upon discovery, the two organizations notified Apple, and the iPhone maker immediately got to work on iOS 9.3.5, which was released on Thursday.

Though Trident and the type of malware NSO sells (called “Pegasus”) are mainly used by governments to target dissidents, activists and journalists in volatile countries like the United Arab Emirates, Mexico, Kenya, Mozambique, Yemen and Turkey, they can be used to target any iOS device.

The very idea of having all your data stolen without any real effort should scare everyone into updating their iOS devices.

As we’ve entrusted our smartphones and tablets with more and more of our personal data, it’s more important than ever to always be running the latest software with the most up-to-date security patches to prevent digital spying and theft.

Quicker to protect iOS than Android

It took 10 days for Apple to release an update to close the holes after Citizen Lab and Lookout alerted the company.

Ten days may seem like a long time, but when you compare it to how long it would take for Android devices to get updated for such a critical patch, it’s like hyper speed.

One of the benefits of iOS is its tightly-integrated software and hardware. Because there are fewer devices and they all run the same core software, Apple can test and deploy security updates quickly and easily with fewer chances of something going wrong.

Android, on the other hand, is fragmented into tens of thousands of distinct devices, customized in more versions than even the most diehard Android fan can remember. This makes it extremely challenging for phone makers to test and release updates to plug dangerous security holes quickly.

Google’s Nexus devices are quicker to get software updates because they all run stock Android and Google can push them out in a similar way to Apple. Same goes for Samsung and its Galaxy phones.

But there’s often little incentive for Android phone makers to update their devices. Software maintenance is costly and that’s why you’ll see many Android devices from lesser-known brands either update their phones months or years later or never at all.

No platforms are ever truly secure

The publication of these security flaws, and how serious the consequences could be for anyone who fell victim, invites another conversation: media portrayal.

Android bears the brunt when it comes to being portrayed as the less secure platform, but as this episode shows, no matter which platform is really more secure, all platforms are susceptible to hackers.

Security is an ongoing and never-ending battle between phone makers like Apple and Google and hackers. It’s a constant cat-and-mouse game where each side is always one step ahead or behind the other.

Had Mansoor not alerted Citizen Lab, the Trident exploit would have continued to exist without anyone knowing. Lookout believes the malware has existed since iOS 7. NSO Group’s Pegasus malware can also be used to target Android and BlackBerry devices.

While no platform will ever be truly secure, updating to the latest version of your phone’s software is the best way to remain safe.


Police Want to 3D Print a Dead Man’s Fingers to Unlock His Phone


Asking Apple to help break an iPhone is so three months ago. Police have a new, and higher-tech idea: 3D print the fingers of a dead man and use those fingerprints to unlock the phone instead.

Michigan State University professor Anil Jain—who has been assigned six U.S. patents on fingerprint recognition—told Fusion that police showed up at his lab to ask for help in catching a murderer in an ongoing investigation. They had scans of the victim’s fingerprints from a previous arrest and thought that unlocking his phone (the make and model weren’t divulged) might provide clues as to who killed him.

Jain and his PhD student Sunpreet Arora have already printed all 10 digits using the scans and coated them in a layer of metallic particles to mimic how conductive skin is and make the prints easier to read. The final 3D-printed fingers aren’t finished, but they’ll be ready for police to try out in a matter of weeks.

It’s possible that the whole move will be futile, because many phones that use biometric data require a PIN to be entered if the device hasn’t been unlocked in two days. If that’s the case, a fingerprint won’t unlock anything.

The legality of this move is still up in the air, but the case is further proof that fingerprints, while cool, are not really the safest way of securing our private data.

Not that it matters for a dead man, but in 2014 a judge ruled that suspects can be required to unlock a phone with a fingerprint.  While the Fifth Amendment protects the right to avoid self-incrimination and makes it illegal to force someone to give out a passcode, biometric indicators like fingerprints are not covered by the Fifth Amendment, according to the ruling.

Maybe it’s time to go back to a 6-8 digit PIN.

Apple hires Encryption Expert to Beef Up Security on its Devices

The FBI and other law enforcement agencies have waged legal war on encryption and privacy technologies.

You may have heard many news stories about the legal battle between Apple and the FBI over unlocking an iPhone that belonged to the San Bernardino shooter. However, that was just one battle in a much larger fight.

Now, in an effort to make the iPhone resistant to both surveillance and hacking, Apple has rehired security expert and cryptographer Jon Callas, who co-founded PGP, the widely used email encryption software, and Silent Circle, the secure-communications company that sells the Blackphone.

This is not Apple’s first move to harden iPhone security.

Just a few months back, the company hired Frederic Jacobs, one of the key developers of Signal, the open source encrypted messaging application widely regarded as among the most secure.

Now Apple has rehired Callas, who has previously worked for Apple twice, first from 1995 to 1997 and then from 2009 to 2011.

During his second stint, Callas designed a full-disk encryption system to protect data stored on Macintosh computers.

Apple’s decision to rehire Callas comes amid rumors that the company is working to improve the security of its iOS devices to the point that even Apple itself cannot break into them.

“Callas has said he is against companies being compelled by law enforcement to break into their own encrypted products,” the report reads.

“But he has also said he supports a compromise proposal under which law enforcement officials with a court order can take advantage of undisclosed software vulnerabilities to hack into tech systems, as long as they disclose the vulnerabilities afterward so they can be patched.”

Earlier this year, Apple was engaged in a battle with the US Department of Justice (DoJ) over a court order asking the company to help the FBI unlock iPhone 5C of San Bernardino shooter Syed Farook.

Essentially, the company was being compelled to create a special, backdoored version of iOS so that the FBI could brute-force the passcode on Farook’s iPhone without losing the data stored on it.

Apple refused to comply, and it now reportedly wants to remove its own ability to break iPhone security in future models, eliminating any basis for government and intelligence agencies to demand backdoors.


The end of the iPhone encryption case and the questions we must ask


It is official. The FBI has accessed the San Bernardino iPhone, and they didn’t need Apple’s help. To quote the court document, found at:

https://assets.documentcloud.org/documents/2778264/Apple-Status-Report.pdf

“Applicant United States of America, by and through its counsel of record, the United States Attorney for the Central District of California, hereby files this status report called for by the Court’s order issued on March 21, 2016. (CR 199.) The government has now successfully accessed the data stored on Farook’s iPhone and therefore no longer requires the assistance from Apple Inc. mandated by Court’s Order Compelling Apple Inc. to Assist Agents in Search dated February 16, 2016. Accordingly, the government hereby requests that the Order Compelling Apple Inc. to Assist Agents in Search dated February 16, 2016 be vacated.”

More questions than I can put down here come to mind, but here are a few:

Was the FBI genuine when it filed initially, claiming they had no way to access the San Bernardino iPhone without Apple’s help?

If they were not genuine, and that seems to be the prevailing view in the technical field, was this behaviour becoming of, or acceptable from, law enforcement? The simplified timeline of the case: the FBI sought its court order; Apple said it would fight it; public opinion turned on the FBI; it appeared the legal argument might not stand up to challenge; the FBI sought a stay while it tested a new way to get into the phone itself; it then filed the statement above claiming it had accessed the phone and asked for the order to be vacated. At face value, the timing of the stay seems very convenient.

Since the net result is as if the FBI had never gone to court at all (Apple did not render assistance, the FBI got into the phone anyway, and no legal precedent was set), was this a good use of taxpayer funds?

Will the FBI tell Apple how they got into the phone? If they won’t, on national security grounds, is it acceptable that Apple customers remain vulnerable to attacks that can happen in the wild because of some intangible threat that cannot be measured?

Did the FBI find anything of value?

What do dormant cyber pathogens look like?

http://arstechnica.com/tech-policy/2016/03/what-is-a-lying-dormant-cyber-pathogen-san-bernardino-da-wont-say/

It’s important we ask these questions, because if we don’t, we run the risk of setting our own precedent: normalising dishonesty, vexatious use of the court system, waste of taxpayer funds, leaving the general public unsafe, and the utterance of wild claims, all in the name of national security.

National security should not be doing this to us.

Mobile Forensics Firm to Help FBI Hack Shooter’s iPhone


Israel-based mobile forensics firm Cellebrite is believed to be the mysterious “outside party” that might be able to help the FBI hack the iPhone belonging to the San Bernardino shooter.

Israeli newspaper Yedioth Ahronoth broke the news, which appears to be confirmed by a $15,000 contract signed by the FBI with Cellebrite on March 21, the day the agency announced that it may have found a way to crack San Bernardino shooter Syed Rizwan Farook’s iPhone without Apple’s help.

The FBI convinced a judge in mid-February to order Apple to create special software that would allow the law enforcement agency to brute-force the PIN on Farook’s iPhone 5C without the risk of destroying the data stored on it.

Apple, backed by several other technology giants, has been preparing to fight the order, which it believes would set a dangerous precedent.

Just as the US government and Apple were about to face each other in court, the FBI announced on Monday that it may no longer need Apple’s help in cracking the phone. Federal prosecutors later cancelled the hearing set for Tuesday, stating that the FBI will be aided by an unidentified “outside party.”

That “outside party” appears to be Cellebrite, which has been working with the FBI since 2013. The company’s website says it has assisted law enforcement investigations in several countries over the years.

“Cellebrite mobile forensics solutions give access to and unlock the intelligence of mobile data sources to extend investigative capabilities, accelerate investigations, unify investigative teams and produce solid evidence,” the company writes on its official site.

Experts have suggested several methods that could be used to gain access to the data on the San Bernardino shooter’s iPhone, including ones involving acid and lasers, but they didn’t appear to be very practical.

After the FBI announced that it might have found a practical alternative, iOS forensics expert Jonathan Zdziarski published a blog post describing some of the likely methods that might be used to accomplish the task.

The expert believes the technique that will be used has likely already been developed, as the FBI says it only needs two weeks to test the proposed method.

Zdziarski believes the company that will aid the FBI will either use a software exploit or a hardware technique known as NAND mirroring.

“This is where the NAND chip is typically desoldered, dumped into a file (likely by a chip reader/programmer, which is like a cd burner for chips), and then copied so that if the device begins to wipe or delay after five or ten tries, they can just re-write the original image back to the chip,” the researcher explained. “It’s possible they’ve also made hardware modifications to their test devices to add a socket, allowing them to quickly switch chips out, or that they’re using hardware to simulate this chip so that they don’t have to.”

“My gut still tells me this is likely a NAND hardware technique. A software exploit doesn’t scale well. I know this because my older forensics tools used them, and it required slightly different bundles for every hardware and firmware combination. Some also work against certain versions, but not against others,” he noted.
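The mirroring idea Zdziarski describes is simple enough to model in a toy sketch: snapshot the flash contents (which include the failed-attempt counter), guess until the wipe limit is near, then write the snapshot back and keep guessing. The Python below is purely illustrative; the ToyPhone class stands in for desoldering and re-flashing a real NAND chip.

import copy

class ToyPhone:
    """Stand-in for a device whose NAND flash stores a failed-attempt counter."""
    WIPE_LIMIT = 10

    def __init__(self, pin: str):
        self.nand = {"pin": pin, "failed_attempts": 0, "wiped": False}

    def try_pin(self, guess: str) -> bool:
        if self.nand["wiped"]:
            return False
        if guess == self.nand["pin"]:
            return True
        self.nand["failed_attempts"] += 1
        if self.nand["failed_attempts"] >= self.WIPE_LIMIT:
            self.nand["wiped"] = True  # the auto-erase the FBI was worried about
        return False

phone = ToyPhone(pin="7391")
snapshot = copy.deepcopy(phone.nand)            # "dump the chip into a file"

for candidate in range(10_000):
    guess = f"{candidate:04d}"
    if phone.try_pin(guess):
        print("PIN recovered:", guess)
        break
    if phone.nand["failed_attempts"] >= ToyPhone.WIPE_LIMIT - 1:
        phone.nand = copy.deepcopy(snapshot)    # "re-write the original image back"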

Zdziarski believes that if the technique already exists, it has likely been sold privately for well over $1 million.

Obama’s Call for Encryption ‘Compromise’ Is Hypocritical


During his keynote speech at South By Southwest, President Barack Obama addressed the ongoing debate over encryption. Although he declined to discuss the specifics of the San Bernardino case, in which Apple is currently fighting a court order to hack its own device, the president spoke in more general terms about privacy and security. Obama joined several other political figures in calling for the tech industry to enable expanded law enforcement access to encrypted data.

Obama also advocated for the use of encryption by the government, saying that the technology is crucial to preventing terrorism and protecting the financial and air traffic control systems. But the president argued that ordinary citizens also need to expect some intrusion into their phones in order to ensure a safe society. Obama compared the weakening of encryption to going through security at the airport: an intrusive process, but a necessary sacrifice for citizens to make. (Obama’s own devices are, of course, secured with strong encryption.) In his speech, Obama said:

So we’ve got two values, both of which are important. And the question we now have to ask is, if technologically it is possible to make an impenetrable device or system where the encryption is so strong that there’s no key. There’s no door at all. Then how do we apprehend the child pornographer? How do we solve or disrupt a terrorist plot? What mechanisms do we have available to even do simple things like tax enforcement? Because if, in fact, you can’t crack that at all, government can’t get in, then everybody’s walking around with a Swiss bank account in their pocket. So there has to be some concession to the need to be able get into that information somehow.

Obama said the tech community should “balance these respective risks,” suggesting that the industry had not been proactive enough in compromising on encryption and that, if it failed to compromise, it risks being cut out of the conversation entirely by Congress. “I’m confident that this is something we can solve, but we’re going to need the tech community, software designers, people who care deeply about this stuff, to help us solve it,” Obama said. He added:

Because what will happen is, if everybody goes to their respective corners, and the tech community says, ‘You know what, either we have strong perfect encryption, or else it’s Big Brother and Orwellian world,’ what you’ll find is that after something really bad happens, the politics of this will swing and it will become sloppy and rushed and it will go through Congress in ways that have not been thought through. And then you really will have dangers to our civil liberties, because the people who understand this best and who care most about privacy and civil liberties have disengaged, or have taken a position that is not sustainable for the general public as a whole over time.

In Obama’s telling, the tech industry is painted as a spoiled child who runs back to his corner and disengages with the debate, snatching up his toys and taking them back to his mansion when he realizes he doesn’t like the way the game is being played. It’s a compelling image, and one that the industry, which is widely perceived as elitist and uninclusive, will have a tough time combatting.

But the industry has compromised on this issue, collaborating with law enforcement to provide access to data for criminal prosecutions. In the San Bernardino case, Apple has provided access to iCloud backups of the shooter’s phone and offered suggestions on how to create additional backups before it was revealed that the shooter’s iCloud password had been reset at the behest of the FBI.

Tech companies also routinely provide unencrypted metadata to law enforcement, which can provide a detailed portrait of a suspect’s life: where he’s been, where he is currently, who he communicates with, how regularly he communicates with others and how long the conversations last.

The government also wields a powerful investigative tool in CALEA (the Communications Assistance for Law Enforcement Act). CALEA compels service providers like AT&T and Verizon to build backdoors into their systems to allow for real-time monitoring of suspects by law enforcement.

Yet another instance of compromise is Apple’s encryption of iCloud. As security expert Jonathan Zdziarski pointed out in a post on his blog, iCloud offers an example of the type of “warrant-friendly” encryption that Obama called for in his SXSW keynote.

“I suspect that the answer is going to come down to how do we create a system where the encryption is as strong as possible. The key is as secure as possible. It is accessible by the smallest number of people possible for a subset of issues that we agree are important,” Obama said. His suggestion for solving the encryption debate mirrors the solution Apple has already developed for securing iCloud data: that data is encrypted, but Apple maintains access so that it can comply with warrants.

But, Zdziarski notes, the 2014 hack of celebrities’ iCloud accounts illustrates the dangers of “compromise” encryption.

“The iCloud’s design for ‘warrant friendliness’ is precisely why the security of the system was also weak enough to allow hackers to break into these women’s accounts and steal all of their most private information,” Zdziarski wrote. “The data stored in iCloud is stored in a weaker way that allows Apple to service law enforcement requests, and as direct result of this, hackers not only could get into the same data, but did. And they did it using a pirated copy of a law enforcement tool—Elcomsoft Phone Breaker.”

Obama mentioned this particular concern in his speech. “Now, what folks who are on the encryption side will argue, is any key, whatsoever, even if it starts off as just being directed at one device, could end up being used on every device. That’s just the nature of these systems,” he said. “That is a technical question. I am not a software engineer. It is, I think, technically true, but I think it can be overstated.”

Obama is right—it’s technically true that any key can end up being used on every device.

The president isn’t the only politician to call for compromise on encryption and he certainly won’t be the last, but what the FBI is asking for in the San Bernardino case (and beyond it) isn’t compromise—it’s total compliance. Compromise suggests that tech companies and law enforcement agencies will meet in the middle, each conceding some of their demands in order to find common ground. The industry has made an effort to do so by providing metadata, real-time surveillance, and data backups to law enforcement.

But Obama’s comments suggest that none of this information is enough—encryption needs to be completely backdoored in order for there to be “compromise.” If the government refuses to acknowledge the concessions that have been made and continues to demand universal access to encrypted data while clinging onto strong encryption for itself, there is no compromise at all. It’s just the government getting exactly what it wants, snatching up all its toys and heading back to its mansion.

Apple v. FBI: How to Sound Smart about Encryption


Apple v. FBI has started a serious debate about the line between security and privacy. The FBI says this is a case about the contents of one specific iPhone 5c. Apple says this is a case about securing data for everyone.

No one seems to want to have a civil, Socratic discussion about what it means to evolve the governance of a digital democracy. Instead, most people want to voice their opinions about terrorism, the law, and Apple. People also want to know if this particular iPhone 5c (or any iPhone) can be hacked, and if offers to hack it from white hat hackers, such as John McAfee, are real.

The Apple v. FBI subject device, an iPhone 5c, can be hacked. This is true because of iOS 8 (the operating system running on the subject device) and the way all iPhone 5c’s were manufactured. Current vintage iPhones (5s, 6, 6s) could not be hacked the same way, so we should not be talking about this particular phone; we should be talking about encryption writ large, and how it is used in our daily lives.

What Is Encryption?

Encryption is the process of using algorithms to encode information with the specific goal of preventing unauthorized parties from accessing it. For digital communication, there are two popular methods of encryption: symmetric key and public key.

  • Symmetric key encryption requires both the sending and receiving parties to have the same key – hence the term “symmetric.”
  • Public key encryption is far more popular because the encryption key is publicly available, but only the receiving party has access to the decryption key. (Both methods are sketched in code just after this list.)
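Here is that sketch, using the third-party Python cryptography package; a generic illustration, not the scheme used by any product discussed in this article:

# pip install cryptography
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Symmetric: the same shared key both encrypts and decrypts.
shared_key = Fernet.generate_key()
f = Fernet(shared_key)
token = f.encrypt(b"meet at noon")
assert f.decrypt(token) == b"meet at noon"

# Public key: anyone can encrypt with the public key, but only the
# holder of the private key can decrypt.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = private_key.public_key().encrypt(b"meet at noon", oaep)
assert private_key.decrypt(ciphertext, oaep) == b"meet at noon"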

How Can There Be Such a Thing as a “Public” Encryption Key?

One of the most popular ways to create public encryption keys is to use a mathematical problem known as prime factorization (aka integer factorization). You start with two relatively large prime numbers. (Quick 6th Grade Math Refresher: A prime number is only divisible by 1 and itself.) Let’s call them P1 and P2. When you multiply them, the product is a composite number we’ll call “C.”

(P1 x P2 = C)

C is a very special number with very special properties. It’s called a semiprime number. Semiprime numbers are only divisible by 1, themselves and the two prime factors that made them. This special property enables the number to be used for public key encryption.

You use C for the public key and you keep P1 and P2 as the private key pair. While it is very easy to generate C, if the number is large enough and thoughtfully generated, it can take thousands, millions or even billions or trillions of tries to factor. (There are mathematical strategies to speed up the process, but in practice, prime factoring must be done by trial and error.)
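A toy worked example makes the asymmetry concrete. Take P1 = 1009 and P2 = 3643, both prime. Multiplying them is instant: C = 3,675,787. Recovering P1 and P2 from C already takes hundreds of trial divisions, and real keys use primes hundreds of digits long:

def trial_factor(c: int):
    # Recover the two prime factors of a semiprime by brute-force trial division.
    if c % 2 == 0:
        return 2, c // 2
    d = 3
    while d * d <= c:
        if c % d == 0:
            return d, c // d  # found P1; the cofactor is P2
        d += 2
    return None

p1, p2 = 1009, 3643        # the private key pair
c = p1 * p2                # the public value: 3675787
print(trial_factor(c))     # (1009, 3643), after roughly 500 odd divisors tried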

Pretty Good Privacy, the Encryption We Mostly Use

The OpenPGP standard is one of the most popular versions of public key encryption, aka Pretty Good Privacy or PGP. There is a very good chance that your corporate IT department uses some version of PGP to encrypt your files – after all, it’s pretty good.

How good? Using current computer technology, a file encrypted with 2048-bit OpenPGP encryption cannot be decrypted without the key. Someday it might be possible with a fully functional quantum computer, but those are still, for all practical purposes, theoretical devices.

Now, you’re going to push back with an argument that goes something like this: “Hey Michael, you may think that a file encoded with 2048-bit OpenPGP encryption is unbreakable, but you don’t know that for sure. You have no idea what the NSA can or cannot do! How do you know that quantum computers don’t exist? Nothing is impossible!”

Yeah … no. 2048-bit OpenPGP encryption can’t be decrypted without a key because of the way computers work today. In the future, with new hardware and processor and bus speeds that are currently undreamt of, the computation may be able to be done in reasonable time – but not today. Without your private key, the computational time required to break a 2048-bit key in a secure SSL certificate would take over 6.4 quadrillion years.

How Can the “Now Famous” iPhone 5c Be Hacked?

For the iPhone 5c in question, you don’t need to hack the encryption key; you need to “make” the encryption key. It is generated from a combination of the user-created PIN or password and a unique key that Apple embeds in each iPhone 5c when it is manufactured. The FBI is asking Apple to create a new operating system with the ability to disable certain security protocols – specifically to defeat the limit on failed passcode attempts and to remove the delay caused by failed attempts. With this new weaker security protocol and forensic software written to try every possible PIN or password combination, the FBI hopes to regenerate the unique key required to open the phone.
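A rough model of that brute-force step, with the device key and derivation parameters invented purely for illustration (Apple’s real key derivation is more involved and runs inside the hardware):

import hashlib

DEVICE_UID = bytes.fromhex("00" * 16)  # stand-in for the unique key embedded in each phone

def passcode_key(pin: str) -> bytes:
    # Entangle the user's PIN with the device-unique key; the PBKDF2
    # parameters here are illustrative, not Apple's.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, 10_000)

target = passcode_key("4529")          # the key that unlocks our toy phone

# With the failed-attempt limit and delays disabled, a 4-digit PIN falls
# to at most 10,000 guesses.
for candidate in range(10_000):
    guess = f"{candidate:04d}"
    if passcode_key(guess) == target:
        print("passcode found:", guess)
        break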

It is important to note that this whole idea is only possible on iPhones without a secure-enclave processor (the 5c and older models) running iOS 8 or earlier. iPhones with fingerprint scanners, such as the 5s, 6 and 6s, use a second processor called the “secure enclave.” Even Apple can’t hack an iPhone that includes a secure enclave processor – not without creating a “backdoor.”

This is what Apple is worried about. You should be too. If the government served Apple with a lawful writ or subpoena to deliver the key to an iPhone 6s, it would not be able to comply. This case asks the question, should the government be allowed to compel any company that creates a digital security product to create a “backdoor” and make it available for any reason (lawful or other)?

The important thing about an iOS 9 “backdoor” in Apple’s case is that it could not be guessed or randomly generated; it would have to be an actual file – a metaphorical “skeleton key.” There’s a problem with skeleton keys, even digital ones: they can be copied. Importantly, they can be copied or stolen without the owner’s knowledge. Creating a “skeleton key” defeats the purpose of encryption in the first place. If a key exists, it will be copied by both good and bad actors – that’s just a fact of digital life.

So again, I find myself begging you to engage in a civil, Socratic discussion about what kind of future we want to live in. Encryption enables banking (commercial and consumer) and commerce. Without it, our digital lives would be very, very different. How do you want to evolve the governance of our digital democracy? Where is the line between security and privacy? What do we want to ask our lawmakers to do? Hopefully this short story will inspire you to learn more about encryption so you can draw your own conclusions and join this techno-political debate.

This is What the Public Really Thinks About FBI vs. Apple


DOJ v. Data Encryption – Public Perception and Communications Lessons

The heated dispute between Apple and the U.S. Department of Justice (DOJ) over the iPhone used by Syed Rizwan Farook before the San Bernardino, California, mass shooting has captured attention across America and the world. While this debate now focuses on one company’s decision, the implications go well beyond the mobile sector and even the whole technology industry. Companies and other organizations of all kinds responsible for managing personal data are concerned and need to be prepared to deal with the controversy’s impact.

To help deepen understanding about this complex issue, Burson-Marsteller, with its sister research firm Penn Schoen Berland, conducted a national opinion survey from February 23-24, 2016. The survey polled 500 General Population respondents (including 230 iPhone users) and 100 National Elites (individuals earning more than $100,000 per year who have college degrees and follow the news), and the results reveal critical communications issues around the fundamental conflict between privacy on the one hand and national security and safety on the other. Here are the key takeaways:

  • Overall awareness is high. Eighty-two percent of the General Population and 88 percent of National Elites have heard about the dispute. The news has gone viral, with people tweeting and posting on Facebook about it and commenting extensively online about news articles.
  • The FBI should have access to one phone, not all phones. Respondents say the government should not be given a tool that potentially gives it access to all iPhones. Sixty-three percent of the General Population and 57 percent of National Elites say Apple should only provide the FBI with the data from the phone in question, and the tools to do it should never leave Apple’s premises. It is clear the public wants this decided on a case-by-case basis, and respondents do not trust law enforcement and national security agencies to self-police and protect privacy.
  • The public expects companies to push back if there is the potential to violate privacy. Respondents say they want companies to protect the privacy of their data fully, even when the government is requesting data in the name of law enforcement or national security. A majority (64 percent of the General Population and 59 percent of Elites) says a company’s top obligation is to protect its customers’ data rather than cooperating with law enforcement or national security interests. However, most (69 percent of the General Population and 63 percent of Elites) see the need to compromise on privacy when terrorist threats are involved.
  • How the issue is framed determines public opinion. If the issue is framed as the FBI asking for access to this one phone, 63 percent of the General Population and 57 percent of Elites agree with the FBI position. If the issue is framed as potentially giving the FBI and other government agencies access to all iPhones, Apple’s position prevails overwhelmingly; 83 percent of the General Population and 78 percent of Elites agree Apple should either only grant access to the particular iPhone or refuse the request entirely.
  • Current laws are outdated. This situation reflects a much broader debate about privacy and security that will need to be resolved. About half (46 percent of the General Population and 52 percent of Elites) say current laws are outdated and need to be revised to reflect the changing role of technology in today’s society.

Regardless of the outcome of this current dispute, there is no question it is raising alarms about the state of data privacy. In the aftermath, companies will have to pay increasing attention to the expectations of their customers and consumers. The survey showed people are overwhelmingly concerned with the security and privacy of their digital data, with 90 percent of the General Population and 96 percent of National Elites saying they are very or somewhat concerned about the security and privacy of their personal information online or on their personal electronic devices. The Apple/DOJ dispute appears to be a turning point for all organizations trying to balance the demands of data privacy with national security and law enforcement considerations. The pressures on them are only going to grow.


If Amazon were in Apple’s position, would it unlock its cloud for the feds?

There’s an easy way to protect your data in the cloud.

As Apple continues to resist FBI demands to unlock a terrorist suspect’s phone, it raises a question: What if Amazon Web Services was ordered to provide access to a customer’s cloud? Would AWS hand the data over to the feds?


Amazon’s terms of service provide us a clue. AWS says it complies with legally binding orders when compelled to do so. Here’s a statement from Amazon’s FAQ on cloud data privacy (which is not written specifically about the Apple-FBI issue):

“We do not disclose customer content unless we’re required to do so to comply with the law or a valid and binding order of a governmental or regulatory body. Governmental and regulatory bodies need to follow the applicable legal process to obtain valid and binding orders, and we review all orders and object to overbroad or otherwise inappropriate ones.”

Most of the time, when ordered to hand over data, Amazon does so. In 2015 AWS received 1,538 subpoenas from law enforcement officials, according to information the company recently began making public. Just over half the time (in 832 cases, or 54% of the time) AWS complied fully with those orders. Another quarter of the time (in 399 cases) Amazon partially responded to the request for information, while in the remaining 20% of cases AWS did not respond to the subpoena.


For customers who are concerned about Amazon handing over their data to the government, there are protections that can be put in place. “There’s a huge market focused on encrypting data stored in the cloud, and giving the customers the keys,” explains 451 Research analyst Adrian Sanabria. If customers use a third-party encryption service to scramble their data and manage the keys themselves, then even if Amazon did hand over the data to the feds, it would be useless. “Yes, it does sometimes create some issues with flexibility and breaking functionality, but it is there as an option if you want it, and (if done properly) AWS (or the government) can’t decrypt the data,” Sanabria says.
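A minimal sketch of that pattern, assuming the third-party boto3 and cryptography packages and a hypothetical bucket name. The object is encrypted before it leaves the customer’s machine, and the key never touches AWS:

import boto3                              # pip install boto3
from cryptography.fernet import Fernet    # pip install cryptography

key = Fernet.generate_key()               # customer-managed key; stored anywhere but AWS
ciphertext = Fernet(key).encrypt(b"quarterly financials")

s3 = boto3.client("s3")
s3.put_object(Bucket="example-bucket",    # hypothetical bucket name
              Key="reports/q1.enc",
              Body=ciphertext)

# A subpoena served on AWS yields only ciphertext; decryption requires
# the customer's key.
blob = s3.get_object(Bucket="example-bucket", Key="reports/q1.enc")["Body"].read()
assert Fernet(key).decrypt(blob) == b"quarterly financials"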


AWS offers several encryption options, including ones built automatically into some services, like S3, the Simple Storage Service, and others that customers manage themselves, such as the Hardware Security Module (HSM). AWS’s marketplace offers a variety of additional encryption and security services from independent software vendors.
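By contrast, the built-in options leave key custody with AWS. A one-call sketch of S3’s server-side encryption, again with a hypothetical bucket:

import boto3

s3 = boto3.client("s3")
# S3 encrypts the object at rest with keys AWS manages, which is why AWS
# could still produce plaintext in response to a binding order.
s3.put_object(Bucket="example-bucket", Key="reports/q1.txt",
              Body=b"quarterly financials",
              ServerSideEncryption="AES256")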

Amazon says it notifies customers when there’s been a request for their data to be handed over, unless there’s a compelling reason not to; for example, if it’s clear the cloud service is being used for an illegal purpose.

AWS is more stringent about not providing other types of information to the government. In the second half of 2015 alone, AWS received 249 “National security requests” but did not comply with any of them. AWS also received 78 requests from non-U.S. entities, the vast majority of which (60) the company did not respond to.

AWS did not respond to a request to comment on this story.

Microsoft Azure basically has the same policy, according to the company’s website, saying “We do not provide any government with direct or unfettered access to your data except as you direct or where required by law.”

Even with all the concern over providers or the government being able to access data, Sanabria estimates that only a minority of cloud users encrypt data and manage their own keys.