Monthly Archives: February 2016

When Ransomware Strikes Should You Pay or Not?

2015 was a big year for ransomware exploits, and it looks like they aren’t slowing down in 2016. Kaspersky reported that CryptoLocker attacks doubled in 2015 and that a majority of workplace PCs were attacked. The perpetrators of CryptoLocker attacks send Trojans, usually via email, that, when set free, infect a user’s PC and encrypt any files they can access. The attacker then demands money, often in the form of bitcoins, to decrypt the locked files. Attackers threaten all sorts of havoc if their demands aren’t met. As the article in NetworkWorld points out, even if their demands are met, you can’t count on your attackers honoring their part of the bargain.

I recently wrote a blog post covering a new ransomware attack on Hollywood Presbyterian Medical Center. The attackers are asking for 9,000 bitcoins, approximately $3.6 million, to return thousands of patient records intact. That’s a steep price tag for any organization, and although law enforcement agencies typically advise victims not to pay, some police departments have started succumbing to ransom demands. The more dangerous and alarming part is that, according to the FBI, which is working on this case, some attackers aren’t skilled enough to handle the malware they’ve delivered; if that’s the case here, the hospital’s data will be lost forever. As the article points out, some criminal coders can mount an attack but don’t know how to handle encryption and decryption. Researchers have reported a ransomware strain that unintentionally locked files that can now never be decrypted.

The hospital has not yet decided whether it will pay the ransom, but it is forced to handle all its records manually for the time being. According to cybersecurity experts, ransomware has proven to be a lucrative business, with Kaspersky reporting that one hacker group it researched is getting $2.5 million to $10 million for each successful attack.

In the meantime, organizations in every sector, particularly highly regulated industries like healthcare and finance, need to strengthen their security postures as much as possible. Here are some quick tips that could keep you from becoming a victim:

  • Make sure your employees are security aware and not prone to opening unfamiliar emails and attachments. If an email looks suspicious or an offer seems too good to be true, use caution. And since cyber criminals are now adept at researching employees via social media, employees should approach any unknown sender with caution.
  • Get the technology you need. There are a variety of ways evasive malware can be introduced, including piggybacking on traffic over high, hidden ports. If your security tools can’t monitor those ports, you’re asking for trouble.
  • Be sure to update your software and applications as well as your operating system. Criminal hackers often exploit known vulnerabilities in applications or operating systems that haven’t been updated.

 

Sneaky PII: What’s Hiding in Your Data?


It’s no secret that personally identifiable information (PII) and other privileged information must be removed from case data before it’s produced, in order to keep it from falling into the wrong hands. The amount of data to be reviewed prior to litigation continues to grow exponentially as more and more ESI enters discovery requests, and with more data to review comes a greater risk of accidentally disclosing PII. As the past few years have shown us, a breach of PII can have major consequences for a corporation or law firm. The problem with PII, however, is that many companies don’t sufficiently protect employee information within their own environments, and because employees’ work and personal lives increasingly overlap, there are more opportunities for PII to turn up in unexpected places that are easily overlooked.

Think for a second about where you’d expect PII to show up. You’re probably thinking of HR records, where employees’ Social Security numbers, addresses, and phone numbers are stored. PII is easy to spot when you’re checking obvious places like HR files, but personal information can crop up elsewhere just as easily when data gets collected from a broad range of sources. If an employee has a payroll issue, they might email bank account information or Social Security numbers to the payroll department. Beyond company-related communications, employees might even use their company email address rather than a personal one to send scanned tax documents to their accountant or a mortgage application for a new home. If your case requires pulling company emails between specified dates, you might inadvertently collect this information. Beyond email, employees might use the office scanner for personal documents that they then send from their personal accounts, but if that file lives on the company server, it’s at risk of entering discovery data. If no sweep is done for extraneous PII, these details will slip through the cracks and leak to opposing counsel. For this reason, it’s absolutely crucial to comb data not just for relevance and privilege, but also for PII.

It’s a slippery problem, not only because PII is ubiquitous and can easily hide in unexpected places, but because many contributing factors make it difficult to pin down and leave it at the mercy of human error. While many individuals are sensitive about their own private information, the average person has little awareness of exactly what data constitutes PII and how it can be compromised, meaning they’re probably revealing their company’s, and their own, private information unknowingly. Even if employees are hyper-aware of sensitive data, the definition of PII differs from state to state, so definitions change constantly and new regulations are implemented frequently. What wasn’t sensitive last year might be sensitive this year, and all of last year’s information is still sitting on your company servers.

PII laws are complicated and can vary widely depending on which state and country you’re in, so it’s important to have processes in place to help eliminate extraneous data. Arguing for proportionality to narrow the scope of the case will reduce the amount of unnecessary PII gathered, and using technology-assisted review and the many e-discovery platforms that can quickly find specific data patterns will dramatically reduce the time it takes to comb through files for PII. There are also products, including redaction automation tools, that can assist in identifying and excluding PII hiding in your case data. While there are many methods for securing PII, redaction is far and away the safest because it removes sensitive information completely. It cannot be recovered or decoded, so it is really the best way to eliminate risk.
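To illustrate the kind of pattern matching such tools perform, here is a minimal Python sketch. The patterns and function names are hypothetical examples, and real e-discovery platforms use far more robust detection (checksums, context analysis, NLP), so treat this as a sketch of the idea rather than a substitute for a redaction product:

```python
import re

# Hypothetical patterns for two common PII formats; real products
# detect many more types and validate matches with checksums/context.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[- ]?){3}\d{4}\b"),
}

def find_pii(text):
    """Return a list of (kind, matched_string) pairs found in the text."""
    hits = []
    for kind, pattern in PII_PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((kind, match.group()))
    return hits

def redact(text):
    """Replace every detected PII string with a fixed placeholder."""
    for pattern in PII_PATTERNS.values():
        text = pattern.sub("[REDACTED]", text)
    return text
```

Because redaction substitutes a placeholder for the original characters, nothing sensitive survives in the output, which is exactly why it eliminates risk rather than merely obscuring it.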

The sticky nature of PII means that securing it can’t be done on a case-by-case basis. It should be part of company-wide best practices, backed by a well-vetted process that ensures data is properly protected, not just for individuals but for the company as a whole. Implementing security policies and investing in redaction technologies can help you stay compliant and save time, resources, and your reputation.

Former NSA Chief Michael Hayden Sides With Apple, Though Admits ‘No Encryption Is Unbreakable’

An attendee demonstrates the new Apple Inc. iPhone 6 Plus after a product announcement at Flint Center in Cupertino, California, U.S., on Tuesday, Sept. 9, 2014. Apple Inc. unveiled redesigned iPhones with bigger screens, overhauling its top-selling product in an event that gives the clearest sign yet of the company’s product direction under Chief Executive Officer Tim Cook.
David Paul Morris/Bloomberg via Getty Images

Tim Cook’s opinion that Apple should not develop a way to hack into the encrypted phone belonging to one of the San Bernardino shooters has earned an endorsement from an unlikely source, though it comes with a big “but.” Michael Hayden, the former NSA director and CIA chief (so, a bona fide spy guy), told the Wall Street Journal that America is “more secure with unbreakable end-to-end encryption,” calling it a “slam dunk” if you view it in the scope of the “broad health” of the United States.

Hayden said FBI director James Comey‘s demand for Apple to give them a tool to break into Syed Farook’s iPhone is “based on the belief that he remains the main body, and that you should accommodate your movements to the movements of him, which is the main body. I’m telling you, with regards to the cyber domain, he’s not — you are.”

Now for that “but,” which will surely disappoint all the (temporarily pleased) civil libertarians out there. Hayden said that following a setback in the mid-nineties, when the NSA failed to convince manufacturers to adopt a cryptographic device called the Clipper chip, “we then began the greatest 15 years in electronic surveillance.” The controversial chipset was an encryption device that had a built-in backdoor in case the government needed to take a lookie-loo. But, as Hayden notes, “we figured out ways to get around the quote-unquote unbreakable encryption. Number one, no encryption is unbreakable. It just takes more computing power. Number two, the way we worked around encryption is bulk collection and metadata.”


Since 2014, Apple’s iPhones have had built-in encryption that makes it so the contents of a device can only be accessed via a phone’s passcode. The FBI’s order stipulates that Apple provide software to work only on the San Bernardino shooter’s iPhone. Cook said in an open letter that the U.S. government order would undermine encryption and potentially create a “master key, capable of opening hundreds of millions of locks” on private devices.
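In broad strokes, this kind of passcode-tied encryption works by deriving the data-protection key from the user’s passcode together with a device-unique secret, which is why the passcode is the only way in. The following Python sketch is a simplified, hypothetical illustration: it uses PBKDF2 as a stand-in for Apple’s actual key derivation, which is entangled with a hardware UID key that never leaves the chip and so cannot be run off-device at all:

```python
import hashlib

def derive_key(passcode: str, device_secret: bytes, iterations: int = 100_000) -> bytes:
    """Derive a 256-bit key from a passcode and a device-unique secret.

    Simplified stand-in for the real scheme: on an iPhone the
    derivation runs inside the hardware, so an attacker cannot
    extract the secret and guess passcodes on a fast external machine.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_secret, iterations)

# The same passcode on a different device yields a different key, and
# without the device secret the key is unrecoverable from the passcode alone.
```

The iteration count also deliberately slows down each guess, so even a short passcode costs an attacker real time per attempt.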

Cook wrote that “in the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession… The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a back door. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”

On Wednesday, Cook’s position received support from a high-profile colleague in tech.

“Forcing companies to enable hacking could compromise users’ privacy,” wrote Google CEO Sundar Pichai in a series of Twitter posts. “We know that law enforcement and intelligence agencies face significant challenges in protecting the public against crime and terrorism. We build secure products to keep your information safe and we give law enforcement access to data based on valid legal orders. But that’s wholly different than requiring companies to enable hacking of customer devices & data. Could be a troubling precedent. Looking forward to a thoughtful and open discussion on this important issue.”

 

Apple Unlocked iPhones for the Feds 70 Times Before


Apple CEO Tim Cook declared on Wednesday that his company wouldn’t comply with a government search warrant to unlock an iPhone used by one of the San Bernardino killers, a significant escalation in a long-running debate between technology companies and the government over access to people’s electronically-stored private information.

But in a similar case in New York last year, Apple acknowledged that it could extract such data if it wanted to. And according to prosecutors in that case, Apple has unlocked phones for authorities at least 70 times since 2008. (Apple doesn’t dispute this figure.)

In other words, Apple’s stance in the San Bernardino case may not be quite the principled defense that Cook claims it is. In fact, it may have as much to do with public relations as it does with warding off what Cook called “an unprecedented step which threatens the security of our customers.”

For its part, the government’s public position isn’t clear cut, either. U.S. officials insist that they cannot get past a security feature on the shooter’s iPhone that locks out anyone who doesn’t know its unique password—which even Apple doesn’t have. But in that New York case, a government attorney acknowledged that one U.S. law enforcement agency has already developed the technology to crack at least some iPhones, without the assistance from Apple that officials are demanding now.

The facts in the New York case, which involve a self-confessed methamphetamine dealer and not a notorious terrorist, tend to undermine some of the core claims being made by both Apple and the government in a dispute with profound implications for privacy and criminal investigations beyond the San Bernardino case.

In New York, as in California, Apple is refusing to bypass the passcode feature now found on many iPhones.

But in a legal brief, Apple acknowledged that the phone in the meth case was running version 7 of the iPhone operating system, which means the company can access it. “For these devices, Apple has the technical ability to extract certain categories of unencrypted data from a passcode locked iOS device,” the company said in a court brief.

Whether the extraction would be successful depended on whether the phone was “in good working order,” Apple said, noting that the company hadn’t inspected the phone yet. But as a general matter, yes, Apple could crack the iPhone for the government. And, two technical experts told The Daily Beast, the company could do so with the phone used by deceased San Bernardino shooter, Syed Rizwan Farook, a model 5C. It was running version 9 of the operating system.

Still, Apple argued in the New York case, it shouldn’t have to, because “forcing Apple to extract data… absent clear legal authority to do so, could threaten the trust between Apple and its customers and substantially tarnish the Apple brand,” the company said, putting forth an argument that didn’t explain why it was willing to comply with court orders in other cases.

“This reputational harm could have a longer term economic impact beyond the mere cost of performing the single extraction at issue,” Apple said.

Apple’s argument in New York struck one former NSA lawyer as a telling admission: that its business reputation is now an essential factor in deciding whether to hand over customer information.

“I think Apple did itself a huge disservice,” Susan Hennessey, who was an attorney in the Office of the General Counsel at the NSA, told The Daily Beast. The company acknowledged that it had the technical capacity to unlock the phone, but “objected anyway on reputational grounds,” Hennessey said. Its arguments were at odds with each other, especially in light of Apple’s previous compliance with so many court orders.

It wasn’t until after the revelations of former NSA contractor Edward Snowden that Apple began to position itself so forcefully as a guardian of privacy protection in the face of a vast government surveillance apparatus. Perhaps Apple was taken aback by the scale of NSA spying that Snowden revealed. Or perhaps it was embarrassed by its own role in it. Since 2012, the company had been providing its customers’ information to the FBI and the NSA via the PRISM program, which operated pursuant to court orders.

Apple has also argued, then and now, that the government is overstepping the authority of the All Writs Act, an 18th-century statute that it claims forces Apple to conduct court-ordered iPhone searches. That’s where the “clear legal authority” question comes into play.

But that, too, is a subjective question which will have to be decided by higher courts. For now, Apple is resisting the government on multiple grounds, and putting its reputation as a bastion of consumer protection front and center in the fight.

None of this has stopped the government from trying to crack the iPhone, a fact that emerged unexpectedly in the New York case. In a brief exchange with attorneys during a hearing in October, Judge James Orenstein said he’d found testimony in another case that the Homeland Security Department “is in possession of technology that would allow its forensic technicians to override the pass codes security feature on the subject iPhone and obtain the data contained therein.”

That revelation, which went unreported in the press at the time, seemed to undercut the government’s central argument that it needed Apple to unlock a protected iPhone.

“Even if [Homeland Security] agents did not have the defendant’s pass code, they would nevertheless have been able to obtain the records stored in the subject iPhone using specialized software,” the judge said. “Once the device is unlocked, all records in it can be accessed and copied.”

A government attorney affirmed that he was aware of the tool. However, it applied only to one update of version 8 of the iPhone operating system—specifically, 8.1.2. The government couldn’t unlock all iPhones, but just phones with that software running.

Still, it made the judge question whether other government agencies weren’t also trying to break the iPhone’s supposedly unbreakable protections. And if so, why should he order the company to help?

There was, the judge told the government lawyer, “the possibility that on the intel side, the government has this capability. I would be surprised if you would say it in open court one way or the other.”

Orenstein was referring to the intelligence agencies, such as the NSA, which develop tools and techniques to hack popular operating systems, and have been particularly interested for years in trying to get into Apple products, according to documents leaked by Snowden.

There was no further explanation of how Homeland Security developed the tool, and whether it was widely used. A department spokesperson declined to comment “on specific law enforcement techniques.” But the case had nevertheless demonstrated that, at least in some cases, the government can, and has, managed to get around the very wall that it now claims impedes lawful criminal investigations.

The showdown between Apple and the FBI will almost certainly not be settled soon. The company is expected to file new legal briefs within days. And the question of whether the All Writs Act applies in such cases is destined for an appeals court decision, legal experts have said.

But for the moment, it appears that the only thing certainly standing in the way of Apple complying with the government is its decision not to. And for its part, the government must be presumed to be searching for new ways to get the information it wants.

Technically, Apple probably can find a way to extract the information that the government wants from the San Bernardino shooter’s phone, Christopher Soghoian, the principal technologist for the American Civil Liberties Union, told The Daily Beast.

“The question is, does the law give the government the ability to force Apple to create new code?” he said. “Engineers have to sit down and create something that doesn’t exist” in order to meet the government’s demands. Soghoian noted that this would only be possible in the San Bernardino case because the shooter was using an iPhone model 5C, and that newer hardware versions would be much harder for Apple to bypass.

But even that’s in dispute, according to another expert’s analysis. Dan Guido, a self-described hacker and CEO of the cybersecurity company Trail of Bits, said that Apple can, in fact, eliminate the protections that keep law enforcement authorities from trying to break into the iPhone with a so-called brute force attack, using a computer to make millions of password guesses in a short period of time. New iPhones have a feature that stops users from making repeated incorrect guesses and can trigger a kind of self-destruct mechanism, erasing all the phone’s contents, after too many failed attempts.
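The mechanism described here, a brute-force search racing against an auto-erase counter, can be modeled with a toy Python simulation. The class names and parameters below are illustrative only, not Apple’s actual values:

```python
class ToyPhone:
    """Toy model of a passcode lock with an auto-erase attempt limit."""

    def __init__(self, passcode: str, max_attempts: int = 10):
        self._passcode = passcode
        self.attempts_left = max_attempts
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False
        if guess == self._passcode:
            return True
        self.attempts_left -= 1
        if self.attempts_left == 0:
            self.wiped = True  # self-destruct: contents erased
        return False

def brute_force(phone: ToyPhone, digits: int = 4):
    """Try every n-digit passcode until the phone unlocks or wipes."""
    for n in range(10 ** digits):
        guess = str(n).zfill(digits)
        if phone.try_unlock(guess):
            return guess
        if phone.wiped:
            return None
    return None
```

A four-digit space is only 10,000 codes, which a computer exhausts instantly; it is the attempt limit, not the key space, that stops the attack, and that limit is exactly the protection Guido argues Apple could disable.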

In a detailed blog post, Guido described how Apple could work around its own protections and effectively disarm the security protections. It wouldn’t be trivial. But it’s feasible, he said, even for the newest versions of the iPhone, which, unlike the ones in the New York and San Bernardino cases, Apple swears it cannot crack.

“The burden placed on Apple will be greater… but it will not be impossible,” Guido told The Daily Beast.

 

Apple IOS Forensic Primer

The operating system that Apple licenses to its users is iOS. It resides and runs on the company’s mobile devices (the iPod touch, iPhone, and iPad). Legally, Apple specifically states that it retains ownership of iOS. The US DOJ is arguing a legal precedent that would hold Apple to its continued ownership interest in iOS. This means the company could potentially be subpoenaed to assist law enforcement in exploiting software on a target phone (which runs iOS) in the execution of a search warrant.

While authorities wait for a decision on this particular legal argument, iOS forensics is necessary whenever an Apple device has been used in, or found to be evidence in, a crime. While the DOJ argues the precedent that “a product’s continued ownership interest in a product after it is sold obliges the company to act as an agent of the state,” an administrator needs to be able to pull data off a device immediately during the conduct of an investigation. Even if an administrator is just trying to see whether a user is violating (or has violated) company policy, there is a need to access the data on the device.

There is a lot of data stored on iPhones; some people have more data on their iPhone than on their computers. If you browse the phone’s storage (typically with a phone-disk tool) you will not be able to see the full file system, but if you could, it bears a strong resemblance to the Mac’s. Mac OS X is built on a core called Darwin, and the iPhone has all of the directory structure that the Mac operating system has.

For example, the maximum number of allocation blocks per volume that the File Manager can access on a classic Mac OS system is 65,535 (a 16-bit limit). iOS is basically a Mac OS-derived system that has been tuned and tailored to operate on smaller mobile devices, which have different processors in them.

As we examine the directories and analyze their subdirectories, we see what is available as we dig down inside the device. The DCIM directory holds the 100APPLE directory, which shows the administrator where all of the pictures are. We also have a Downloads directory (which holds all downloads), an iTunes directory (which holds all MP3 files), and so on. The significance is that all of these directories give you the ability to see user data on a particular system.

Another place to look for system information is a terminal window, which gives an administrator a command line interface for examining the device and its data. Complete access can be obtained when the “sudo” superuser command is invoked. You would type:

$ sudo su clyde (switch to the user clyde with superuser privileges)

$ cd (change to the home directory)

$ pwd (print the working directory)

/Users/clyde (output: our current directory)

The terminal window gives us the ability to examine the data inside the device as a super user (which gives us complete access to the system). When we look inside a device as a super user we know we will have the ability to access all additional files in the system. Instead of looking at the phone itself with different tools, you can analyze the system through a terminal command line.

$ cd Library/ (change directory to Library)
$ cd "Application Support/" (change directory to the Application Support directory)

$ ls (list the contents of the directory, looking for the MobileSync directory)

An administrator can examine and analyze the device’s “mobile sync” in relation to the computer the device has been syncing with.

$ cd MobileSync (change directory to the MobileSync directory)

$ ls (list the contents of the directory)

Backup (output: the contents of the directory)

$ cd Backup (change directory to the Backup directory)

$ ls (list all of the backups in the Backup directory)

This is significant because, in addition to examining the device data, I can pull up all of the backups and select one. There is a lot of data stored in the backups; these files are simply the backup information that has been stored on the computer’s hard drive. When the connected device (whether an iPad or an iPhone) has its data copied onto the computer, an administrator can analyze the data in the backup in addition to looking at the directory on the phone itself with a utility like Phone Disk. If you don’t have the phone but you do have the computer, you may have almost as good a set of information as if you had the phone, because the backup must store everything needed to restore the phone.

If you have a user’s computer and you find the iPhone backups, you have the information that was stored on the phone. There are utilities that can analyze these iPhone backups and extract information from them, giving an administrator the ability to examine all of the data captured in the scheduled backups.
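As a sketch of how such a utility might begin, the Python snippet below walks the MobileSync backup directory described above and reads each backup’s Info.plist to identify the device it came from. The function name is hypothetical, and the exact layout can vary across iTunes versions, so treat this as a starting point:

```python
import os
import plistlib

def list_backups(backup_root):
    """Return {folder_name: device_name} for each backup folder found.

    backup_root is typically ~/Library/Application Support/MobileSync/Backup
    on a Mac. Each backup folder contains an Info.plist describing the
    device that was synced.
    """
    backups = {}
    for entry in sorted(os.listdir(backup_root)):
        info_path = os.path.join(backup_root, entry, "Info.plist")
        if not os.path.isfile(info_path):
            continue  # skip anything that isn't a backup folder
        with open(info_path, "rb") as f:
            info = plistlib.load(f)
        backups[entry] = info.get("Device Name", "unknown")
    return backups
```

From there, a real analysis tool would parse the individual backup files themselves (contacts, messages, photos), which is where the dedicated backup-extraction utilities mentioned above come in.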

When you are performing iOS forensics, examining the phone directly is not the only option, because sometimes an administrator won’t be able to obtain access to the data if the phone has a passcode. However, if you have access to the backup directory on the computer the phone syncs with, you may have a better chance of getting the data by performing your forensic analysis on the computer where the backups are stored. This is what keeps a passcode from completely thwarting administrators and law enforcement from performing a forensic analysis.

Read more: Apple IOS Forensic Primer http://www.sooperarticles.com/technology-articles/mobile-computing-articles/apple-ios-forensic-primer-1453263.html#ixzz40dsmaebc

JOHN MCAFEE: I’ll decrypt the San Bernardino phone free of charge so Apple doesn’t need to place a back door on its product

Cybersecurity expert John McAfee is running for president in the US as a member of the Libertarian Party. This is an op-ed article he wrote and gave us permission to run.

Using an obscure law, written in 1789 — the All Writs Act — the US government has ordered Apple to place a back door into its iOS software so the FBI can decrypt information on an iPhone used by one of the San Bernardino shooters.

It has finally come to this. After years of arguments by virtually every industry specialist that back doors will be a bigger boon to hackers and to our nation’s enemies than publishing our nuclear codes and giving the keys to all of our military weapons to the Russians and the Chinese, our government has chosen, once again, not to listen to the minds that have created the glue that holds this world together.

This is a black day and the beginning of the end of the US as a world power. The government has ordered a disarmament of our already ancient cybersecurity and cyberdefense systems, and it is asking us to take a walk into that near horizon where cyberwar is unquestionably waiting, with nothing more than harsh words as a weapon and the hope that our enemies will take pity at our unarmed condition and treat us fairly.

Any student of world history will tell you that this is a dream. Would Hitler have stopped invading Poland if the Polish people had sweetly asked him not to do so? Those who think yes should stand strongly by Hillary Clinton’s side, whose cybersecurity platform includes negotiating with the Chinese so they will no longer launch cyberattacks against us.

The FBI, in a laughable and bizarre twist of logic, said the back door would be used only once and only in the San Bernardino case.

Tim Cook, CEO of Apple, replied:

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.


No matter how you slice this pie, if the government succeeds in getting this back door, it will eventually get a back door into all encryption, and our world, as we know it, is over. In spite of the FBI’s claim that it would protect the back door, we all know that’s impossible. There are bad apples everywhere, and there only needs to be one in the US government. Then a few million dollars, some beautiful women (or men), and a yacht trip to the Caribbean might be all it takes for our enemies to have full access to our secrets.

Cook said:

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

The fundamental question is this: Why can’t the FBI crack the encryption on its own? It has the full resources of the best the US government can provide.

With all due respect to Tim Cook and Apple, I work with a team of the best hackers on the planet. These hackers attend Defcon in Las Vegas, and they are legends in their local hacking groups, such as HackMiami. They are all prodigies, with talents that defy normal human comprehension. About 75% are social engineers. The remainder are hardcore coders. I would eat my shoe on the Neil Cavuto show if we could not break the encryption on the San Bernardino phone. This is a pure and simple fact.

And why do the best hackers on the planet not work for the FBI? Because the FBI will not hire anyone with a 24-inch purple mohawk, 10-gauge ear piercings, and a tattooed face who demands to smoke weed while working and won’t work for less than a half-million dollars a year. But you bet your ass that the Chinese and Russians are hiring similar people with similar demands and have been for many years. It’s why we are decades behind in the cyber race.


Cyberscience is not just something you can learn. It is an innate talent. The Juilliard School of Music cannot create a Mozart. A Mozart or a Bach, much like our modern hacking community, is genetically created. A room full of Stanford computer science graduates cannot compete with a true hacker without even a high-school education.

So here is my offer to the FBI. I will, free of charge, decrypt the information on the San Bernardino phone, with my team. We will primarily use social engineering, and it will take us three weeks. If you accept my offer, then you will not need to ask Apple to place a back door in its product, which will be the beginning of the end of America.

If you doubt my credentials, Google “cybersecurity legend” and see whose name is the only name that appears in the first 10 results out of more than a quarter of a million.

Edward Snowden defends Apple in fight against FBI

Edward Snowden — the ex-NSA contractor who started this whole privacy debate — has joined the ranks of Apple defenders.

On Tuesday, a federal magistrate judge ruled that Apple must help the FBI break into the phone of one of the San Bernardino shooters. The FBI was unable to figure out the shooter’s passcode, which is the only way to get inside his iPhone.

Apple CEO Tim Cook is furious, saying that the U.S. government is trying to undermine the security of its flagship product.

“The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers,” Cook said.

Apple plans to fight the decision, aided by the ACLU.

On Wednesday, the divide was clear: politicians versus engineers.

“The FBI is creating a world where citizens rely on Apple to defend their rights, rather than the other way around,” Snowden said Wednesday morning on Twitter.

Late Wednesday, Silicon Valley’s powerful tech industry trade group came out in support of Apple too.

“We worry about the broader implications … of requiring technology companies to cooperate with governments to disable security features, or introduce security vulnerabilities,” said the Information Technology Industry Council, which represents Dell, Facebook, Google, Hewlett Packard, IBM, Microsoft, Nokia and others.

For years, the FBI has demanded special access into smartphones. Tech companies have refused, instead increasing the security of their customers’ data.

Cryptographers, the scholars who build security into technology, have unanimously warned that special access is a dangerous idea. To them, this isn’t about security competing with privacy. It’s just about security.

The San Bernardino shooter, Syed Farook, used an iPhone 5C. The FBI has been trying to guess his passcode to unlock it. If they guess wrong 10 times, Farook’s iPhone will permanently erase all the data stored inside.
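The erase-after-10-failures behavior described above can be modeled as a tiny state machine. The following is purely an illustrative sketch of the policy, not Apple’s actual implementation:

```python
# Illustrative model of an erase-after-10-failures passcode policy.
# This sketches the behavior described above; it is NOT Apple's code.
MAX_ATTEMPTS = 10

class PasscodeLock:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self.erased = False  # True once the device has wiped itself

    def try_unlock(self, guess: str) -> bool:
        if self.erased:
            return False  # data is gone; no guess can succeed
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.erased = True  # the 10th wrong guess triggers the wipe
        return False

# A naive brute force trips the wipe long before reaching "4821".
lock = PasscodeLock("4821")
for i in range(10):
    lock.try_unlock(f"{i:04d}")
print(lock.erased)  # → True
```

This is exactly why the FBI cannot simply guess passcodes: ten wrong tries and the evidence destroys itself.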

Apple doesn’t hold the keys to his device. But the FBI wants Apple to create a special version of its iOS software that will get loaded onto the phone, circumvent Apple’s security features and let agents hack it.

Dan Guido, who runs the cybersecurity firm Trail of Bits, explained in a blog post Wednesday that this hack is possible. He said it would work on any iPhone 5C or older model, putting them “at risk when they’re confiscated by law enforcement around the world.”
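Guido’s point can be made concrete with back-of-the-envelope arithmetic. His analysis puts the hardware key-derivation cost at roughly 80 ms per passcode guess on this model; assuming that figure, and assuming the requested software strips the retry delays and auto-erase, a worst-case brute force looks like this:

```python
# Worst-case brute-force time once software limits are removed,
# leaving only the hardware key-derivation cost per guess.
# The ~80 ms/guess figure is an assumption taken from Trail of Bits'
# analysis of the iPhone 5C, not a value measured here.
ATTEMPT_SECONDS = 0.08

def worst_case_hours(digits: int) -> float:
    """Hours needed to try every numeric passcode of the given length."""
    return (10 ** digits) * ATTEMPT_SECONDS / 3600

print(f"4-digit passcode: {worst_case_hours(4):.2f} hours")  # ≈ 0.22
print(f"6-digit passcode: {worst_case_hours(6):.1f} hours")  # ≈ 22.2
```

With the software protections gone, a short numeric passcode falls in minutes to hours, which is why those protections, not the passcode itself, carry most of the security.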

Last year, the world’s top cryptographers issued a joint paper saying this is a bad idea. CNNMoney asked them if this particular San Bernardino case changes their mind. All seven who responded said no.

Matthew Green, who teaches cryptography and computer security at Johns Hopkins University, fears it’s a slippery slope. If Apple complies with the government this time, it’ll be forced to in the future.

“I haven’t seen any guiding principle that would prevent this from getting out of hand. It could easily result in every American becoming less secure,” he said.

Columbia University computer science professor Steven M. Bellovin said that if Apple doesn’t resist the FBI, it’ll soon face the same pressure from authoritarian and repressive governments like China.

“This makes it much easier for others — other police departments, other governments — to demand the same thing,” he said.

Bruce Schneier, one of the world’s top cryptographers, warned that criminals could also use this kind of special access to break into people’s phones to steal messages, photographs and other personal information. If Apple creates a weaker version of its operating system, others will get their hands on it.

Most tech industry executives — who normally tout privacy — remained silent Wednesday. WhatsApp cofounder Jan Koum stood out with this message on Facebook: “We must not allow this dangerous precedent to be set.”

U.S. Senator Ron Wyden of Oregon, one of the few politicians to rise to Apple’s defense, said “no company should be forced to deliberately weaken its products.”


Other politicians pushed back on that idea Wednesday. White House Press Secretary Josh Earnest told reporters that the FBI is “not asking Apple to redesign its product or create a new backdoor to one of their products. They’re simply asking for something that would have an impact on this one device.”

Leading Republican presidential candidate Donald Trump weighed in too, saying, “we have to open it up.” Marco Rubio, who is also vying for the Republican presidential nomination, said Apple should give up its fight and be “a good corporate citizen.”

But even those who support the FBI’s demands say it’s a point of no return. Cyrus Walker teaches at the government-funded Cyber Defense Analysis Center, where he trains federal agents and police how to hack smartphones in criminal cases.

“If Apple demonstrates the ability to get around its own security countermeasures, that bell is rung and can’t be un-rung,” said Walker.

Google CEO Sides With Apple And Tim Cook, Opposes FBI’s Demand For iPhone Backdoor

Google’s CEO Sundar Pichai has joined a number of other high-profile individuals in expressing his opinion on the FBI’s request for Apple to provide backdoor access to an iPhone 5c that forms part of the San Bernardino shooting case. A federal judge has ruled that Apple must indeed assist law enforcement in gaining access to a seized iPhone 5c that belonged to one of the shooters accused of killing 14 individuals in California. Commenting on the situation on social media, Sundar Pichai called it a “troubling precedent”.

If you weren’t privy to the whole situation, then it’s probably worth noting that Apple’s CEO Tim Cook almost instantly responded to the ruling with a public and open message to Apple’s customers. In addition to providing a little insight into the ruling and how it came about, Cook also took the opportunity to inform the customers that Apple would be contesting the ruling, claiming that the FBI essentially wants Apple’s engineers to create a new version of iOS that comes with the ability to circumvent very specific security features (read: backdoor access). Cook clearly doesn’t want to have to build in a backdoor to the iPhone or iPad.



Google’s CEO didn’t instantly get involved in the situation, but has since posted a series of tweets which show that he sides with Tim Cook and Apple as a whole. Most notably, Pichai’s five tweets on the predicament argued that Apple’s acceptance of the ruling “could compromise a user’s privacy”. He also stated publicly that accepting a ruling to provide access to data on the basis of a valid legal order is “wholly different than requiring companies to enable hacking of customer devices & data”. It’s difficult to disagree with those views.

Of course, not everyone weighing in with an opinion on the San Bernardino iPhone situation is fully accepting of Apple’s stance on the ruling. Republican candidate, and general worldwide laughing stock, Donald Trump predictably doesn’t agree with Tim Cook’s decision to resist the order, stating that he agrees “100 percent with the courts” and asking of Apple, “Who do they think they are?”.

We’re pretty sure that the public backing of a fellow CEO in the position of Pichai carries a whole lot more importance than the negativity of Mr. Trump.


The FBI vs. Apple – thoughts and comments

Not trying to provide the full story here, just a few thoughts and directions as to security, privacy and civil rights. (For the backdrop, Apple’s Tim Cook letter explains it best: https://www.apple.com/customer-letter/)

From a technical perspective, Apple is fully capable of alleviating a lot of the barriers the FBI is currently facing with unlocking the phone (evidence) in question. It is an iPhone 5C, which does not have the enhanced security features implemented in iPhones from the 5S onward (the secure enclave – see Dan Guido’s technical writeup here: http://blog.trailofbits.com/2016/02/17/apple-can-comply-with-the-fbi-court-order/).

Additionally, when dealing with more modern versions, it is also feasible for Apple to provide updates to the secure enclave firmware without erasing the content of the phone.

But from a legal perspective we are facing not only a slippery slope, but a cliff, as someone eloquently noted on Twitter. Abiding by a legal claim based on an archaic law (the All Writs Act, originally part of the Judiciary Act of 1789), coupled with a just-as-shaky probable cause claim, basically opens the door for further requests that will build on the precedent set here if Apple complies with the court’s order.
One can easily imagine how “national security” (see how well that worked out in the PATRIOT Act) will be used to trump civil rights and provide access to anyone’s private information.

We have finally reached a time where technology, which was an easy crutch for law enforcement to rely on, is no longer there to enable spying (legal or otherwise) on citizens. We are back to a time where actual hard work needs to be done in order to act on suspicions and real investigations have to take place. Where HUMINT is back on the table, and law enforcement (and non-LE forces) have to step up their game and, again, do proper investigative work.

Security is obviously a passion for me, and supporting (and sometimes helping) its advance in order to provide everyone with privacy and comfort has been my ethic for as long as I can remember dealing with technology, security, and privacy. So is national security and the pursuit of anything that threatens it, and I don’t need to show any credentials for either.

This is an interesting case, where these two allegedly face each other. But it’s clear cut from where I’m standing. I’ve said it before, and I’ll say it again: Tim Cook and Apple drew a line in the sand. A very clear line. It is a critical time now to understand which side of the line everybody stands on. Smaller companies that lack Apple’s legal and market force, which have so far bent to similar “requests” from the government, can find solace in a market leader drawing such a clear line. Large companies (I’m looking at you, Google!) should also make their stand very clear – to support that line. Crossing that line means taking a step further toward becoming one of the regimes we protect ourselves from. Dark and dangerous ones, who do not value life, and who treat people differently based on their social, financial, racial, gender, or religious standing. That’s not where, or who, we want to be.

Or at least I’d like to think so.

FBI wants $38 million in new funding to break encryption

The funding bid will help the agency “develop and acquire tools” that break encryption.


The FBI is looking to spend an additional $38.3 million in the coming year to “counter the threat” of encryption.

That’s on top of $31 million already spent on the initiative, according to the agency’s fiscal 2017 budget request published earlier this week by the Justice Department.

The budget request will not fund any new hires beyond the existing 39 staffers (including 11 agents), but will be used to “develop and acquire tools for electronic device analysis, cryptanalytic capability, and forensic tools.”

In other words: the feds want access to your encrypted communications, and they’re willing to throw money at getting exactly that.

According to the document, the additional funding will “counter the threat of Going Dark, which includes the inability to access data because of challenges related to encryption, mobility, anonymization, and more.”




The FBI refers to “going dark” as a metaphor for not being able to read the communications and messages of suspected criminals and terrorists.

The FBI did not immediately respond to a request for comment asking what exactly the combined $69.3 million on anti-encryption efforts would entail.

The FBI is known to buy exploits from private intelligence companies, like the Milan, Italy-based Hacking Team, which last year was hit by hackers who leaked documents detailing the company’s work and global government partners.

Encryption and other privacy tools are increasingly troublesome for the agency, something FBI director James Comey has repeatedly claimed over the past year.


The agency chief has been on a tear trying to convince lawmakers and technology giants alike that locking the agency out is making it harder to catch criminals, despite reports suggesting the complete opposite.

Comey’s anti-encryption rhetoric intensified after Apple rolled out encryption in its iPhones and iPads with iOS 8, thought to be a response to documents leaked by whistleblower Edward Snowden that named Apple as a participant in the notorious PRISM surveillance program. In doing so, Apple put encryption in the hands of its users, cutting even itself out of the loop – which riled the FBI, which had regularly asked for the company’s help in unlocking criminals’ phones.

The bump in funding comes as the agency continues to realign its efforts to keep ahead of the technological curve.

The document also said the agency would spend an additional $85.1 million on its cyber offensive and defensive operation.

“The FBI will obtain updated and sophisticated IT hardware, IT software, and contractors to expand the foundation of its offensive and defensive operations,” the report said.