Category Archives: eDiscovery

John McAfee Reveals To FBI, On National TV, How To Crack The iPhone (RT Interview)

Yes, it has gotten this bad. In language simple enough for even a child to understand, John McAfee explains for the world and for the FBI how to hack…

Not as easy as John says, but it can be done!

Actually, about that encryption: what is the key? The salt of the key depends on a unique device ID. Another part of the key must depend either on the fingerprint ID (which is easy enough to obtain; you don’t need the owner alive to get his fingerprints, and people leave fingerprints everywhere) or on a 4-digit PIN. Once you have code injection, can disable the retry counter, and have a direct path to feed candidate PINs into the key-generation algorithm, you can brute-force all 10,000 combinations in a matter of minutes.
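To make that brute-force claim concrete, here is a minimal Python sketch. It assumes, purely hypothetically, that the key is derived with PBKDF2 from the PIN salted with the device UID and that a correct key can be recognized (say, by decrypting a known block); Apple's real key derivation is hardware-bound and more involved.

import hashlib

DEVICE_UID = b"\x00" * 16  # placeholder for the per-device salt (hypothetical)

def derive_key(pin: str) -> bytes:
    # One PBKDF2 pass per candidate PIN; a real scheme adds hardware-bound steps.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, 10_000)

def crack_pin(is_correct) -> str:
    # With the retry counter patched out, all 10,000 PINs can be tried directly.
    for n in range(10_000):
        pin = f"{n:04d}"
        if is_correct(derive_key(pin)):
            return pin
    return ""

At a few milliseconds per PBKDF2 pass, 10,000 candidates take well under a minute, which is why the retry counter, not the math, is the real barrier.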

To encrypt or not to encrypt?

Overview of and comments on backdoors, frontdoors and the debate around them

To me, privacy means that I can decide to keep my data private, and neither an NSA or other government agent, nor a Facebook/Dropbox/Google employee, can see what is in there if I don’t want them to. This concept of privacy is not compatible with any kind of ‘doors’ to user data, whether front, back or other.

Since UK Prime Minister David Cameron suggested banning encryption earlier this year, policy debates have intensified in the US, EU, UK and elsewhere about back- or frontdoors built into encryption systems. Certain parties argue that they need front- or backdoors into tech companies’ data to prevent and fight criminal activity. But both backdoors and frontdoors violate end-user privacy, and as if that weren’t reason enough against these doors, they also undermine the world’s overall cybersecurity. Let’s see why.

What are frontdoors and backdoors? – definition and examples

First, let’s get two expressions straight: what is a backdoor and what is a frontdoor.

A backdoor is a covert way to give an entity a higher level of access to a system than it should normally have. A backdoor is usually disguised as a random security bug, but instead of being an accidental mistake, it is planted intentionally. The key thing is that a backdoor is hidden, even from the system operator, which makes it uncontrollable and hence dangerous: someone who is not supposed to have access can exploit it.
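As a toy illustration of how small and innocuous a planted backdoor can look, consider this hypothetical login check; everything here is invented for illustration, not taken from any real system.

import hmac

_MASTER = "s3cr3t-override"  # hidden credential, known only to whoever planted it

def check_password(supplied: str, stored: str) -> bool:
    if hmac.compare_digest(supplied, _MASTER):  # the covert path: always accepted
        return True
    return hmac.compare_digest(supplied, stored)  # the documented path

A code reviewer skimming past the first comparison would see a perfectly ordinary password check, which is exactly why hidden doors are so hard to audit away.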

A frontdoor is also a way to give higher access to a system, but in a way that is known to the participants, or at least to the system operator. It is also assured that only the intended entity can use the frontdoor. It is like a maid’s master key in a hotel.

Snowden uncovered several secret operations of the NSA, and that started the current debate on encryption backdoors and frontdoors. NSA director Michael S. Rogers argues for “front doors with a big lock”, meaning that in case of an investigation, the FBI or other authorities should have a legal and technical way to access encrypted content. The Washington Post created a graphic about the proposal: basically, the White House is considering two platforms, one where the authorities can recover encrypted data using a key escrow, and another where the recovery key is split between the platform vendor and the authority. In my view, neither of the proposed options is a sufficient solution, especially because they do not guarantee non-US citizens that they are not being monitored by (for them) a foreign government.
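As a rough illustration of the second option, here is a minimal sketch of a split recovery key using simple XOR secret sharing; this is my own toy construction to show the idea, not the mechanism in the actual proposal.

import secrets

def split_key(key: bytes):
    # Neither share alone reveals anything about the key.
    vendor_share = secrets.token_bytes(len(key))
    authority_share = bytes(a ^ b for a, b in zip(key, vendor_share))
    return vendor_share, authority_share

def recover_key(vendor_share: bytes, authority_share: bytes) -> bytes:
    # Only the two shares together reconstruct the recovery key.
    return bytes(a ^ b for a, b in zip(vendor_share, authority_share))

key = secrets.token_bytes(32)  # the device's recovery key
vendor, authority = split_key(key)
assert recover_key(vendor, authority) == key

The cryptography is the easy part; the hard part, as the rest of this post argues, is that both shareholders must be trusted and unhackable forever.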

If you are new to the security industry, this debate might sound new, but the NSA has a long track record: Der Spiegel reported in 1996 that Crypto AG, a Swiss national pride, had placed backdoors in its renowned crypto machines under pressure from the NSA. Another, more recent example is the SP 800-90A standard proposal: researchers suspected that the NSA had included a backdoor in one of the newly standardized pseudorandom generators (namely, Dual_EC_DRBG). This backdoor could have enabled the NSA to monitor anybody, regardless of their citizenship or whether they were using a strong encryption algorithm.

Also, we should not forget to mention the Gemalto SIM encryption key database hack: a joint effort by the NSA and the British GCHQ. To understand why this action is controversial, we need to understand how the GSM (and 3G/4G) network works. The SIM card stores a symmetric encryption key, which is used to encrypt the traffic over the air. Due to the nature of symmetric ciphers, the same key needs to be used by the GSM core network to decrypt the content. For that reason, the SIM keys are stored in a secure, central database called the Home Location Register, or HLR. HLRs are under the jurisdiction of the local authority, which means the NSA already had a measure of control over domestic encryption keys by default. But then why did they need to hack a respected vendor? Because it enabled them to get any user’s data without leaving a single mark. The former was actually admitted by General Keith Alexander in his keynote speech at the Black Hat conference in 2013, while he denied any covert domestic content monitoring.
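To see why holding the SIM key is enough, here is a minimal sketch of a simplified GSM-like challenge-response, with an illustrative HMAC standing in for the real A3/A8 SIM algorithms: anyone who holds a copy of Ki can derive the same session key as the network.

import hashlib
import hmac
import secrets

def derive_session_key(ki: bytes, challenge: bytes) -> bytes:
    # Illustrative stand-in for the real A3/A8 SIM algorithms.
    return hmac.new(ki, challenge, hashlib.sha256).digest()[:16]

ki = secrets.token_bytes(16)    # stored both on the SIM and in the HLR
rand = secrets.token_bytes(16)  # challenge the network sends over the air

sim_key = derive_session_key(ki, rand)       # computed by the SIM
network_key = derive_session_key(ki, rand)   # computed by the core network
thief_key = derive_session_key(ki, rand)     # computed by anyone who stole Ki
assert sim_key == network_key == thief_key

Since the challenge is broadcast in the clear, stealing the key database is equivalent to a permanent, invisible wiretap on every affected SIM.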

These things all undermine the credibility of intelligence agencies and, in general, trigger sometimes unfounded suspicion. Not all secret operations are necessarily evil: after DES was introduced by IBM in the early 1970s (it later became the predominant block cipher in the industry), the NSA tweaked its structure. The NSA lowered the key size and changed its deep structure (the S-boxes) without explanation. Many believed the NSA had planted backdoors, but it turned out later that the changes actually increased the security of DES: the NSA had already discovered possible attacks and prepared against them.

All in all, backdoors in crypto systems are not recent inventions; we have seen several suspicious activities by government agencies throughout the past decades. Let me explain why backdoors and frontdoors are bad.

Why are backdoors and frontdoors bad? – the objective technical reasons

It’s not only ethical, philosophical and political problems that are involved with backdoors and frontdoors. There are also several technical reasons why it is extremely difficult to provide exceptional governmental access without making the whole Internet insecure. A recent MIT report by respected security scientists mentions quite a few challenges that a general governmental “frontdoor” would have to face. They state that introducing any frontdoor to encryption systems in a rush, without proper specification and proper system design, could lead to disaster.

Our world increasingly relies on a trustworthy connection through the Internet: individuals and businesses bank online, companies transfer crucial business data through the network, governments communicate with their citizens online, and so on. Due to this high economic dependence, we need to protect whatever goes through it. The Internet could become so widespread because it adapted to the arising security challenges step by step. Frontdoors and backdoors would undermine its security and nullify the work that has been done so far.

There are four main technical issues with backdoors and frontdoors:
1. New protocols: The installation of frontdoors and backdoors requires completely new security protocols, new research and development.
2. Non-immune governmental agencies: Government agencies are not immune to attacks. Imagine the risk of a terrorist hacking a government agency and gaining access to all data about the US population.
3. National governments versus global citizens: In our globalized world, who would decide which government gets the frontdoor?
4. High costs and uncertain results: A system that provides governmental frontdoors is complex and expensive. Who would foot the bill?

1. New protocols: The installation of frontdoors and backdoors requires completely new security protocols, new research and development. Current security systems have been designed so that there is no exceptional access in the system, and more or less, they have functioned OK so far. Forward secrecy is a good example (a short sketch of it follows this list): without it, if any party is compromised at any point in the future, all past traffic could be decrypted. Current security protocols are not the best, but with backdoors, most of these accomplishments, like forward secrecy, would be ruined. Also, a new protocol that includes frontdoors would need to be analyzed thoroughly before implementation, which may take years. We have seen that most ad-hoc, non-analyzed protocols were cracked later on; just remember WEP, the old Wi-Fi encryption.
2. Non-immune governmental agencies: The assumption that a governmental agency is unhackable or not vulnerable is naïve, and has been proven wrong. Its employees are human too: they can quit, gossip, be bribed or worse. Just think about Snowden: he walked away with a trove of classified information. Back in the 1980s, John Anthony Walker, a US officer, was convicted of spying for the Soviet Union for almost 20 years, between 1968 and 1985. No organization is unhackable: embarrassingly, even Hacking Team, a government supplier of surveillance and tracking software, was hacked in 2015. Damages can be major: in the recent breach of the US Office of Personnel Management, 21.5 million Social Security numbers of government personnel were leaked. If some organization had a frontdoor to all communication over the Internet, a breach of that organization would mean a breach of the entire Internet, on a scale nobody has seen before.
3. National governments versus global citizens: In our globalized world, who would decide which government gets access to users’ data? Or is this a privilege of the NSA alone? If the NSA has access to users’ data, wouldn’t China or Russia have the right to claim the same? If you are a US citizen working on a project in Europe, should the European government have access to all your personal data? And what if you are working in China or Russia? And what if you are not just in Europe for a short project, but actually living there as an expat? If you say no to any of those questions, then why should the US government have the right to access any foreign citizen’s data? I know these are provocative questions. But in our globalized world, people work, buy and live in multiple countries. International trade could be completely killed by introducing frontdoor requirements at the country level: a US company with factories in Pakistan, suppliers in China and retailers in the EU would have to trust all those governments, because if backdoors and frontdoors were implanted, they would all have the right to access its confidential business data.
4. High costs and uncertain results: Digital Rights Management (DRM) systems are a good example of how key management at a global scale can go wrong. Hollywood and the publishing industry have been trying to introduce a proper DRM platform to prevent piracy, without a breakthrough yet. The similarities between DRM and frontdoors are the following:
• Both require complex cryptographic key management, as the content in DRM is encrypted, or at least scrambled a bit.
• Key management needs to operate at a global scale, without any exception: if even a small run of a title is published without DRM, pirates can copy and distribute that exception.
• The key management is actually implemented by vendors who have no interest in getting it right; a DVD player vendor, for example, has no incentive to properly protect the DRM key. At the same time, those vendors are under serious competitive cost pressure.

Despite the billions of dollars and many years of research, every DRM system has been cracked so far; just think about DVDs: pirates found the weaknesses and the way around them. With any frontdoor technique the stakes are much, much higher, so it would be extremely attractive to criminal hackers. We also have less experience defending these systems than we do with DRM, so any leak could be disastrous to all industries, not just “some” revenue loss for the publishing industry.
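To make the forward-secrecy point from item 1 concrete, here is a minimal sketch using ephemeral Diffie-Hellman (X25519) with the pyca/cryptography package; the code is illustrative, but the principle is what modern protocols such as TLS rely on.

from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

def ephemeral_key_pair():
    private = X25519PrivateKey.generate()  # fresh per session, then discarded
    return private, private.public_key()

alice_private, alice_public = ephemeral_key_pair()
bob_private, bob_public = ephemeral_key_pair()

# Both sides derive the same shared secret; it never crosses the wire, and
# once the ephemeral private keys are deleted it cannot be recomputed, even
# by someone who later compromises Alice's or Bob's long-term keys.
assert alice_private.exchange(bob_public) == bob_private.exchange(alice_public)

A frontdoor that must be able to decrypt past sessions on demand is, by definition, incompatible with deleting those ephemeral keys; that is exactly the accomplishment a mandated door would ruin.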

Conclusion:

So the answer to the question in the title is “yes, let’s encrypt”. I think encryption is crucial from multiple perspectives: security is important for the Internet ecosystem, and weakening that security can completely backfire on our freedom, economy and personal security. Also, any backdoor or frontdoor plan raises political, philosophical and ethical questions, leading to a debate that I think no one wants to take on. Legislative authorities are trying to address these new issues, but if different countries take different directions, it will undermine the potential growth of the global economy and the Internet ecosystem.

This is What the Public Really Thinks About FBI vs. Apple


DOJ v. Data Encryption – Public Perception and Communications Lessons

The heated dispute between Apple and the U.S. Department of Justice (DOJ) over the iPhone used by Syed Rizwan Farook before the San Bernardino, California, mass shooting has captured attention across America and the world. While this debate now focuses on one company’s decision, the implications go well beyond the mobile sector and even the whole technology industry. Companies and other organizations of all kinds responsible for managing personal data are concerned and need to be prepared to deal with the controversy’s impact.




To help deepen understanding about this complex issue, Burson-Marsteller, with its sister research firm Penn Schoen Berland, conducted a national opinion survey from February 23-24, 2016. The survey polled 500 General Population respondents (including 230 iPhone users) and 100 National Elites (individuals earning more than $100,000 per year who have college degrees and follow the news), and the results reveal critical communications issues around the fundamental conflict between privacy on the one hand and national security and safety on the other. Here are the key takeaways:

• Overall awareness is high. Eighty-two percent of the General Population and 88 percent of National Elites have heard about the dispute. The news has gone viral, with people tweeting and posting on Facebook about it and commenting extensively online about news articles.
• The FBI should have access to one phone, not all phones. Respondents say the government should not be given a tool that potentially gives it access to all iPhones. Sixty-three percent of the General Population and 57 percent of National Elites say Apple should only provide the FBI with the data from the phone in question, and the tools to do it should never leave Apple’s premises. It is clear the public wants this decided on a case-by-case basis, and respondents do not trust law enforcement and national security agencies to self-police and protect privacy.
• The public expects companies to push back if there is the potential to violate privacy. Respondents say they want companies to protect the privacy of their data fully, even when the government is requesting data in the name of law enforcement or national security. A majority (64 percent of the General Population and 59 percent of Elites) says a company’s top obligation is to protect its customers’ data rather than cooperating with law enforcement or national security interests. However, most (69 percent of the General Population and 63 percent of Elites) see the need to compromise on privacy when terrorist threats are involved.
• How the issue is framed determines public opinion. If the issue is framed as the FBI asking for access to this one phone, 63 percent of the General Population and 57 percent of Elites agree with the FBI position. If the issue is framed as potentially giving the FBI and other government agencies access to all iPhones, Apple’s position prevails overwhelmingly; 83 percent of the General Population and 78 percent of Elites agree Apple should either only grant access to the particular iPhone or refuse the request entirely.
• Current laws are outdated. This situation reflects a much broader debate about privacy and security that will need to be resolved. About half (46 percent of the General Population and 52 percent of Elites) say current laws are outdated and need to be revised to reflect the changing role of technology in today’s society.

Regardless of the outcome of this current dispute, there is no question it is raising alarms about the state of data privacy. In the aftermath, companies will have to pay increasing attention to the expectations of their customers and consumers. The survey showed people are overwhelmingly concerned with the security and privacy of their digital data, with 90 percent of the General Population and 96 percent of National Elites saying they are very or somewhat concerned about the security and privacy of their personal information online or on their personal electronic devices. The Apple/DOJ dispute appears to be a turning point for all organizations trying to balance the demands of data privacy with national security and law enforcement considerations. The pressures on them are only going to grow.

 

For Your Eyes Only: Experts Explore Preventing Inadvertent Disclosures During Discovery

The Altep, kCura and Milyli webinar explored best practices for safeguarding information, as well as technological tools for redaction

There may be a number of Scotts in Chicago, but there are fewer with a specific last name attached, and there is only one with a specific Social Security number. This information – or a telephone number, or a fingerprint, or even the MAC address of a computer – can be used to identify and verify a person.

But of course, as valuable as personally identifiable information (PII) may be to you, it’s just as valuable to a malicious actor looking to steal and utilize it for nefarious purposes. That’s why, when conducting discovery, protecting that information should be of the utmost importance for organizations, law firms, and discovery vendors.

Three such legal technology companies joined together to address that challenge in a recent webinar called “How to Prevent the Disclosure of PII.” The webinar’s panel included Hunter McMahon, vice president of legal and consulting services, Altep; Scott Monaghan, technical project manager, Milyli; Aileen Tien, advice specialist, kCura; and Judy Torres, vice president of information services, Altep.

In order to prevent disclosure, the panelists started with one important question: what exactly is PII? “It really comes down to what information can identify you as an individual,” McMahon said. This information can be sorted into categories based on how specific and how personal it is, leading McMahon to note that data holders should examine PII to determine whether it is sensitive, private, or restricted.

When examining PII in a system, it’s also important to examine which regulations and laws the PII falls under. These can include a number of federal regulations: HIPAA/HITECH (health PII), GLBA (financial PII), the Privacy Act (PII held by federal agencies), and COPPA (children’s PII). Forty-seven states also have their own information laws, with varying guidelines on breach notification, level of culpability, and more.

Once that information is known, said the panelists, those conducting discovery should turn to the next question: What are the processes in place to protect the data? “Documents that are in the midst of discovery are really an extension of your retention policy… so you have to think about that risk the same way,” McMahon noted.

Torres explained that the proper approach is to assume PII will always be present in a document set, even when it seems unlikely that PII exists in the system. For example, she said, do not assume that because a data set concerns only documents accessed during work hours, it will not contain PII.

“Most people, when they’re working, are also working the same time as those people they need to send documents to,” Torres explained. In one case, looking at data from Enron’s collapse, the documents contained more than 7,500 instances of employee PII, including that of employees’ spouses and children, as well as home addresses, credit card numbers, Social Security numbers, and dates of birth.

To address this data lurking in the system, it’s important to take a proactive approach, the panel said. “The approach is much like data security in that it’s not going to be perfect, but you can help reduce the risk,” McMahon added.

To protect PII in review, those conducting discovery can limit access to documents with PII, limit the ability to print, and limit the ability to download native files. Likewise, teams can employ safeguards during review, such as training review teams on classifications of PII, training reviewers on the PII workflow, implementing a mechanism for redaction and redaction quality control, and deploying encryption technology.

And even without human review, abiding by these protocols is important: “I see such a trend of more cases using assisted review, so you’re not necessarily having human eyes on every document. So it makes sense to make our best effort to protect PII on documents that may not necessarily have human review,” Torres said.

Properly conducting redactions so that nothing is missed can be a pain for reviewers as well, but Tien walked the webcast’s viewers through an introduction to regular expressions (regex), one of the most common technology tools for PII redaction. In short, regex is a pattern-matching language that allows one to construct a single search string to find a pattern of characters, such as three numbers or three letters.

For one example, Social Security numbers have a very specific format: XXX-XX-XXXX. Regex can be used to find all constructions of this type, using an expression like the following: [0-9]{3}-[0-9]{2}-[0-9]{4}
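For readers who want to try it, here is a short demo of that exact pattern using Python’s re module (the pattern itself works in any regex engine); the sample text is invented.

import re

SSN_PATTERN = re.compile(r"\b[0-9]{3}-[0-9]{2}-[0-9]{4}\b")

sample = "Scott's SSN is 123-45-6789; his office line is 312-555-0186."
print(SSN_PATTERN.findall(sample))  # ['123-45-6789']

Note the \b word boundaries: they keep the pattern from matching digits embedded in longer numbers, a common source of false hits.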

“With practice, you’ll be able to pick this up like any foreign language,” Tien said.

See post Sneaky PII: What’s Hiding in Your Data?

Sneaky PII: What’s Hiding in Your Data?


It’s no secret that it’s important to remove personally identifiable information (PII) and other privileged information from case data before it’s produced in order to protect it from falling into the wrong hands. The amount of data to be reviewed prior to litigation continues to grow exponentially as more and more ESI enters into discovery requests, and with an increase in data to review, there’s a greater risk of accidentally disclosing PII. As the past few years have shown us, a breach of PII could have major consequences for a corporation or law firm. The problem with PII, however, is that many companies don’t sufficiently protect employee information within their own environments, and because there’s an increasing amount of overlap between employees’ work and personal lives, there are more opportunities for PII to creep up in unexpected places that are easily overlooked.

Think for a second about where you’d expect PII to show up. You’re probably thinking of HR records where employees’ Social Security numbers, addresses, and phone numbers are stored. PII is easy to spot when you’re checking obvious places like HR files, but personal information can crop up in other places just as easily when data gets collected from a broad range of sources. If an employee has a payroll issue, they might email bank account information or Social Security numbers to the payroll department. Beyond company-related communications, they might even send scanned images of tax documents to their accountant, or mortgage applications for a new home, from their company email address rather than their personal one. If your case requires that you pull company emails between specified dates, you might inadvertently collect this information. In addition to emails, employees might use the office scanner for personal documents that they then send from their personal emails – but if that file lives on the company server, it’s at risk of entering discovery data. If there isn’t a sweep done for extraneous PII, these details will slip through the cracks and leak to opposing counsel. For this reason, it’s absolutely crucial to comb data not just for relevance and privilege, but also for PII.

It’s a slippery slope, not only because PII is ubiquitous and can easily hide in unexpected places, but also because many contributing factors make it difficult to pin down and leave it at the mercy of human error. While many individuals are sensitive about their own private information, the average person has little awareness of exactly what data constitutes PII and how it can be compromised, meaning they’re probably revealing their company’s and their own private information unknowingly. Even if employees are hyper-aware of sensitive data, PII definitions differ from state to state and change constantly, and new regulations are implemented frequently. What wasn’t sensitive last year might be sensitive this year, and all of last year’s information is still sitting on your company servers.

PII laws are complicated and vary widely depending on which state and country you’re in, so it’s important to have processes in place to help eliminate extraneous data. Arguing for proportionality to narrow the scope of the case will reduce the amount of unnecessary PII gathered, and making use of technology-assisted review and the many e-discovery platforms that can quickly find specific data patterns will dramatically reduce the time it takes to comb through files for PII. There are also products that can assist in the identification and exclusion of PII hiding in your case data, including redaction automation tools. While there are many methods for securing PII, redaction is far and away the safest because it removes sensitive information completely. It cannot be recovered or decoded, so it is really the best way to eliminate risk.
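To give a flavor of what redaction automation does under the hood, here is a minimal sketch of pattern-based redaction, assuming simple US SSN and phone formats; the patterns and labels are illustrative, and real products cover far more PII types as well as scanned and imaged documents.

import re

PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact(text: str) -> str:
    # Replace each match with a labeled placeholder instead of the raw value.
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED {label}]", text)
    return text

print(redact("Reach me at 312-555-0186; my SSN is 123-45-6789."))
# Reach me at [REDACTED PHONE]; my SSN is [REDACTED SSN].

Because the placeholder replaces the text itself, rather than drawing a box over it, there is nothing left in the production to recover.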

The sticky nature of PII means that security can’t be handled on a case-by-case basis. It should be part of company-wide best practices, with a well-vetted process in place to ensure data is properly protected, not just for individuals but for the company as a whole. Implementing security policies and investing in redaction technologies can help you stay compliant and save time, resources, and your reputation.

PROPOSED STATE BANS ON PHONE ENCRYPTION MAKE ZERO SENSE

 

American politics has long accepted the strange notion that just a pair of states—namely Iowa and New Hampshire—get an outsize vote in choosing America’s next president. The idea of letting just two states choose whether we all get to have secure encryption on our smartphones, on the other hand, has no such track record. And it’s not a plan that seems to make much sense for anyone: phone manufacturers, consumers, or even the law enforcement officials it’s meant to empower.

Last week, a California state legislator introduced a bill that would ban the retail sale of smartphones with that full-disk encryption feature—a security measure designed to ensure that no one can decrypt and read your phone’s contents except you. The bill is the second piece of state-level legislation to propose that sort of smartphone crypto ban, following a similar New York state assembly proposal that was first floated last year and re-introduced earlier this month. Both bills are intended to ensure that law enforcement can access the phones of criminals or victims when their devices are seized as evidence.

If consumers will cross borders to fill a booze cabinet, what’s to prevent New York criminals from foiling surveillance with New Jersey iPhones?

Those two proposed crypto bans have put another twist in an already tangled debate: the privacy and cryptography community has long opposed any such “backdoor” scenario that gives cops access to encrypted smartphones at the risk of weakening every device’s data protections. But legal and technical experts argue that even if a national ban on fully encrypted smartphones were a reasonable privacy sacrifice for the sake of law enforcement, a state-level ban wouldn’t be. They say the most likely result of any state banning the sale of encrypted smartphones would be to make the devices of law-abiding residents more vulnerable, while still letting criminals obtain an encrypted phone with a quick trip across the state border or even a trivial software update.

Crypto Has No Borders

If the New York and California smartphone encryption bans passed, a company like Apple that sells encrypted-by-default iPhones would have three options, argues Neema Singh Guliani, an attorney with the American Civil Liberties Union: It could cease to fully encrypt any of its phones, contradicting a year of outspoken statements on privacy by its CEO Tim Cook. It could stop selling phones in two of America’s richest states. Or finally, it could create special versions of its phones for those states to abide by their anti-encryption laws.

The last of those scenarios is Apple’s most likely move, says Singh Guliani, and yet would result in a “logistical nightmare” that still wouldn’t keep criminals from encrypting their phones’ secrets. She compares the laws to state-wide liquor regulations: “People will travel over the border to buy alcohol in states with the standards that suit them,” she says. If consumers will cross borders to fill a booze cabinet, what’s to prevent New York criminals from foiling surveillance with New Jersey iPhones? “Nothing would stop those who wanted a more privacy protective phone to get one from out of state.”

In the hypothetical future where the state bills have passed, fully encrypting an iPhone might not even require buying an out-of-state device, but merely downloading out-of-state firmware. After all, it’s unlikely Apple would go to the expense of manufacturing different hardware to disable encryption in some of its phones, argues Jonathan Zdziarski, an iOS forensics expert who has worked with police to decrypt phones. “That would be a massive technical change to support this kind of device,” Zdziarski argues. “It would be literally cheaper for Apple to stop selling phones in California altogether.” Instead, he says, it would likely sell the same hardware for all of its devices and merely disable full-disk encryption through a different version of its firmware activated at the time of the phone’s purchase. And nothing in the current bills would prevent Apple from making the encryption-enabled version of its firmware available to anyone who restores their device from factory settings.

The technologically savvy will find ways to get encryption, while the average smartphone user’s data will be left more vulnerable.

In other words, that would make the New York and California crypto bans statewide bans on software, an idea roughly as practical as policing undocumented birds crossing the Mexican border. And if Apple were to try to accommodate the spirit of the law by preventing customers from restoring their phone with full-disk encryption inside California or New York, Zdziarski is confident iPhone owners could circumvent any location tracking by proxying their IP address or putting the phone in a Faraday bag to block its GPS. “This legislation is going to be technologically useless,” says Zdziarski. “Anyone who wants a device that doesn’t have law-enforcement-reversible encryption will be able to get one.”

Pressuring Congress

Neither Apple nor Google, which followed Apple’s lead last year by declaring that all devices running the latest version of Android will have default full-disk encryption, responded to WIRED’s request for comment on the California or New York bills. The office of New York Assemblyman Matthew Titone, who introduced the New York bill, tells WIRED that the state-level bill is meant to pressure Congress to follow with its own legislation. “When there’s no national legislation, states take efforts on their own to solve an issue,” says Titone’s chief of staff Chris Bauer. “That can speed the process along to make the federal government take steps.”

Skyler Wonnacott, the director of communications for the California bill’s sponsor Assemblyman Jim Cooper, offered a similar argument. “California is leading the fight…It’s got to start somewhere,” Wonnacott says. “Just because you can drive into Nevada and buy a phone or download software doesn’t mean there isn’t an issue and these phones aren’t used in crimes.”

Congress has yet to introduce legislation to limit full-disk encryption in smartphones, despite several congressional hearings over the last year in which officials, including FBI Director James Comey and New York District Attorney Cyrus Vance, warned of the dangers of allowing criminals access to devices with data they couldn’t decrypt. (Vance said at the time that New York police had been stymied by smartphone encryption 74 times in the nine months before the hearing, out of roughly 100,000 cases it deals with in a year.) A spokesperson in Vance’s office writes to WIRED that the DA’s office pushed for state legislation, and still hopes to find a compromise with device makers. “When Apple and Google announced the switch to full-disk encryption…with no regard for the effect it would have on local law enforcement and domestic crime victims, they left us with no choice but to seek legislative solutions at all levels, state and federal,” writes the district attorney’s director of communications Joan Vollero. “If the companies have a solution, we encourage them to engage in a productive dialogue.”

Constitutional Questions

But even if state laws do put pressure on Apple and Google to cave on encryption, they may do so unconstitutionally, says Andrew Crocker, an attorney with the Electronic Frontier Foundation. He says statewide smartphone encryption bans may fall under the “dormant Commerce Clause,” which gives the exclusive right to regulate commerce between states to the federal government. “States don’t have unlimited power to enact regulations to burden interstate commerce,” says Crocker. “If I’m Apple, this seems like a huge burden on my business.”

Congress, on the other hand, would have the power to ban default full-disk encryption in smartphones—though they’d do so against the advice of nearly every technical expert in the field of cryptography. In July of last year, for instance, 15 renowned cryptographers published a paper cautioning against any deliberate weakening of encryption for the sake of law enforcement. “New law enforcement requirements are likely to introduce unanticipated, hard to detect security flaws,” the paper reads. “The prospect of globally deployed exceptional access systems raises difficult problems about how such an environment would be governed and how to ensure that such systems would respect human rights and the rule of law.”

And Crocker reiterates that state-level bills wouldn’t be just problematic or risky, but “wildly ineffective,” as those who want encryption will easily get it from out of state—in either software or hardware form. The technologically savvy will use it to defeat police surveillance or to protect their phone from hackers and thieves, while the average smartphone user’s data will be left more vulnerable. “The ones who will actually be impacted are the less sophisticated people who don’t know how to get this protection,” says Crocker. “You’re looking at a cost that falls on innocent people, not criminals or terrorists.”

Judge David Campbell: Predictive coding rarely proposed, rarely used


This is the second part of an interview with US District Court Judge David Campbell, the former chair of the advisory committee responsible for drafting newly promulgated amendments to the Federal Rules of Civil Procedure. In part one, Judge Campbell addressed the prospective impact of the rule changes, and what their emphasis on proportionality and cooperation may mean in practice. He also outlined the evolution of the federal spoliation sanction rule, 37(e), which has been the focus of much debate and handwringing.

Here he discusses what other measures must accompany the rule changes to bring a substantive reduction in litigation costs, speed case resolutions, and reopen the federal court system to those it has priced out. We also ask him to share his experience with predictive coding and how it is — or is not — being used in his courtroom.

Logikcull: In the commentary to Rule 37, the committee noted that “The court should be sensitive to the party’s sophistication with regard to litigation in evaluating preservation efforts; some litigants, particularly individual litigants, may be less familiar with preservation obligations than others who have considerable experience in litigation.” Has anybody challenged that language as potentially giving a free-pass to litigants who are “willfully ignorant” — or maybe just lazy — when it comes to their preservation obligations?

Hon. David Campbell: I have not heard those challenges. It may well be that some folks have that concern. I will tell you, from my perspective, a very important background for this rule change is the reality that ESI is now in the possession of everybody. The personal injury plaintiff who walks into the lawyer’s office on crutches after the car accident has ESI that’s relevant to the injury, whether it’s their Facebook page, or the text they sent to their girlfriend after the accident, or communications with their doctors, or emails they might have sent. And that’s a very different world than we lived in 20 years ago. The reason for the comment you just read from the committee note is that if that person turns out to have not stopped Facebook’s deletion of posts — and I don’t know how Facebook deletes posts, but just use that as a hypothetical — a court can take into account their lack of sophistication in deciding later what Rule 37(e) measures should be imposed.

And I think that’s right. I don’t think we should hold them to the same standard as an entity that has an IT department. A number of the most prominent cases on the loss of ESI — I believe this is correct — deal with the plaintiff’s loss of ESI.

We had an expert tell us in one of these (rules committee) hearings that by 2018 — maybe he said 2020 — there will be 26 billion devices on the internet, which is, you know, four for every person on Earth. And the truth is, I believe, five years from now, 10 years from now, the amount of information that each person has in the cloud will be equivalent to the kinds of records that used to be found in the filing cabinets of entire businesses. So ESI, in my view, isn’t just a problem for the big entities. It’s a litigation issue for everyone, and this rule (Rule 37(e)) tries to take that into account.

“Five years from now, the amount of information that each person has in the cloud will be equivalent to the kinds of records that used to be found in the filing cabinets of entire businesses. So ESI, in my view, isn’t just a problem for the big entities. It’s an issue for everyone.”

Logikcull: Well, to that point, data growth is accelerating at an incredible rate. Is there any fear that whatever cost-lowering impact these changes will ultimately have will be negated by the fact that, not only is there going to be exponentially more data, but also that the technology available to handle that data does not appear to be getting much cheaper? Do you have that concern? I imagine you do.

Judge Campbell: I do. I think any judge or any lawyer involved in litigation should. We are hoping that these rule changes, through Rule 37(e), bring some level of uniformity in how you deal with the loss of information. We hope Rule 37(e) will reduce the amount of side litigation that occurs over loss of ESI and sanctions. The proportionality change and the case management changes, as well, are intended to get judges involved earlier in cases, working with the parties to figure out how to resolve cases efficiently given many factors, one of which is the ESI that has to be dealt with in the case.

I don’t pretend to believe we’ve solved the problem. I think you’re right – we don’t really foresee the extent of the problem and it’s something that the courts are going to have to adapt to. But our intent is to at least make some progress on dealing with ESI through these amendments.

“I don’t pretend to believe we’ve solved the problem. We don’t really foresee the extent of the problem.”

Logikcull: Judge, what’s your sense of how well technology — and I’m now talking about the legal provider side, or e-discovery vendor side — has developed and been adopted by practitioners to counter these rising data volumes? I’m particularly interested in knowing your thoughts on, if you’ve even seen it, the utility of predictive coding… The field has put a ton of faith into that process and others, and into technology in general — and it seems to me, anyway, that the results have been somewhat underwhelming. Do you have an opinion on that?

Campbell: I’ve got some thoughts. I don’t know if they’re mature enough to call them an opinion [laughing]… I think it’s a reality that, going forward, we’re going to have to find technological solutions to this growth of ESI. The reality is that the old model of having lawyers or paralegals review every document that’s produced is not going to work. It just can’t work in a world where you’re dealing with millions of documents. We either incur enormous expenses continuing that model; or we surrender and produce everything — which I don’t think lawyers are going to do; or we need to find a technological way to winnow down the ESI to a manageable size.

So my view is, whether it can accurately be said that, today, technology is solving the problem, ultimately it’s going to have to solve the problem, because I don’t think the court system and lawyers are going to be able to continue dealing with it the way we dealt with paper evidence.

I will also say that I’m finding in my cases that predictive coding or technology-assisted review is rarely proposed by the parties and rarely used. I think that will change over time. I’ve seen more of it over the last few years. But it is still used in a very small percentage of the cases. And I am surprised at the number of big document cases where the parties do not use it, even when I suggest it. They instead prefer keyword searches.

I think part of that is the need to educate the bar through sophisticated litigators as to what the technology can do, and it is my hope that the predictions will prove true that it really can do this more quickly and more accurately than people.

But I’m not seeing it used widely in my cases, and, you know, Phoenix is the fourth or fifth biggest city in the United States. So I think we have our share of complex cases, and yet it is still a rare commodity in my cases.

“Whether it can accurately be said that, today, technology is solving the problem, ultimately it’s going to have to solve the problem, because I don’t think the court system and lawyers are going to be able to continue dealing with it the way we dealt with paper evidence.”

Logikcull: You mentioned the emphasis in the rule changes on making judges more proactive case managers. Judge [Paul] Grimm [of the federal District Court in Maryland], among others, has been pretty vocal about the idea that judges do not traditionally view themselves as case managers, but as dispute resolutionists. What’s your assessment of how well the judiciary will be able to take a more proactive role in facilitating some of these e-discovery issues?

Campbell: I have no illusion that simply re-writing the rules is going to transform judges from passive to active case managers. It clearly won’t. And as you know, the idea of active case management has been in the Federal Rules since 1983, when Rule 16 was revised to create an active role for judges. So we have been of the view that these rule changes need to be accompanied by a very significant education effort to encourage judges, in particular, to be more active case managers. Many are. But Judge Grimm is right that many still are not.

There are a number of steps we’ve taken. Members of the committee have written articles. We’ve created videos that are now on the Federal Judicial Center website that have been sent to every federal judge in the country explaining the rule changes. The committee has written letters to every chief district judge and every chief circuit judge in the country asking that they include the new rule amendments in district and circuit conferences next year to educate judges and lawyers about them. We’ve compiled materials that are available for judges on the FJC website — articles, PowerPoints and other things. And the Federal Judicial Center, which is the entity that trains judges, is intending to do more active training of judges. We’re hoping that that push, along with the rule changes, will bring about a behavior change on the part of us judges. Whether or not it works, we’ll have to see over time. But I’m hoping, at least with many judges, it will produce more active case management.

Logikcull: To stay on this theme of education, but now turning to educating the bar and practitioners… I get the sense, just as an outside observer, that there is an education gap. You have some small percentage of highly knowledgeable people who are technically competent who typically, though certainly not always, come from the largest law firms and largest corporations. And then you have some large remnant that tends to not understand this stuff at all. You’re at the federal court level. Is that what you’re seeing? Or are you seeing a general rising of education?

Campbell: Well I certainly see lawyers who understand the rules and are dealing with ESI better than others do. But I don’t think I would say that you find all of the best prepared lawyers in the large law firms or large corporations. I see many lawyers who are sole practitioners or who are in small firms or government attorneys who are right on top of ESI and the rules.

But there’s no doubt that many are not. I think that is changing. I alluded to it a moment ago, I’m seeing more and more — although it’s still a fairly small percentage — thinking about things like technology-assisted review. And I’m definitely seeing more who are dealing with ESI up front and talking about it at the Rule 26(f) conference. I’m hoping that the publicity that occurs in connection with these rules amendments — and there is a fair amount of publicity going on through various bar groups — will educate lawyers about these ESI issues in a way that they haven’t been before. As I said earlier, we have to deal with ESI in the federal courts. I think everybody involved is becoming more and more aware of it — and we’ll see lawyers become more sophisticated over time.

“I’m seeing more and more (lawyers) — although it’s still a fairly small percentage — thinking about things like technology-assisted review.”

Logikcull: Some practitioners see the expense associated with e-discovery as an access to justice problem. Judge John Facciola has gone so far as to say that e-discovery is contributing to making the federal court system a “playground for the rich.” Is that a sentiment that you share?

Campbell: I don’t think I would call it a “playground for the rich,” but I absolutely agree that too many people cannot afford to litigate in federal court. I do think the cost of federal litigation makes it unavailable to the average citizen. And I see many of them who are representing themselves struggling to handle a case because they can’t get a lawyer to take it because it doesn’t have enough money at stake. I think that’s a problem.

“I do think the cost of federal litigation makes it unavailable to the average citizen. And I see many of them who are representing themselves struggling to handle a case because they can’t get a lawyer to take it because it doesn’t have enough money at stake.”

That’s one of the problems we talked about at the 2010 conference that I mentioned. Part of our intent in putting the proportionality idea into the new rules, trying to get judges to actively manage cases more efficiently from the beginning, and, we hope, cutting down the side litigation over the loss of ESI, is to rein in the cost of discovery. There are other things that the Civil Rules Committee and other Judicial Conference committees are looking at to try to make civil litigation less expensive.

But in my view, it absolutely is a problem, and one that we need to work hard as a federal judiciary to solve.