Digital Dangers

Cat-and-mouse game: Customers demand cybersecurity, law enforcement wants easier access to evidence


Photo illustration by Sara Wadford, Colin Robert Varndell, Lightspruch, wanw117/Shutterstock.com

On March 6, the iPhone’s encryption became no match for the Indiana State Police.

Armed with GrayKey, a tool that circumvents iPhone passwords and encryption, the agency was able to plug into dozens of iPhones in its possession and collect previously unattainable information for ongoing investigations.

Communications technology is “definitely making it more difficult for us to gather evidence, both technically and through service of legal process,” says Chuck Cohen, an Indiana State Police captain.

Within 60 days of obtaining GrayKey, the agency, with legal authority, was able to unlock 96 iPhones. The tool has led to both incriminating and exculpatory evidence, Cohen says.

The Indiana State Police, which spent $15,000 on the technology, is not alone. The Maryland State Police and police departments in Portland, Oregon, and Rochester, Minnesota, have also bought GrayKey, according to public records. The Drug Enforcement Administration is looking to spend $30,000 on an advanced model.

In response to this type of security-busting technology, Apple added a feature, released in beta with iOS 11.4.1, called USB Restricted Mode. It requires that a phone have been unlocked within the previous hour before allowing data transfer through the charging port, effectively neutering products such as GrayKey.

“We’re constantly strengthening the security protections in every Apple product to help customers defend against hackers, identity thieves and intrusions into their personal data,” Apple said in an emailed statement. “We have the greatest respect for law enforcement, and we don’t design our security improvements to frustrate their efforts to do their jobs.”

After unlocking a backlog of phones, Cohen says it’ll be hard to go back to the way things were.

“If a company makes a business decision that locks us out again, that causes me a lot of concern—more so than before,” he says. “Now, I know what I’ve been missing.”


Cybersecurity and the law

A joint production of the ABA Journal and the ABA Cybersecurity Legal Task Force

Modern technology has created a honeypot of data for cyberthieves and law enforcement alike. This has led to a proliferation of default encryption features in consumer technology, which complicate criminal investigations. As technology companies and investigators engage in a cat-and-mouse game, police and prosecutors are trying to adapt by adopting novel technology and raising new constitutional arguments in court, prompting privacy concerns from advocates.

“We live in a golden age of surveillance,” says Nate Cardozo, a senior staff attorney at the Electronic Frontier Foundation. “There’s more data generated and collected today than ever before in the history of humanity.” That data, he says, is largely accessible to law enforcement.

The Cat’s Out of the Bag

In reaction, people are turning to encryption to protect their digital communications and information.

In recent years, companies such as Apple and Google have added encryption as a default feature to their offerings. At the same time, products including Signal, Skype and WhatsApp have made encrypted communication commonplace and easy to use.

Even though broad consumer access to encryption is relatively new, the questions it raises are perennial, says Anita Allen, a professor at the University of Pennsylvania Law School.

“The government always wants to exploit a new technology to their advantage,” she says. “And the question is if there should be any limits on that.”

For law enforcement, encryption has created a “going dark” problem: communications or data that are legally seizable but unreadable.

“This challenge grows larger and more complex every day,” said FBI Director Christopher Wray in a 2018 speech. “Needless to say, we face an enormous and increasing number of cases that rely on electronic evidence. We also face a situation where we’re increasingly unable to access that evidence, despite lawful authority to do so.”

Recently, the FBI admitted to overcounting the number of locked devices in its possession and is conducting an internal audit to determine the correct number. The exact number of devices inaccessible to law enforcement nationwide with a valid warrant is unknown.

According to a New York Times report earlier this year, the Department of Justice has renewed calls for a legal mandate requiring technology companies to create a way for law enforcement to access encrypted communications or data. Such a mechanism is often called a “backdoor”; proponents claim it can preserve encryption’s integrity while allowing law enforcement access with legal authority.

“Requiring a backdoor is a bad idea for security,” says Ari Schwartz, managing director of cybersecurity services at Venable and previously a member of the White House National Security Council. “It might help law enforcement on a certain case in the short term, but it’s bad for security in general.”

The concern is that once a backdoor is created, it is only a matter of time before nefarious actors exploit the intentional vulnerability. To Schwartz, the solution is not a backdoor but building more secure systems and giving law enforcement more tools, which may include hacking under greater oversight or closer cooperation between law enforcement and technology companies.

 
