The identity of the hacking firm has remained a closely guarded secret for five years. Even Apple didn’t know which vendor the FBI used, according to company spokesman Todd Wilder. But without realizing it, Apple’s attorneys came close last year to learning of Azimuth’s role — through a different court case, one that has nothing to do with unlocking a terrorist’s device.
Five years ago, Apple and the FBI both cast the struggle over the iPhone as a moral battle. The FBI believed Apple should help it obtain information to investigate the terrorist attack. Apple believed that creating a back door into the phone would weaken security and could be used by malicious actors. The FBI sought a court order to compel Apple to help the government. Weeks later, the FBI backed down after finding an outside group with a way into the phone.
The tale of the unlocking of the terrorist’s iPhone, reconstructed through Washington Post interviews with several people close to the situation, shines a light on a hidden world of bug hunters and their often-fraught relationship with the creator of the devices whose flaws they uncover. Experts say Azimuth is a poster child for “white hat” hacking: good-guy cybersecurity research that aims to disclose flaws and disavows authoritarian governments.
Two Azimuth hackers teamed up to break into the San Bernardino iPhone, according to the people familiar with the matter, who, like others quoted in this article, spoke on the condition of anonymity to discuss sensitive matters. Founder Mark Dowd, 41, is an Australian coder who runs marathons and who, one colleague said, “can pretty much look at a computer and break into it.” One of his researchers was David Wang, who first set hands on a keyboard at age 8, dropped out of Yale, and by 27 had won a prestigious Pwnie Award — an Oscar for hackers — for “jailbreaking,” or removing the software restrictions of, an iPhone.
Apple has a tense relationship with security research firms. Wilder said the company believes researchers should disclose all vulnerabilities to Apple so that the company can fix them more quickly. Doing so would help preserve Apple’s reputation for making secure devices.
But many security researchers say it’s legitimate to sell these flaws to democratic governments. And the ability of government agencies to unlock iPhones has also spared Apple from direct conflict with these governments. For instance, by unlocking the terrorist’s iPhone, some say, Azimuth came to Apple’s rescue by ending a case that could have led to a court-ordered back door to the iPhone.
“This is the best possible thing that could have happened,” said Will Strafach, an iOS security researcher. The vendor that unlocked the phone, far from being unethical, potentially averted “a very bad precedent” for Apple “where everyone’s phone would have weakened security.”
Wilder said Apple supports “good faith” security research. “Our engineers work closely with the security community in numerous ways,” he said.
When contacted by The Post, the FBI, Azimuth, Wang and Dowd all declined to comment for this story.
An ‘exploit chain’
In September 2015, Apple released its new operating system, iOS 9, which it billed as having enhanced security to “protect customer data.” The new iOS was running on the iPhone 5C used by Syed Rizwan Farook, a public health inspector for San Bernardino County.
The FBI suspected the iPhone 5C might have valuable clues about why Farook and Tashfeen Malik opened fire on a holiday party at Farook’s office. Both Farook and Malik were killed in a shootout with police.
Before the attack, Malik had posted a message on her Facebook page, pledging loyalty to Abu Bakr al-Baghdadi, the leader of the Islamic State. (Baghdadi died in a U.S. Special Forces raid in Syria in 2019.) The FBI had few leads on whether the couple had accomplices or whether the attack was directed by the Islamic State, which at the time was directing similar attacks around the world. The FBI thought the contents of Farook’s iPhone 5C might provide useful information, such as who he had been communicating with in the lead-up to the attack.
But the phone, which belonged to Farook’s employer, was locked with Apple’s new security. In the past, the FBI could use software to quickly guess all 10,000 possible four-digit passcodes, a “brute force” effort that would normally take about 25 minutes. But the 5C included a feature that erased its data if the wrong passcode was entered more than 10 times.
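The arithmetic behind that 25-minute figure is simple, as this purely illustrative Python sketch shows. The `try_passcode` function here is hypothetical, standing in for whatever interface passcode-guessing software would use; nothing below describes the FBI’s or anyone else’s actual tool.

```python
# Illustrative only: why a four-digit passcode falls quickly to brute
# force once retry limits are out of the way. try_passcode() is a
# hypothetical stand-in for the device's passcode check.

def brute_force_pin(try_passcode):
    """Try every four-digit code, 0000 through 9999, in order."""
    for n in range(10_000):
        code = f"{n:04d}"          # zero-padded, e.g. 7 -> "0007"
        if try_passcode(code):
            return code            # found the passcode
    return None                    # exhausted all 10,000 codes

# At roughly 6-7 attempts per second, all 10,000 codes take about
# 25 minutes. With the 10-try auto-erase enabled, the same search is
# hopeless: the phone wipes itself long before the space is covered.
```

The auto-erase feature is what made the 5C resistant: it caps an attacker at 10 of the 10,000 possibilities, which is why disabling that counter was the whole point of the unlocking effort described below.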
Months of effort to find a way to unlock the phone were unsuccessful. But Justice Department and FBI leaders, including Director James B. Comey, believed Apple could help and should be legally compelled to try. And Justice Department officials felt this case — in which a dead terrorist’s phone might have clues to prevent another attack — provided the most compelling grounds to date to win a favorable court precedent.
In February 2016, the Justice Department obtained a court order directing Apple to write software to bypass the security feature. Apple said it would fight the order. Its argument: the government was seeking to force the company to break its own security, which could pose a threat to customer privacy.
“The U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create,” Apple CEO Tim Cook wrote in a statement at the time. “The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.”
All sophisticated software contains “bugs,” or flaws that cause computer programs to act in unexpected ways. Not all bugs are significant, and many pose no security risk on their own. But hackers can seek to take advantage of certain bugs by writing programs called exploits. Sometimes they combine a series of exploits into an “exploit chain” that can knock down the defenses of a device like the iPhone one by one.
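The structure of an exploit chain can be sketched in a few lines of Python. This is purely conceptual: every stage name below is invented for illustration and describes none of the actual exploits in this story. The point is only that each stage abuses a different flaw to gain a little more control, and the output of one stage becomes the input to the next.

```python
# Conceptual sketch of an "exploit chain" -- all names are invented.

def entry_point(device):
    # Stage 1: a flaw in some exposed interface yields initial code
    # execution -- a foot in the door.
    device["access"] = "user"
    return device

def escalate(device):
    # Stage 2: a second flaw turns limited access into broader control.
    device["access"] = "privileged"
    return device

def take_kernel(device):
    # Stage 3: a final flaw yields control of the OS core, at which
    # point protections such as retry limits can be switched off.
    device["access"] = "kernel"
    device["retry_limit"] = None
    return device

def run_chain(device):
    # Each stage's result feeds the next; if any link fails, the
    # whole chain fails -- which is why patching one bug can kill
    # an entire chain.
    for stage in (entry_point, escalate, take_kernel):
        device = stage(device)
    return device
```

This chain-of-stages shape is why, as the article later notes, a single routine patch to one underlying flaw was enough to render the whole tool useless.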
Azimuth specialized in finding significant vulnerabilities. Dowd, a former IBM X-Force researcher whom one peer called “the Mozart of exploit design,” had found one in open-source code from Mozilla that Apple used to permit accessories to be plugged into an iPhone’s Lightning port, according to one of the people familiar with the matter. He found it even before Farook and his wife opened fire at the Inland Regional Center, and thought it might someday be worth developing into a hacking tool. But Azimuth was busy at the time with other projects.
Mozilla spokeswoman Ellen Canale said the company has no knowledge of any bug that was connected to the exploit.
Two months after the attack, Comey testified to Congress that investigators were still unable to unlock the terrorist’s iPhone. Seeing the media reports, Dowd realized he might have a way to help. Around that time, the FBI contacted him in Sydney. He turned to 30-year-old Wang, who specialized in exploits on iOS, the people said.
Using the flaw Dowd found, Wang, based in Portland, Ore., created an exploit that enabled initial access to the phone — a foot in the door. Then he hitched it to another exploit that permitted greater maneuverability, according to the people. And then he linked that to a final exploit that another Azimuth researcher had already created for iPhones, giving him full control over the phone’s core processor — the brains of the device. From there, he wrote software that rapidly tried all combinations of the passcode, bypassing other features, such as the one that erased data after 10 incorrect tries.
Wang and Dowd tested the solution on about a dozen iPhone 5Cs, including some bought on eBay, the people said. It worked. Wang dubbed the exploit chain “Condor.”
In mid-March, Azimuth demonstrated the solution at FBI headquarters, showing Comey and other leaders how Condor could unlock an iPhone 5C. Then, one weekend, the FBI lab ran a series of forensic tests to be sure Condor would work without destroying data. The tests were all successful, according to the people. The FBI paid the vendor $900,000, according to remarks by Sen. Dianne Feinstein (D-Calif.) in May 2017.
FBI officials were relieved but also somewhat disappointed, according to people familiar with the matter. They knew they were losing an opportunity to have a judge bring legal clarity to a long-running debate over whether the government may compel a company to break its own encryption for law enforcement purposes.
On March 21, 2016, the government canceled a hearing scheduled for the following day on the legal case in California.
Soon after, the FBI unlocked the phone. Nothing of real significance — no links to foreign terrorists — was found.
Apple sought to recruit Wang to work on security research, according to the people. Instead, in 2017 he co-founded Corellium, a company based in South Florida whose tools help security researchers. The tools allow researchers to run tests on Apple’s mobile operating system using “virtual” iPhones. The virtual phones run on a server and display on a desktop computer.
In 2019, Apple sued Corellium for copyright violation. As part of the lawsuit, Apple pressed Corellium and Wang to divulge information about hacking techniques that may have aided governments and agencies such as the FBI.
Apple subpoenaed Azimuth, Corellium’s first customer, according to court documents. Apple wanted client lists from Azimuth, which is now owned by L3Harris, a major U.S. government contractor, lists that might reveal whether its customers included malign entities such as authoritarian governments. L3Harris and Azimuth said the lists were “highly-sensitive and a matter of national security,” according to court documents.
Last April, Apple also made a document request in the lawsuit for “all documents concerning, evidencing, referring to, or relating to any bugs, exploits, vulnerabilities, or other software flaws in iOS of which Corellium or its employees currently are, or have ever been, aware.”
Those employees included Wang. The request would have turned up Condor.
The judge denied the request in part.
During a deposition, Apple questioned Wang about the morality of selling exploits to governments, according to court records. An Apple lawyer pressed him on whether he was aware of any bugs that were not reported to Apple but were later found by malicious hackers.
Apple “is trying to use a trick door to get [classified information] out of him,” Corellium attorney Justin Levine said, according to a transcript. Corellium declined to comment for this story.
In its statement, Apple said the case “is about Corellium attempting to profit by selling access to Apple’s copyrighted works.”
In its lawsuit, Apple argued that Corellium has “no plausible defense” for infringing on Apple’s copyright, in part because it “indiscriminately markets its iPhone replicas to any customer, including foreign governments and commercial enterprises.”
Corellium has denied the allegation. It has countered that the lawsuit is an attempt to put it out of business following a failed effort by Apple in 2018 to purchase the company.
“If Apple wants to make their phones more secure against these government-affiliated bug hunters, then they should make their phones more secure,” said Matthew D. Green, a computer scientist at Johns Hopkins University, who has led research that found holes in Apple’s encryption. “They shouldn’t be going after people in a courtroom.”
In December, U.S. District Judge Rodney Smith in Fort Lauderdale, Fla., dismissed Apple’s copyright claims against Corellium. He ruled that Corellium’s virtual iPhones do not violate Apple’s copyright because they are used to find security vulnerabilities, not to compete with Apple’s sales. He deemed “puzzling, if not disingenuous” Apple’s allegation that Corellium’s products are sold indiscriminately.
The legal fight is far from over. Apple can appeal Smith’s ruling. And Apple has lodged another claim: that Corellium’s tools illegally bypass Apple’s security measures. That trial, which will be closely watched by security researchers, is set for the summer.
Meanwhile, Corellium can keep selling tools that help researchers find iOS bugs.
But all exploits have a shelf life.
A month or two after the FBI unlocked the terrorist’s iPhone, Mozilla discovered the flaw in its software and patched it in a routine update. So did vendors that relied on the software, including Apple.
The exploit was rendered useless.