The Apple Bug Bounty Program is a Godsend for Security Researchers

Unfortunately for software developers, security flaws are just a fact of life. Even under the most rigorous internal testing, companies aren’t able to identify every hole that hackers may one day find their way through. In fact, one of our security analysts uncovered a critical zero-day vulnerability that affects all versions of Apple’s OS X and some versions of iOS.

Those who create the software are often too close to be able to see any vulnerabilities they’ve left behind. Usually, that kind of tunnel vision means finding out about the issue only after a system has been compromised—a situation no developer or user wants to find themselves in.

Fortunately, many companies have figured out there’s a better way to plug these holes: a bug bounty program. The premise is simple. Software developers and service providers challenge security experts to break into their product somehow, and if (when) they do, they’re rewarded for the discovery. That reward can be something as simple as recognition or free stuff—but it can also involve large payouts. It’s an idea so logical that one wonders why it took so long to catch on.

For one, logic and business aren’t always compatible endeavors. For a company like Apple, which prides itself on the security of its products, working with hackers may look like a desperate move. Credit that perception to the media portrayal of the hacker: the lonely guy surrounded by consoles, looking for back doors into network-connected systems with ill intent. This image is misleading, though, since being a hacker has nothing to do with intent. It just means someone who’s tinkering away, possibly for the challenge, possibly with more malicious goals. In either case, Apple—historically proprietary and secretive about its technology—probably hasn’t wanted to betray the image it has worked hard to create by soliciting outside help with security matters.

The party line Apple has pitched for not participating in a bug bounty program is one of financial constraints: whatever it would pay out as a reward for finding security holes pales in comparison to what the black market will fetch—or what governments could pay. Now that the company is finally joining in, the amount of cash on offer—$200,000 at the maximum—is still nothing compared to what the U.S. government reportedly paid to break the security on the iPhone belonging to one of the San Bernardino shooters.

Of course, this line of reasoning assumes that all hackers are bad hackers, which is certainly not the case. Consider the example of Max Justicz, an MIT student profiled in a Newsweek article titled “The Rise of White Hat Hackers and the Bug Bounty Ecosystem.” Justicz describes his search for security flaws in the most harmless of ways: “If I have a spare evening, I’ll look for bugs. It’s what I do to procrastinate now.” If he finds something, he reports it, but that hasn’t always yielded the thanks he assumed he’d get. One company, after being contacted by Justicz to report the bug, politely informed him that they already knew what he’d done and that they’d be following up with possible punitive measures.

As Newsweek reports, U.S. law on hacking is written so broadly that it offers no wiggle room in any situation. Even someone who’s just tinkering, and who reports the bug in good faith, is subject to arrest and fines for “unauthorized access.” For Justicz, no charges were filed and he’s been able to live his life as normal. But the experience was enough to keep him out of the servers of any company without a clearly defined bug bounty policy that rewards—instead of punishes—those who use their talents for the greater good.

As Apple continues to expand its presence in the cloud, home to the personal data of millions of its users, this change in attitude is a win for its customers. By not offering any kind of reward, or at least promising not to prosecute those who act as Justicz does, Apple had guaranteed that the only people who would probe its security were those who didn’t care about the consequences, the ones who did have malicious intent. It was a policy that basically said, Our privacy is more valuable than yours—not at all comforting to those who have left a trail of private conversations, photos, and more under Apple’s control.

It’s not clear why Apple has decided to become more open and transparent as a company, and—it should be noted—they’re not exactly going whole hog into this endeavor. Participation in their bounty program, which begins in September, will be by invitation only. To receive an invitation, you must have previously disclosed valuable vulnerabilities to them, which limits participation even more. But it’s still good news that they’ve shifted their attitude on this, and it bodes well for people like Max Justicz, who hack for the fun of it and view any rewards as pure gravy. Given their new stance, their participation at Black Hat USA, and the company’s now publicly contentious relationship with the federal government, it’s pretty likely that even without an invitation, a disclosure to Apple isn’t going to result in a report to the authorities.

It wouldn’t be surprising, either, if this toes-in-the-pool approach to bug bounties is a subtle wink to non-invitees that their findings won’t be held against them. And it’s a safe bet that any reward given would involve some kind of non-disclosure agreement, allowing Apple to maintain its decades-long narrative of developing the safest computing platforms available. With this announcement, everyone wins. Apple steps up and starts looking more like a company that cares about its customers’ privacy. Well-intentioned hackers don’t have to fear legal proceedings from a company that’s got more cash in the bank than most governments. Users can breathe a little easier knowing that, with more good people on the task of ensuring their security, malicious attacks and data theft are going to become even more difficult.