Many companies offer a payment to people who find bugs in their software. Google and Mozilla do it without a qualm. But when security vendors adopt the same practice, it seems a line has been crossed.
Last week Barracuda Networks offered a bug bounty of between $500 and $3,133.70 (£310 to £1,950) for researchers who find security flaws in its products. The amount paid depends on the severity of the flaw, as judged by Barracuda’s in-house experts, and the researcher is allowed to publicise the flaw, and take the applause, once it has been patched.
Any connection between security and money seems to be viewed as controversial. When an online marketplace for security exploits was launched by NSS Labs in September, for instance, several people were disconcerted. Surely selling exploits was a bad idea? Possibly as bad as the controversial move last month to publish the Firesheep attack code online?
Well, possibly. However, the NSS Labs marketplace turned out to be a place for legitimate penetration testers to pick up tools, saving themselves time and money.
And Barracuda’s announcement seemed to be designed to get maximum publicity for minimum controversy. The company did, after all, set its fees at exactly the same level as Google did in its own bug bounty programme. Barracuda, it seemed, wanted to establish that its scheme was as legitimate as Google’s.
“There is a significant danger that it will attract developers into researching the vendor’s products and then offering them to the highest bidder,” said Anthony Haywood, chief technology officer at security firm Idappcom. “If the bug is a really serious one that cybercriminals can exploit to generate fraudulent revenue, there is a significant danger of the exploit information falling into the dark ecosystem that black hat hackers – as well as cybercriminals – now inhabit.”
Haywood is well aware that Google and Mozilla already do this, but apparently, now that a much smaller company in the security sector is doing it, the danger of it all going wrong has increased. “Just because it is becoming the norm for the IT industry, does not make it in the long-term interests of our market sector,” he says.
By changing the way effort is rewarded, bug bounties are similar to the “litigate for free” industry of no-risk lawsuits, he says, which has brought about an increase in damage payments, and in frivolous claims, which in turn pushes everyone’s insurance premiums up.
By analogy, he says, if security firms pay for the bugs found in their software, this money must come from somewhere, and the move will push up everyone’s fees. “Someone, somewhere, has to pay for these types of services,” says Haywood. In this case, end users will have to fund the bounty schemes indirectly – and will also pay for a tide of malware the schemes will supposedly unleash.
“This is a cause and effect situation. No one really wins in the longer term from bug bounty programs. And that’s why we say that they are not in the real interests of our industry,” says Haywood.
I don’t really see that. For one thing, the “costs” of the bounty scheme are small compared with the damage an unknown zero-day flaw would cause if found and exploited by hackers outside the bounty scheme. People running these schemes can easily argue that they reduce overall costs.
Secondly, what is the tide of malware this is supposed to unleash? Does Haywood imagine that there are hackers out there who will be twiddling their thumbs wondering what to do until a bounty program gives them the idea of looking for flaws?
And if so, will the existence of a possible $3,000 cheque make it more or less likely that they will turn the bug over to the proper destination – the software owner? The money may be vastly less than they could make with an exploit and a campaign of malware, but the offer of money from the software vendor makes it at least a bit more likely that honest people will get involved.
At bottom, Haywood’s objection to the Barracuda bounty is nothing more than the old “security by obscurity” idea. If we don’t talk about these bugs, Haywood seems to think fewer will be found. That’s an idea that has been debunked so thoroughly, it’s a surprise to see it crop up again.
View Comments
The main question today should not be whether or not to pay for bounty program bugs. We should not expect this kind of deep technical expertise to be provided by anyone for free. Product security is a serious discipline, not an afternoon hobby for enthusiasts. And if we can somehow expect that our other more-or-less complex equipment (from the microwave oven to the nuclear plant) is safe and secure, I do not see why there should be any difference for computers and software. At least when we are paying for it, they are all assuring us it is secure (at least as much as the competitor's).
The main question is why (with so much money thrown into security training for developers and SDL efforts in vendors' development labs) there are so many serious bugs in the code. Is it the lack of appropriate security testing? Is it the wrong approach to security controls? Too much time spent on expensive automation without a realistic, human-like attacking attitude? Too much security marketing? Well, I am sure that with a little development optimisation (like intentionally decreasing complexity) product teams could actually do much better by cleaning up the vulnerabilities before they are discovered in the wild.
Could someone finally calculate how much human effort the mass of users spends each and every day on patching? Shouldn't we, the paying customers, all deserve better treatment from our vendors? We should at least expect to get a more secure and stable product, given the marketing impression we are sold while we are already paying for it.
Mr Haywood's assertions are based on highly specious reasoning. Somehow, these bug bounties are going to cause a rise in malicious hackers creating exploits and malware. This is going to happen because legitimate security researchers are incentivised to disclose responsibly to the vendor? Mr Haywood would have us believe that the answer is to buy one of his security appliances and that will solve all of our ills.
http://cosine-security.blogspot.com/2010/12/dear-mr-haywood-welcome-to-2010.html