I don't really understand the competition. Do people come to these with just the intention of finding exploits, or do they come with the exploit ready, waiting to collect a reward?
Invariably, the researchers and organisations competing have developed their exploits well ahead of the event. The exploits are pretty involved these days.
I've linked to a blog post from the Chrome developers that details the exploit that won in late 2011 [1]. 'Pinkie Pie', the pseudonym of the person who won it, is pretty infamous in those circles.
AFAIK most have exploits ready before the event, and demonstrate them publicly (for the first time) at the event.
In general, skilled crackers/reverse engineers/security experts will look for new bugs -- and when they find one, they can a) tell the vendor, b) tell the world, c) sell the exploit to the highest bidder, or d) use the exploit for nefarious purposes themselves.
Some combination of a) and b), or c), is the most common -- these events are a way to compensate people for doing a) and b), and to provide some incentive to avoid c) (and d)).
They have the exploits ready to go; the challenge is whether they can exploit the target system (which is fully patched) within their time slot.
It's a useful exercise, I think, in that it demonstrates that even the most hardened of codebases still has security bugs, and it also serves as a cautionary tale for people who think they don't need multiple layers of defence.
"...it demonstrates that even the most hardened of codebases still has security bugs"
Browsers the most hardened codebase? I nearly spilled my coffee ; )
Every single browser out there (including Chrome) was designed with security as an afterthought.
As for me, I browse the Web from Linux, using a throwaway user account which doesn't have Java installed. That user account is itself "hardened" (e.g. no login shell, specific per-user-id firewalling rules, etc.) -- a rough sketch of such rules follows below. At this point, given the state of insecurity the Web is in, I'll probably go back to the VM route (a browser in a locked-down separate user account, but itself running inside a KVM VM).
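For the curious, here's a sketch of what the per-user-id firewalling could look like, using iptables' owner match. This is only an illustration under my own assumptions -- the account name 'webbrowse' is hypothetical, you'd run it as root, and you'd adjust the allowed ports to taste:

    import subprocess

    BROWSE_USER = "webbrowse"  # hypothetical throwaway browsing account

    rules = [
        # Let traffic from the throwaway account reach the web (HTTP/HTTPS)...
        ["-A", "OUTPUT", "-m", "owner", "--uid-owner", BROWSE_USER,
         "-p", "tcp", "-m", "multiport", "--dports", "80,443", "-j", "ACCEPT"],
        # ...and resolve names over DNS...
        ["-A", "OUTPUT", "-m", "owner", "--uid-owner", BROWSE_USER,
         "-p", "udp", "--dport", "53", "-j", "ACCEPT"],
        # ...and drop everything else that account tries to send.
        ["-A", "OUTPUT", "-m", "owner", "--uid-owner", BROWSE_USER,
         "-j", "DROP"],
    ]

    for rule in rules:
        subprocess.run(["iptables"] + rule, check=True)

The no-login-shell part is just the usual 'usermod -s /usr/sbin/nologin webbrowse'; the KVM VM would then sit as yet another layer on top of all this.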
My definition of a hardened codebase would be something like OpenBSD, or OpenSSH, or seL4 (in seL4 the code has been verified, using formal provers, to be free of buffer overruns/overflows and whatnot).
What I don't like about your comment is that you consider the current situation to be "acceptable". You apparently really do believe current browsers are "hardened", and the fact that there are people thinking like you is precisely part of the problem.
We can do much better than that.
For a start, I'd love to read a rant from Theo de Raadt about what should be done to design more secure web browsers.