Another large codebase in C that deals with security. How thoroughly has it been audited and tested for bugs? Should we soon expect a new "Heartbleed"? Why or why not?
How would you lose portability by moving to a high-level language?
Further, why would timing attacks necessarily be more or less of an issue in a higher-level language than in C? This is a genuine question.
I feel like in any case, there are some things that should just not be done in C. For example, the base64 encoder/decoder that gpg uses for ASCII-armoring should not be written in C; in general, no parser should be written in C; and there's no reason for the code that interfaces with the user (for example, checking a password given to gpg-agent) to be written in C. You could write these in OCaml, use the FFI to natively link them into a single executable, and you'd get safe string handling while retaining fine-grained control of the hardware during the crypto itself.
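To make the "parsers don't belong in C" point concrete, here's a minimal sketch (my own illustration, not GnuPG's code, and it skips strict '=' padding validation) of a Base64 decoder in OCaml. Every access is bounds-checked, so malformed armor can at worst raise an exception, never read or write out of bounds:

    (* Map one Base64 character to its 6-bit value; anything else raises. *)
    let b64_value c =
      match c with
      | 'A'..'Z' -> Char.code c - Char.code 'A'
      | 'a'..'z' -> Char.code c - Char.code 'a' + 26
      | '0'..'9' -> Char.code c - Char.code '0' + 52
      | '+' -> 62
      | '/' -> 63
      | _ -> invalid_arg "b64_value: not a Base64 character"

    let b64_decode s =
      (* Drop line breaks, then everything from the first '=' padding on. *)
      let s = String.concat "" (String.split_on_char '\n' s) in
      let s =
        match String.index_opt s '=' with
        | Some i -> String.sub s 0 i
        | None -> s
      in
      let out = Buffer.create (String.length s * 3 / 4) in
      let acc = ref 0 and bits = ref 0 in
      String.iter (fun c ->
          acc := (!acc lsl 6) lor b64_value c;
          bits := !bits + 6;
          if !bits >= 8 then begin
            bits := !bits - 8;
            Buffer.add_char out (Char.chr ((!acc lsr !bits) land 0xff))
          end)
        s;
      Buffer.contents out

    (* b64_decode "SGVsbG8=" returns "Hello";
       b64_decode "SGVsbG8*" raises Invalid_argument instead of silently
       misbehaving. *)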
However, I'm skeptical that you literally cannot use OCaml to implement crypto that's secure against timing attacks. You can always fall back to C for primitive operations, and you can control the garbage collector enough that GC pauses wouldn't be an issue.
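A minimal sketch of both ideas, under stated assumptions: the external declaration below binds a hypothetical C stub (caml_ct_scalarmult is a made-up name, not an existing binding, so you'd have to supply the stub yourself), while with_quiet_gc uses only the real Gc and Fun modules from the standard library to keep the collector out of the way around a sensitive call.

    (* Hypothetical constant-time primitive implemented in C; the name and
       the stub are assumptions for illustration, not an existing library. *)
    external ct_scalarmult : bytes -> bytes -> bytes = "caml_ct_scalarmult"

    (* Run f with the collector kept quiet: compact first, then give the
       major heap so much slack that major collection work during f is
       unlikely.  Minor collections can still fire if f allocates, so f
       should stick to non-allocating or C-side work. *)
    let with_quiet_gc f =
      Gc.compact ();
      let old = Gc.get () in
      Gc.set { old with Gc.space_overhead = 10_000 };
      Fun.protect ~finally:(fun () -> Gc.set old) f

    (* Usage sketch (secret and point are placeholders):
       let shared = with_quiet_gc (fun () -> ct_scalarmult secret point) *)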
IMO, the most compelling reason not to do this is that for all its fatal flaws, C is the most widely-used systems programming language, and so it has the largest share of experts ready to review code. But this doesn't seem like a good enough reason, given the severity of memory corruption bugs.
The way you lose portability by moving to a high-level language is that you don't have an implementation of that language for as many platforms. You're not losing portability in the abstract sense; the semantics of high-level code are very nearly machine-independent. But an actual port requires an actual toolchain.
C is not inherently portable, and it takes a lot of know-how, care, testing, and sometimes luck to write C that is portable.
C is de facto portable in the sense of "able to actually be ported", because the toolchains exist for nearly every platform, but possibly at the cost of combing the code for nonportabilities that have to be uprooted.
Typically gpg is used in asynchronous communication. I.e., I encrypt or sign something and send the file to someone else, who decrypts or verifies it at their leisure. So how can timing attacks be used?
I send you an email. Your mail client pipes it through gpg upon receipt. Meanwhile, I have a microphone pointed at your computer from across the room listening to your capacitors humming. Now I have the key you used to decrypt the message.
While using C automatically shields you from timing attacks and buys you portability? Right. GnuPG is not even supported on 64-bit Windows yet. (I'm not saying that there's a need to support 64-bit Windows; I'm just demonstrating that it's not portable. It's not portable because the codebase assumes some flavor of Unix and because C is underspecified: long is 64 bits on 64-bit Linux platforms but 32 bits on 64-bit Windows. It's a C problem, not a Windows problem.)
How does Ada (or even SPARK) open opportunities for timing attacks? How is it less portable than any other gcc frontend (modulo the quite tiny runtime)?
Could all the downvoters explain why this isn't, or shouldn't be, a genuine concern? I know that GnuPG is not a network server, but there's still a lot of potential for various exploits in format parsers, etc.