>It would be relatively simple to reason statically about the rate of memory allocation (iterate through all paths leading to a 'new' operator), but for this purpose you care about cases where an object becomes garbage and can be deallocated.
No, I don't. All I care about is having enough free memory to make new allocations. I don't care how much garbage there is; I just care that, when there's garbage, it's being freed fast enough to support my allocations.
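To make that concrete, here is a minimal sketch of that check against the standard JVM java.lang.management API. It is only an illustration of the idea, not anything prescribed in this thread: the class name, the 20% headroom floor, and the 100 ms sampling interval are all placeholder choices.

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;
    import java.lang.management.MemoryUsage;

    // Sketch: never ask how much garbage exists -- only check that enough of
    // the heap stays free to keep supporting new allocations.
    public class HeadroomMonitor {
        private static final double MIN_FREE_FRACTION = 0.20; // assumed requirement
        private static final long SAMPLE_INTERVAL_MS = 100;   // assumed cadence

        public static void main(String[] args) throws InterruptedException {
            MemoryMXBean heap = ManagementFactory.getMemoryMXBean();
            while (true) {
                MemoryUsage usage = heap.getHeapMemoryUsage();
                long used = usage.getUsed();
                long max = usage.getMax(); // -1 if the heap has no defined max
                if (max > 0) {
                    double freeFraction = 1.0 - (double) used / max;
                    if (freeFraction < MIN_FREE_FRACTION) {
                        // Garbage isn't being freed fast enough to support the
                        // current allocation rate; react here (shed load, alert, ...).
                        System.err.printf("heap headroom low: %.1f%% free%n",
                                          100.0 * freeFraction);
                    }
                }
                Thread.sleep(SAMPLE_INTERVAL_MS);
            }
        }
    }

The monitor never measures how much garbage there is or how fast it is produced; it only observes whether free space stays above the floor the allocation rate requires.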
Just to state the obvious: in practice, over the long term you've got to be deallocating at least as fast as you're allocating, or something nasty will happen that will, at the very least, violate your performance constraints.
Right. And my point is that the deallocation-to-allocation ratio is the only metric you really need. How fast garbage is made is completely irrelevant, because every byte that becomes garbage had to be allocated first, so over the long term the garbage rate is bounded by the allocation rate. You don't have to solve the hard problem of figuring out how fast garbage can be made; you can solve the much easier problem of bounding allocation. And of course in either scenario you still have to show that there are no leaks.
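Spelling out the "bounded by the allocation rate" step (the notation A(t), G(t) is mine, not from the thread):

    % A(t): cumulative bytes allocated by time t.
    % G(t): cumulative bytes that have become garbage by time t.
    % Every byte of garbage was allocated first, so
    \[ G(t) \le A(t) \quad \text{for all } t, \]
    % and dividing by t and taking the long-run limit gives
    \[ \limsup_{t \to \infty} \frac{G(t)}{t} \;\le\; \limsup_{t \to \infty} \frac{A(t)}{t}. \]
    % The long-run garbage-production rate can never exceed the long-run
    % allocation rate, so a bound on allocation is automatically a bound on
    % garbage production.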