If you think about the specification first and implement it afterwards, you can reason about your assumptions. You can also modify your implementation to address tangential concerns without breaking the contract with your specification.
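To make that concrete, here's a minimal sketch in Python, assuming a pytest-style workflow; `slugify` and its behaviour are hypothetical, invented for illustration. The tests come first and act as the specification; the implementation underneath them is free to change.

```python
import re

# Written first: these two tests are the specification.
# slugify is a hypothetical example, not from any real codebase.
def test_lowercases_and_hyphenates():
    assert slugify("Hello World") == "hello-world"

def test_strips_punctuation():
    assert slugify("Hello, World!") == "hello-world"

# Written second: one possible implementation. Its internals (the regex,
# the strip) can be rewritten for speed or clarity without touching the
# contract the tests define.
def slugify(text: str) -> str:
    text = re.sub(r"[^a-z0-9]+", "-", text.lower())  # collapse non-alphanumeric runs
    return text.strip("-")
```

You could replace the regex with a hand-rolled loop tomorrow and the contract above wouldn't move.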
Conversely, if you write the implementation first, then your canonical source of truth contains unknown bugs and assumptions. Your tests will be biased towards your assumptions about how it should work. It's a great way to defer the work of clarifying those assumptions and finding those bugs onto your users.
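Here's a sketch of that bias, with hypothetical names: the implementation ships an off-by-one, and the test written afterwards is shaped by the same assumption, so it happily confirms the bug.

```python
# Implementation written first, with a hidden off-by-one: the UI treats
# pages as 1-indexed, so page 1 silently skips the first items.
def paginate(items, page, per_page):
    start = page * per_page
    return items[start:start + per_page]

# Test written afterwards, by reading the code's output rather than the
# intent. It passes, and the bug ships to users.
def test_paginate_first_page():
    assert paginate([1, 2, 3, 4], 1, 2) == [3, 4]
```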
Catching bugs through integration tests is painful. Without a huge amount of infrastructure to capture stack traces, logs, and memory dumps, it's a nightmare. Even when you have all of that information, it can take days of effort to find a resource contention problem caused by an unfortunate sequence of events. TDD won't eliminate this, but it does reduce the frequency and severity: well-designed unit tests will isolate the problem within the great ball of mud.
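As one sketch of what "well designed" might look like here, assuming a hypothetical bounded connection pool: the contention rule is pinned down deterministically in a unit test, instead of being discovered via an unlucky sequence of events in production.

```python
import threading

# Hypothetical bounded pool, invented for illustration.
class ConnectionPool:
    def __init__(self, size: int):
        self._slots = threading.BoundedSemaphore(size)

    def acquire(self) -> bool:
        # Non-blocking: returns False instead of deadlocking when exhausted.
        return self._slots.acquire(blocking=False)

    def release(self) -> None:
        self._slots.release()

# The unit test isolates the contention behaviour, no threads or
# integration harness required.
def test_pool_refuses_when_exhausted():
    pool = ConnectionPool(size=2)
    assert pool.acquire()
    assert pool.acquire()
    assert not pool.acquire()  # third caller is refused, not deadlocked
    pool.release()
    assert pool.acquire()      # capacity is reusable after release
```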
Another interesting perspective TDD gives you: when your tests become brittle and rely on far too many mocks, you can be reasonably certain there is a problem with your specification. The same is not true if you write the software first. In that scenario you refactor by changing the implementation and then rewriting the tests to match, which confirms nothing. That's where you end up babysitting the test suite and going through the trouble of breaking up your code after the fact to make it more testable. With TDD you eliminate tight coupling and implicit transfers of state upfront, by writing the specification that way.
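A hypothetical before-and-after to show the smell and the respecification (`ReportService` and its collaborators are invented for illustration):

```python
from unittest import mock

# A service with entangled collaborators.
class ReportService:
    def __init__(self, db, clock, mailer, audit):
        self.db, self.clock, self.mailer, self.audit = db, clock, mailer, audit

    def total(self) -> int:
        orders = self.db.fetch_orders()
        self.audit.log("report", self.clock.now())  # incidental coupling
        return sum(o["amount"] for o in orders)

# Smell: four mocks to check one arithmetic rule. Every new collaborator
# breaks this test even when the rule itself hasn't changed.
def test_total_with_heavy_mocking():
    db, clock, mailer, audit = mock.Mock(), mock.Mock(), mock.Mock(), mock.Mock()
    db.fetch_orders.return_value = [{"amount": 10}, {"amount": 5}]
    assert ReportService(db, clock, mailer, audit).total() == 15

# The fix lives in the specification: state the rule over plain data.
def report_total(orders) -> int:
    return sum(o["amount"] for o in orders)

def test_report_total():
    assert report_total([{"amount": 10}, {"amount": 5}]) == 15
```

The brittleness was the specification telling you the rule never needed those collaborators in the first place.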
The truth, as is often the case, lies somewhere in the middle. Practice TDD by default; test after if you absolutely must. Layer on integration tests. Do the best you can.