You should have learned that exponential growth is precisely why appending to a dynamically growing array is O(1) – a very surprising and unintuitive result in my opinion. Had I not known about exponential growth, I would have sworn it must be impossible. If this didn’t leave a strong impression on you, I would say that whoever taught you algorithmic complexity simply failed on this point.
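To see why, here is a minimal sketch – a toy DynamicArray of my own, not any real library's implementation – that counts the element copies caused by reallocation:

    # Toy dynamic array with doubling growth; counts element copies
    # caused by reallocation so the amortized cost becomes visible.
    class DynamicArray:
        def __init__(self):
            self.capacity = 1
            self.size = 0
            self.data = [None]
            self.copies = 0

        def append(self, value):
            if self.size == self.capacity:
                self.capacity *= 2          # exponential growth
                new_data = [None] * self.capacity
                for i in range(self.size):  # copy everything over
                    new_data[i] = self.data[i]
                self.copies += self.size
                self.data = new_data
            self.data[self.size] = value
            self.size += 1

    arr = DynamicArray()
    n = 100_000
    for i in range(n):
        arr.append(i)
    print(arr.copies / n)  # about 1.3 copies per append on average – O(1)

If you instead grow the capacity by a constant amount each time, the same counter comes out proportional to n, i.e. O(n) per append on average – the doubling is what makes it work.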
The O(1) complexity is achieved as an amortized result, taking the average time per append – which is exactly what we are allowed to assume for asymptotic analysis. However, I think we should be aware that real computers are machines in the real world and can run into circumstances that differ greatly from the mathematical result.
For example, if you allocate a slightly too small array and then append just slightly too many items to it, you will trigger reallocation every time. Ideally nobody ever writes code that bad, and we get to assume that the O(1) behavior is always true. In practice we need to know how the actual algorithm works and make sure we don't accidentally code up perverse cases.
Nope, that is not correct. Insert is always (amortized) O(1), and that is the worst case – there is no “bad case”. You only need to be aware that a single insert might take longer, but that is not relevant in the vast majority of cases.
“Append to it just slightly too many items” – that “too many” is exactly what guarantees you appended enough items for the expansion to amortize to O(1).
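To put numbers on that, here is a toy cost model of doubling growth (my own sketch, not any particular runtime's allocator): start from an exactly-sized array of 1000 items and keep appending. One append pays for the copy, the rest stay cheap:

    # Cost model: each append costs 1, plus `size` copies when size == capacity.
    def append_costs(n, initial_capacity=1):
        capacity, costs = initial_capacity, []
        for size in range(n):
            cost = 1
            if size == capacity:   # array is full: double and copy
                cost += size
                capacity *= 2
            costs.append(cost)
        return costs

    costs = append_costs(2000, initial_capacity=1000)
    print(max(costs))               # 1001 – the one append that overflows is O(n)
    print(sum(costs) / len(costs))  # 1.5  – the average per append is still O(1)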
That’s the nice thing about a complete mathematical proof – it doesn’t leave any dodgy edge-cases.
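For reference, the proof for doubling growth really is short. The reallocations copy 1, 2, 4, … elements, with the largest copy below n, so for n appends the total work is

    \sum_{i=1}^{n} c_i \;\le\; n + (1 + 2 + 4 + \cdots + 2^k) \;<\; n + 2n \;=\; 3n \qquad (2^k < n)

which gives an amortized cost of at most 3 elementary operations per append – O(1), no edge cases left over.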
You should know that in the real world, basing everything on asymptotic behavior does not work very well. That's why we profile things: reality is far more complex than CS classes suggest.
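And to be fair, profiling this particular claim takes only a few lines – Python lists as a stand-in here, which over-allocate on append, though not by exact doubling:

    import timeit

    def fill(n):
        xs = []
        for i in range(n):
            xs.append(i)
        return xs

    for n in (10_000, 100_000, 1_000_000):
        t = timeit.timeit(lambda: fill(n), number=5) / 5
        print(n, round(t / n * 1e9), "ns per append")  # stays roughly flat as n grows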
Reasoning from asymptotic behaviour, provided the constant factor is acceptable, ought to be your default. Software tends to have to work with ever greater volumes of data over time, on machines with more memory and more, faster CPUs. If your algorithm's cost grows faster than linearly in its input, your software will tend to get worse over time.