I once spent hours if not days debugging a problem with some code I had recently written because of this exact optimization.
It wasn't an embedded system, but rather an x86 BIOS boot loader, which is sort of halfway there: protected mode was enabled without paging, so there was nothing to trap a NULL dereference.
Completely by accident, I had dereferenced a pointer before doing the NULL check. I think the dereference was just printing some integer, which of course had a perfectly sane-looking value, so I didn't even think about it.
The compiler (I can't remember whether it was gcc or clang at this point) decided that since I had already dereferenced the pointer, it must be non-NULL, so it elided the null check and the code path associated with it.
Finally I ran it in VMware and attached a debugger, which skipped right over the null check even though I could see in the debugger that the value was NULL. So I went to look at the assembly the compiler had generated, and that's when I started to understand what had happened.
It was a head-slapper when I finally spotted the dereference above the check. I added a second null check, or moved that code, or some such, and that was it.
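
For anyone who hasn't run into this before, here's a minimal sketch of the pattern (hypothetical code, not my boot loader): dereference first, check second, and the compiler is entitled to assume the pointer is non-NULL by the time it reaches the check.

    #include <stdio.h>

    struct config {
        int value;
    };

    /* Hypothetical illustration of the pattern, not the original code. */
    void show(struct config *cfg)
    {
        /* The dereference happens first... */
        printf("value = %d\n", cfg->value);

        /* ...so the compiler may assume cfg is non-NULL (dereferencing
         * NULL is undefined behaviour in C) and delete this whole branch. */
        if (cfg == NULL) {
            printf("cfg was NULL, bailing out\n");
            return;
        }

        /* ...rest of the work with cfg... */
    }

With optimizations on, both gcc and clang will typically drop the if (cfg == NULL) branch here. On a system with an MMU the earlier dereference would at least fault; with no paging it just reads whatever happens to live at address zero and carries on.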
Now map those hours and days into actual money taken from the project budget, and you realise why some businesses prefer certain languages over others.
<laughs in embedded-system-with-no-MMU>