In ML, this will bump you into a nearby optimum at random. That optimum may or may not be better than the one you were in, and if it's worse, there's a good chance it'll be very difficult to get back to where you were without external checkpoints to revert to.
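A toy sketch of what I mean (a made-up 1-D loss with two basins; `checkpoint` here is just an illustrative variable, not any framework's API): once a random bump lands you in the worse basin, plain descent settles there, and the saved checkpoint is the only way back.

```python
# Minimal sketch: a 1-D loss with a good basin near x = -1 and a worse
# basin near x = +1. A random "bump" can move the parameter into the
# worse basin, and without a saved checkpoint there is no easy way back.
import random

def loss(x):
    return min((x + 1) ** 2, (x - 1) ** 2 + 0.5)

def grad(x, eps=1e-5):
    # Numerical gradient, good enough for the illustration.
    return (loss(x + eps) - loss(x - eps)) / (2 * eps)

x = -1.2                      # start in the good basin
checkpoint = (x, loss(x))     # external checkpoint we can revert to

for step in range(200):
    x -= 0.05 * grad(x)               # ordinary small-step descent
    if loss(x) < checkpoint[1]:
        checkpoint = (x, loss(x))     # keep the best parameters seen so far

x += random.uniform(1.5, 2.5)         # the random bump into the other basin

for step in range(200):
    x -= 0.05 * grad(x)               # descent now converges to the worse optimum

if loss(x) > checkpoint[1]:
    x = checkpoint[0]                 # reverting to the checkpoint is the only way back
```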
I wouldn't say the brain has no checkpoints. Certain events (like concussions or strokes) can knock out memories or habits and roughly reset you to where you were some months or years ago. Also, conditions like Dissociative Identity Disorder can result in past/younger versions of yourself splitting off and halting development for years or more, then coming back up later.
With that said, I don't think anyone really has control over this. I'm a DID system and I have some sort of limited control, but my time skips are usually on the order of days or weeks, not years.
In machine learning, we might think of this idea as "setting your learning rate too high."
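A hedged toy example of that failure mode (just plain gradient descent on a quadratic bowl, not any specific model): the update is x ← x − lr·2x = x·(1 − 2·lr), so steps shrink toward the minimum for small learning rates but overshoot further and further once the rate is too large.

```python
# "Learning rate too high" on loss(x) = x**2: plain gradient descent
# converges for lr < 1.0 and diverges for lr > 1.0, so oversized steps
# throw you far from the minimum you were already sitting near.

def descend(lr, x=2.0, steps=10):
    for _ in range(steps):
        x -= lr * 2 * x   # gradient of x**2 is 2x
    return x

print(descend(0.1))   # creeps toward the minimum at 0
print(descend(1.1))   # overshoots more each step and blows up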