If your theory were correct, could we use that effect to perform computation - perhaps condensing thousands of years of computation into a few minutes, with the result encoded in which steady state it settles on?
But you'd need absurd reliability. Say you have something that condenses 1000 years of computation into a minute. Since 1000 years is roughly 526 million minutes, that's a time compression ratio of about 500 million to 1.
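A quick back-of-the-envelope check of that ratio, in Python (the figures are just the rough ones above, nothing more precise):

```python
# Rough sanity check: 1000 years of computation squeezed into one minute.
years = 1000
minutes_per_year = 365.25 * 24 * 60            # ~525,960 minutes per year
compression_ratio = years * minutes_per_year   # compressed into 1 minute
print(f"compression ratio ~ {compression_ratio:.3g} : 1")  # ~5.26e+08 : 1
```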
What are the chances that, if you ran 500 million copies of the machine for a minute each with no time compression, none of them would malfunction? Pretty small, right? Well, the single time-compressed machine is effectively doing the same thing: the chance that it gets the right answer is probably much smaller than the chance that it goes up in a puff of smoke because the power supply freakishly died or something.
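To put numbers on that intuition: if each machine-minute has some small independent probability p of a malfunction, the chance that all ~5×10^8 machine-minutes go cleanly is (1 - p)^N ≈ exp(-N·p). A sketch, where the per-minute failure probability is a made-up illustrative value - only the scaling matters:

```python
import math

# Hypothetical per-minute probability that a single machine malfunctions
# (power glitch, cosmic ray, etc.); the exact value is illustrative only.
p_fail_per_minute = 1e-6

# Effective machine-minutes of exposure for the time-compressed run.
machine_minutes = 5.26e8

# Probability that nothing goes wrong over the whole run:
# (1 - p)^N, which for small p is approximately exp(-N * p).
p_all_ok = math.exp(-machine_minutes * p_fail_per_minute)
print(f"P(no malfunction) ~ {p_all_ok:.3g}")  # ~exp(-526), effectively zero
```

Even with a one-in-a-million per-minute failure rate, the exponent is in the hundreds, so the run essentially never finishes cleanly.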
In order to do practical computation with such a system you'd need absurdly reliable computers in absurdly safe environments. If you have a time compression ratio of 1 billion to 1, well, how many one-in-a-billion things are there to go wrong? Better hope you're not near a fault line...