I think it's an odd definition you're using. Chaos depends on the sensitivity to those initial conditions, not on what those initial conditions ARE. You don't need actual physical objects to see this. Suppose a double-pendulum system starts in a perfectly real-number-valued mythical reality where spacetime is perfectly continuous (a fiction). Now predict what state the system will be in at some later time, starting from a bunch of 'close together' initial conditions. Then try some 'closer together' ones. See if you can ever get the predictions to line up. Do it on pencil and paper, of course, since you want to capture the full real-number-infinity effect.
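If you'd rather not burn the pencil, here's the same experiment as a minimal sketch (equal unit masses and lengths, g = 9.81, a fixed-step RK4 integrator, and a 1e-10 radian nudge are all arbitrary choices on my part):

```python
# Two double pendulums with nearly identical starting angles,
# integrated side by side to watch the trajectories diverge.
import numpy as np

G, L1, L2, M1, M2 = 9.81, 1.0, 1.0, 1.0, 1.0

def derivs(s):
    # s = [theta1, omega1, theta2, omega2]
    th1, w1, th2, w2 = s
    d = th2 - th1
    den1 = (M1 + M2) * L1 - M2 * L1 * np.cos(d) ** 2
    dw1 = (M2 * L1 * w1 ** 2 * np.sin(d) * np.cos(d)
           + M2 * G * np.sin(th2) * np.cos(d)
           + M2 * L2 * w2 ** 2 * np.sin(d)
           - (M1 + M2) * G * np.sin(th1)) / den1
    den2 = (L2 / L1) * den1
    dw2 = (-M2 * L2 * w2 ** 2 * np.sin(d) * np.cos(d)
           + (M1 + M2) * (G * np.sin(th1) * np.cos(d)
                          - L1 * w1 ** 2 * np.sin(d)
                          - G * np.sin(th2))) / den2
    return np.array([w1, dw1, w2, dw2])

def rk4_step(s, dt):
    # Classic fourth-order Runge-Kutta step.
    k1 = derivs(s)
    k2 = derivs(s + dt / 2 * k1)
    k3 = derivs(s + dt / 2 * k2)
    k4 = derivs(s + dt * k3)
    return s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

dt = 0.001
a = np.array([2.0, 0.0, 2.0, 0.0])        # theta1, omega1, theta2, omega2
b = a + np.array([1e-10, 0.0, 0.0, 0.0])  # identical, nudged by 1e-10 rad

for step in range(1, 30001):
    a, b = rk4_step(a, dt), rk4_step(b, dt)
    if step % 5000 == 0:
        print(f"t={step*dt:5.1f}s  separation={np.linalg.norm(a - b):.3e}")
```

The separation between the two runs should hug 1e-10 for a while and then tear up through order one; the exact timing depends on the starting state, but the exponential trend doesn't.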
Of course, you can't write down the real-number result with infinite precision, can you? You'll have to chop it off at some point. That's OK. That's what error bars are for. Just keep going and keep track of your error. Oops, you'll find that your error grows larger than the prediction itself with remarkable speed! THAT is the chaos. It doesn't matter if you measured the starting conditions with literal infinite perfect precision. The error bars on your predictions are going to explode exponentially (the rate is the system's Lyapunov exponent), so every extra digit of precision you carry buys you only a fixed amount of extra prediction time. You're not going to catch that. Nothing can. (I imagine the actual physical limit would be that your predictions can be accurate within a distance equivalent to the radius of a sphere expanding at the speed of light from the initial position, but I haven't actually read anything suggesting that; it just seems sensible.)
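You can watch the precision-chopping do its damage directly. A sketch, swapping in the logistic map for the pendulum (a much simpler chaotic system, same punchline) and using Python's decimal module to control how many digits you keep; the map parameter and the precision ladder are arbitrary picks:

```python
# Iterate the logistic map at several working precisions and compare
# each run against a much higher-precision reference.
from decimal import Decimal, getcontext

R = Decimal("3.9")   # parameter deep in the logistic map's chaotic regime
X0 = Decimal("0.5")

def run(digits, steps):
    getcontext().prec = digits  # every operation rounds to this many digits
    x = X0
    for _ in range(steps):
        x = R * x * (1 - x)
    return x

steps = 200
reference = run(500, steps)     # treat 500 digits as "truth"
for digits in (10, 20, 40, 80):
    err = abs(run(digits, steps) - reference)
    print(f"{digits:3d} digits of precision -> error after {steps} steps: {err:.1e}")
```

The point to notice: going from 10 to 20 to 40 to 80 digits doesn't make the problem go away, it just postpones the blow-up roughly in proportion to the digits you paid for.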
Then you've got stupid-simple cellular automata like rule 30, a 1D system. Totally discrete. Totally basic. No precision even needed anywhere. And yet you can't predict it. There's no known shortcut: if you skip even the slightest act of computation necessary to evolve the system as a whole from one step to the next, you accumulate an error greater than the state of the system itself.
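Rule 30 fits in a few lines, straight from its lookup table; this sketch also flips a single cell in the starting state to show the disagreement spreading outward at the automaton's own 'speed of light', one cell per step on each side (grid width and step count are arbitrary):

```python
# Rule 30: each cell's next value is a fixed function of (left, self,
# right), read off the bits of the number 30. Flip one starting cell
# and count how many cells the two runs disagree on over time.
RULE = 30
N, STEPS = 101, 40

def step(cells):
    return [(RULE >> (cells[(i - 1) % N] * 4
                      + cells[i] * 2
                      + cells[(i + 1) % N])) & 1
            for i in range(N)]

a = [0] * N
a[N // 2] = 1        # single live cell in the middle
b = list(a)
b[N // 2 + 1] = 1    # same state with one extra cell flipped

for t in range(STEPS):
    diff = sum(x != y for x, y in zip(a, b))
    print(f"step {t:2d}: {diff:3d} cells differ  "
          + "".join("#" if x else "." for x in a))
    a, b = step(a), step(b)
```

Note how the 'cells differ' count grows by at most 2 per step: that's the discrete version of the light-cone limit speculated about above.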
> You'll have to chop it off at some point. That's OK. That's what error bars are for. Just keep going and keep track of your error. Oops, you'll find that your error grows larger than the prediction itself with remarkable speed! THAT is the chaos.
If this were true, it should be trivial to prove by observation that we're not in a mathematical universe, or that physics is not computable, even in principle. As far as I understand, physics as we know it is widely acknowledged to be computable, albeit with exponential slowdown on classical computers.
By the Bekenstein bound and other thermodynamic constraints, a finite region of space with finite energy necessarily contains a finite amount of information; pack in any more and it collapses into a black hole. Ergo physics is at worst a finite-state machine. So where's the disconnect?
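For concreteness, the bound is I ≤ 2πRE/(ħc·ln 2) bits for a region of radius R holding energy E. A quick sanity check of the kind of number it gives (the ~1.5 kg, ~0.1 m sphere below is an arbitrary illustration, with all of its mass counted as energy via E = mc²):

```python
# Back-of-the-envelope Bekenstein bound: I <= 2*pi*R*E / (hbar*c*ln 2).
import math

hbar = 1.054_571_817e-34  # J*s
c = 2.997_924_58e8        # m/s

def bekenstein_bits(radius_m, energy_j):
    return 2 * math.pi * radius_m * energy_j / (hbar * c * math.log(2))

m, r = 1.5, 0.1  # arbitrary example: 1.5 kg packed into a 0.1 m radius
print(f"{bekenstein_bits(r, m * c**2):.2e} bits")  # ~4e42
```

Astronomically large, but finite.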