100% agreed. For any kind of incrementing counter, IMO the limit should be multiple orders of magnitude beyond anything the system could plausibly reach in its expected lifespan.
Taking OP at face value, these devices' counter has its zero point on February 1, 2003, so they must have been designed after that date; and if nobody even knows what happens when the counter rolls over, the rollover case was never specified or tested.
In the world of the 2000s even 16-bit processing was old school, so there's no good reason this couldn't have been a 16-bit counter instead. A 16-bit week counter holds 65,536 weeks, roughly 1,256 years, so the rollover wouldn't arrive until the year 3259.
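You can check that arithmetic in a few lines. This is just a sketch, assuming the February 1, 2003 zero point OP describes; the `rollover_date` name is mine, not anything from the devices in question:

```python
from datetime import date, timedelta

EPOCH = date(2003, 2, 1)  # zero point claimed in OP's post (an assumption here)

def rollover_date(bits: int) -> date:
    """Date on which a week counter of the given width wraps back to zero."""
    return EPOCH + timedelta(weeks=2 ** bits)

print(rollover_date(10))  # 1,024 weeks, under 20 years: rolls over in 2022
print(rollover_date(16))  # 65,536 weeks, ~1,256 years: rolls over in 3259
```

The 10-bit counter wraps within a plausible service life; the 16-bit one outlives any conceivable deployment by a thousand years.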
IMO this is the correct way to handle sizing when you think you don't need any more: take your reasonable limit and round it up to the next major "bit border". If 10 bits would do, use 16. If you need 16, make it 24 or 32. The limit shouldn't just be "above where we expect it to reach in its lifespan" but "if you ever hit this, the device belongs in a museum".
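That rounding rule can be sketched in a few lines. The width table and the `padded_width` name are mine, purely for illustration:

```python
# Common register widths, the "bit borders" in the rule above (an assumption;
# pick whatever borders your hardware actually offers).
STANDARD_WIDTHS = (8, 16, 24, 32, 64)

def padded_width(bits_needed: int) -> int:
    """Return the next standard width STRICTLY above the minimum,
    so the counter always gets a full border of headroom."""
    for width in STANDARD_WIDTHS:
        if width > bits_needed:
            return width
    raise ValueError(f"no standard width above {bits_needed} bits")

print(padded_width(10))  # a 10-bit need becomes a 16-bit counter
print(padded_width(16))  # a 16-bit need becomes a 24-bit counter
```

Note the strict inequality: needing exactly 16 bits still bumps you to 24, which is what makes the limit absurdly unreachable rather than merely comfortable.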
Designing a device with a 10-bit week counter in the 2000s is bad design, prioritizing either laziness or cost over quality.