Just like precision without accuracy: with accuracy but no precision you at least know you have a problem, but with only precision you are hopelessly misled.
If it were a strict trade-off, I would agree that the answer lies somewhere in the middle. But it isn't. Most of the things that give you precision will also give you accuracy. The few things that let you exchange one for the other look too much like cheating, and seem to always have much greater impacts on accuracy than on precision.
> Most of the things that give you precision will also give you accuracy.
This is absolutely false; digital clocks are a common counterexample: precise to the second (or millisecond, or more!), but only as accurate as their setting.
Calculations also give precision without accuracy, like my package of tortilla chips: 13 oz (368.5 g). They took a measurement or specification precise to the ounce, multiplied it by 28.3495, and got 368.5.
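To make that tortilla-chip arithmetic concrete, here's a minimal Python sketch of my own (only the 13 oz label value and the 28.3495 conversion factor come from the comment above; everything else is illustrative):

```python
# A unit conversion manufactures digits the original measurement never had.
OZ_TO_G = 28.3495   # conversion factor, far more precise than the label

label_oz = 13                     # spec is only precise to the whole ounce
grams = label_oz * OZ_TO_G
print(f"{grams:.1f} g")           # 368.5 g -- looks precise to 0.1 g

# The honest figure keeps only the significance of the input:
print(f"{round(grams, -1):.0f} g")  # ~370 g, matching the real uncertainty
```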
Accuracy and precision are independent variables, and the utility of either is usually limited to the order of magnitude of the other. With either "accurate to the second with a precision of milliseconds", or "accurate to the millisecond with precision to the second", you should only report a measurement to the second (or possibly a more precise measurement with an error range).
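As a rough illustration of that "report only to the second" rule, here's a small Python sketch of my own (the function name and the sample numbers are made up for the example): the reported value is rounded to the coarser of the accuracy and the precision, since digits beyond that carry no information.

```python
import math

def report(value_s, accuracy_s, precision_s):
    """Round a time measurement (in seconds) to the coarser of its
    accuracy and its precision."""
    limit = max(accuracy_s, precision_s)        # the limiting uncertainty
    step = 10 ** math.floor(math.log10(limit))  # its order of magnitude
    return round(value_s / step) * step

# "accurate to the second, precise to the millisecond" -> report whole seconds
print(report(12.3456, accuracy_s=1.0, precision_s=0.001))  # 12
# "accurate to the millisecond, precise to the second" -> still whole seconds
print(report(12.3456, accuracy_s=0.001, precision_s=1.0))  # 12
```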