Doesn't the inclusion of the digital accuracy checkers then decrease the efficiency, and mean you might as well use a completely digital computer?
Just speculating here, but interfacing digital with analogue is probably a poor middle ground between the versatility and ubiquity of purely digital computers (countless existing systems, with optimised algorithms and chips, to do whatever you want) and purely analogue ones (which presumably gain an efficiency advantage by not having to cater to versatile use cases).
> Doesn't the inclusion of the digital accuracy checkers then decrease the efficiency, and mean you might as well use a completely digital computer?
Not really. Whatever output the analog computer produces can be digitized with no detriment to its performance, much as a sensor measuring a physical property can have its output fed into a digital system with negligible interference with the original measurement.
Also, the same rationale can be used to probe intermediate steps and automatically check their accuracy, even if only during a validation phase. That's a possibility that simply wasn't available, say, 60-odd years ago.
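To make that concrete, here's a rough sketch of what such a validation pass might look like. Everything here is hypothetical: `read_analog_node()` stands in for an ADC readout of an intermediate node (simulated with noise), `digital_reference()` is whatever exact digital computation the stage is supposed to perform, and the tolerance is arbitrary.

    # Hypothetical validation harness: compare digitized readings of an
    # analog computer's intermediate nodes against a slow-but-exact
    # digital reference. Runs only during validation, not in production.
    import random

    def digital_reference(x: float) -> float:
        # Exact digital computation of the intermediate step
        # (placeholder: pretend the stage computes x^2 / 2).
        return 0.5 * x * x

    def read_analog_node(node_id: int, x: float) -> float:
        # Stand-in for an ADC readout of the analog circuit's node.
        # Here we just simulate the analog result plus some drift/noise.
        return 0.5 * x * x + random.gauss(0.0, 1e-3)

    def validate(inputs: list[float], tolerance: float = 1e-2) -> bool:
        """Flag any intermediate node whose digitized value drifts
        too far from the digital reference."""
        ok = True
        for i, x in enumerate(inputs):
            analog = read_analog_node(i, x)
            expected = digital_reference(x)
            if abs(analog - expected) > tolerance:
                print(f"node {i}: analog={analog:.4f} "
                      f"expected={expected:.4f} DRIFT")
                ok = False
        return ok

    if __name__ == "__main__":
        passed = validate([0.1, 0.5, 1.0, 2.0])
        print("validation passed" if passed else "validation failed")

The point is just that the checker sits off the critical path: the analog machine runs at full speed, and the digital side only samples and compares, so the overhead of the accuracy check doesn't erase the analog efficiency gain.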