My use case is that I have a graph of flops, latches, buffers, AND, OR, NOT gates and I want to visualize how data is changing/getting corrupted as it goes through each of them.
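For the combinational part, the core of what you describe can be sketched in a few lines: evaluate a netlist of gates in topological order and record every intermediate net's value, so you can see exactly where a value changes on its way through. This is a minimal illustration, not your actual setup; the netlist encoding and names here (a half adder built from AND/OR/NOT) are my own assumptions.

```python
from typing import Callable, Dict, List, Tuple

# A gate is (output net, logic function, list of input nets).
Gate = Tuple[str, Callable[..., int], List[str]]

def simulate(netlist: List[Gate], inputs: Dict[str, int]) -> Dict[str, int]:
    """Evaluate gates in order, returning the value of every net.

    Assumes the netlist is already in topological order (inputs before
    the gates that read them) -- fine for combinational logic; flops and
    latches would need a clocked, two-phase update instead.
    """
    values = dict(inputs)
    for out, fn, ins in netlist:
        values[out] = fn(*(values[n] for n in ins))
    return values

AND = lambda a, b: a & b
OR  = lambda a, b: a | b
NOT = lambda a: 1 - a

# Half adder as a small example netlist.
netlist: List[Gate] = [
    ("na",    NOT, ["a"]),
    ("nb",    NOT, ["b"]),
    ("t1",    AND, ["a", "nb"]),
    ("t2",    AND, ["na", "b"]),
    ("sum",   OR,  ["t1", "t2"]),
    ("carry", AND, ["a", "b"]),
]

trace = simulate(netlist, {"a": 1, "b": 1})
print(trace)  # every net's value, i.e. the full signal trace
```

The `trace` dict is the thing to visualize: color each node in your graph by its value (or by where it diverges from an expected trace) rather than trying to lay out the schematic itself.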
I will say (and please forgive that digital circuits are not my field) that there are almost certainly better techniques and approaches in the field to accomplish what you are trying to do. I would personally move away from this approach and seek insight from the domain that manages to produce multi-billion transistor microprocessors.
Perhaps there are tools for large-scale logic circuit simulation?
I did that a few years ago. It was a nice visualization up until restoring division. With more components than that, the layout just becomes too cluttered to be meaningful. And it is very difficult to encode into an algorithm all the heuristics one uses when drawing "pretty" circuit diagrams by hand.