My interests are more in general purpose tools for increasing accuracy, and perhaps using them to increase the accuracy of inaccurate tools through feedback systems.
That is, throw out chasing perfection of rigidity, dimension, and flatness of a mill (etc.). Instead, keep adding measurement and feedback until the precision of the machine parts doesn't matter. (Or continue to amuse myself and annoy my partner by making messes and spending my disposable income trying.)
Having tried this, with some fancy measurement tools and a pretty custom, high-performing control system, I've concluded that it's a route to madness. Fancy software and feedback turn out to be a very hard way to emulate large, straight pieces of cast iron, and there's no substitute for rigidity once you actually start cutting - vibration is a huge issue, and very tricky to actively damp. Essentially every machine tool that does have crazy accurate realtime feedback (diamond turning for optics, ultra precision milling, etc.) starts with a rigid and long-term stable frame and adds the $$$ controls.
Granted, my switch over to "more rigid more better" may have gone a bit too far - I now own a 10ee (3200lbs), and am looking for a good jig borer (2400lbs+).
You're right that one can't compensate for lack of rigidity with software, but you can have an inaccurate but rigid structure and compensate on top of that. It's pretty common practice to use some sort of software compensation to improve accuracy (at least in certain applications where it matters).
Metal milling machines need to be super rigid because of the cutting forces... You can build super accurate machines (e.g. with granite) that can't be used for milling but can place a tool with (sub-)micron accuracy.
Even with metal you can take lighter cuts and trade off some rigidity for accuracy, but then it's gonna take much longer to get anything done...
It can. Basically a repeatable machine can be made accurate by calibration. It doesn't have to be accurate to start with. If it's not repeatable (i.e. won't hold its calibration) then yeah, it won't ;) But accuracy != repeatability. I've worked on precision machinery (1um accuracy) and we used calibration techniques.
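To make the distinction concrete, here's a toy sketch in Python (all numbers invented) of why accuracy != repeatability, and why calibration only buys you something when the machine is repeatable:

    import statistics

    # Positions a hypothetical machine lands on when commanded to the
    # same 100.000 mm point five times (made-up values, in mm).
    target = 100.000
    measured = [100.0071, 100.0072, 100.0070, 100.0071, 100.0073]

    bias = statistics.mean(measured) - target   # accuracy error: ~7 um
    spread = statistics.stdev(measured)         # repeatability: ~0.1 um

    # The stable 7 um bias is exactly what calibration removes; the
    # 0.1 um spread is what it can't touch. Calibration pays off when
    # spread is much smaller than bias.
    print(f"bias {bias * 1000:.1f} um, spread {spread * 1000:.2f} um")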
This is also my field. The issue is, the machine will be repeatable in the short term but not the long term. It will look like it's behaving well during calibration and for the next few months of operation.
But if the ways and mating surfaces aren't almost-perfectly straight and flat, they'll experience accelerated wear. (If you're using hydrostatic bearings, they won't work to begin with unless the surface is accurate). Then the calibration is gone. And that's just in the static case.
If your ballscrew has uneven pitch, is eccentric, or any number of other issues, you can calibrate it out. But now to move at a constant speed, your servo controller has to drive that inertia at a wobble, and everything shakes.
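A back-of-the-envelope sketch of that wobble (made-up 2um periodic pitch error): if linear position x is a nonlinear function of screw angle theta, then holding a constant feed rate v means the servo has to track omega = v / (dx/dtheta), which is no longer constant:

    import numpy as np

    nominal_pitch = 5.0  # mm per revolution (assumed)
    theta = np.linspace(0.0, 200 * 2 * np.pi, 10000)  # screw angle, rad
    pitch_error = 0.002 * np.sin(theta / 7.0)         # invented periodic error, mm
    x = nominal_pitch * theta / (2 * np.pi) + pitch_error  # actual carriage position, mm

    v = 10.0                           # desired constant feed, mm/s
    dx_dtheta = np.gradient(x, theta)  # local mm-per-radian "gain" of the screw
    omega_required = v / dx_dtheta     # rad/s the servo must actually track

    print(omega_required.min(), omega_required.max())  # not constant -> wobble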
The machines I worked on had air bearings. No wear. Basically we're talking about something like an error of 10um over 2 meters. It's practically impossible, or cost prohibitive, to remove that error without calibration. The granite surface the air bearings ride on is locally flat but not perfectly accurate over its entire length.
I agree this setup is almost perfectly straight and flat. But it's still not accurate without the calibration.
Just to be clear, these weren't milling machines...
My ideas are in the realm of developing detailed physical models of machines and developing control systems out of them using precise measurement tools to both tune the model and act as feedback mechanisms. (I'm not talking about machine learning)
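As a drastically simplified toy example of what I mean (names and numbers hypothetical): model one compliance of the machine as a linear spring, fit it from a handful of probe measurements, then feed the predicted deflection forward into the commanded position:

    import numpy as np

    # Hypothetical probe data: (cutting force in N, observed Z deflection in mm).
    force = np.array([0.0, 100.0, 200.0, 400.0])
    sag = np.array([0.0, 0.004, 0.009, 0.017])

    # Tune the model: fit compliance c in sag ~= c * force
    # (least squares through the origin).
    c = float(force @ sag / (force @ force))  # mm per newton

    def z_command(z_target, predicted_force):
        # Offset the commanded Z by the deflection the model predicts.
        # (Sign depends on your machine's conventions; illustrative only.)
        return z_target + c * predicted_force

The real version would have many more terms and coupled axes, with the measurement tools both refitting the model over time and closing the loop during cutting.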
I don't think my living room floor would support three tons of machine tools unfortunately.
It is, in my opinion, no coincidence that backlash and battiness both start with a B. Keeping up with (semi-)predictable flexure would be... well, not a doddle as such, but manageable... if you didn't also have to deal with backlash. Sure, there's a point where everything is, to a first approximation, made of rubber, but it's a lot easier to move that point by modifying or improving the tool than by trying to trim the end of a diving board with a laser-assisted pole-mounted pruning saw.
My thinking goes, the end of the road is always "everything is rubber" when it comes to precision, so I might as well start off with that reality and see where it can take me.
On the other hand: wear. Even the best machines experience wear, and there's a lot of work in keeping up with it or recovering from it. Keeping precision despite drift from wear, and being able to trivially quantify it, are really valuable.
The very best machines don't really ever wear out. The surfaces mate over the full area so even sliding bearings experience hydrodynamic contact. Or, they're built with fluid bearings to begin with.
(Assuming they're handled properly and kept clean, of course).
As msds was saying, if you're e.g. looking to cut metal you need a rigid structure. Otherwise the tool just bounces around under the cutting forces, which really can't be compensated for. Your entire structure deflects and vibrates under those forces. Accuracy is a different story: if you have a repeatable but not accurate system you can certainly fix the accuracy via calibration or feedback (feedback can be what gets you the repeatability).
Not only cutting forces; the machine will also warp and deflect as the axes move around and shift its weight. Even when all the sensors read that the tool position is correct, there are 6 degrees of freedom and a lot of coupled error stacking up from what each sensor doesn't measure. You can't really have accuracy if the machine isn't rigid, or if it has any play. And even if you do a calibration, if the ways aren't accurate to begin with then they'll wear and change.
You can certainly correct for all these errors if you are able to measure them. E.g. if your tool moves 1um down at position (x,y,z) due to the machine deflecting, all you have to do is adjust z by 1um at that point. As long as there are no other issues and the only problem is the accuracy of placing that tool, you're good.

One example is precision ball screws; those typically come from the factory with data for compensating the screw. The screw isn't perfectly accurate on its own, but if you apply the correction data you get better accuracy. The screw isn't less repeatable because of this, and it doesn't wear down faster; it's just that its absolute accuracy isn't perfect out of the factory, so they measure it and give you the corrections (let's say on the order of 5-10um over 1m). In use, accuracy is also a function of temperature, so you may want to measure that and correct for it... or you may use a linear encoder or an interferometer... the point is, the screw does not have to be perfectly accurate, it just has to be repeatable, and the remaining error can be corrected...
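A minimal sketch of applying that kind of factory compensation table plus a thermal term (the table format, numbers, and coefficient here are assumptions, not any vendor's actual format):

    import numpy as np

    # Factory-measured pitch error table: position (mm) -> error (um).
    pos_mm = np.array([0.0, 200.0, 400.0, 600.0, 800.0, 1000.0])
    err_um = np.array([0.0, 1.5, 3.0, 5.5, 7.0, 8.0])  # ~8 um over 1 m

    ALPHA_STEEL = 11.7e-6  # per degC, linear expansion of a steel screw
    T_REF = 20.0           # temperature the table was measured at, degC

    def corrected_target(target_mm, screw_temp_c):
        # Interpolate the factory table, then add a first-order thermal
        # growth correction; command the machine to the adjusted position.
        pitch_corr = np.interp(target_mm, pos_mm, err_um) * 1e-3  # um -> mm
        thermal = target_mm * ALPHA_STEEL * (screw_temp_c - T_REF)
        return target_mm - pitch_corr - thermal

    # E.g. at 600 mm and 3 degC over reference: ~5.5 um of pitch error
    # plus ~21 um of thermal growth get pre-subtracted from the command.
    print(corrected_target(600.0, 23.0))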