Along with this "how long to spin up the new hire" issue, one of my first questions (if not the first) when trying to help people improve their software processes is:
> If the user/client asks you to make a small but not trivial change, how long would it take to update and deploy the program?
I have had answers ranging from "A couple hours" to "A year" (yes, they were serious). Most were in the 1-3 month range, though, which is pretty bad for a small change. It also makes it apparent why a bunch of changes get batched together whether reasonable or not. If a single small change, single large change, or collection of variable sized changes all take a few months to happen, might as well batch them all up. It becomes the norm for the team. "Of course it takes 3 months to change the order of items in a menu. Why would it ever be faster than that?"
I'm not sure how the speed of a change is related to batches. The "batch" is tied to "due this week"; there was never just a single item on that list. "Speed" is tied to "due which week", and that depends mostly on the priority of the change, not on how easy it is.
Upd. And "change the menu item order, fast" is a sign of a problem. We once found a Mac Cube in a ski vacation rental home. It ran MacOS 10.2 or something. All the menu items were in the places we expected them to be! You think carefully first, then you implement the menu item order. Upper-left Apple menu -> About This Mac. We managed to break their network config in about 5 minutes!
Running the exercise with a small change emphasizes the constraints and behavior of the rest of the process. As an example, one team's test process was entirely manual and took a month. They ran that entire test suite for every release, whether or not the release affected the requirements being tested. Why? Mostly because they didn't know which changes in the release would affect which requirements, but that was another problem. This did encourage larger batch sizes, though: if you have that large a cost in your process, and the cost is fixed regardless of batch size (100 changes or 1, you spend a month testing), you might as well batch more changes into the release. More releases means you incur that large fixed cost more often, which reduces your overall throughput.
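A quick back-of-the-envelope model makes the incentive concrete. The numbers here are hypothetical (a fixed one-month test cost per release and a notional per-change development cost), not from any specific team:

```python
# Hypothetical model: each release pays a fixed test cost (one month),
# no matter how many changes it contains.
def changes_per_month(batch_size, fixed_test_months=1.0,
                      dev_months_per_change=0.1):
    """Throughput (changes delivered per month) for a given batch size."""
    total_months = batch_size * dev_months_per_change + fixed_test_months
    return batch_size / total_months

for batch in (1, 10, 100):
    print(f"batch of {batch:3d}: {changes_per_month(batch):.2f} changes/month")
```

With these made-up numbers, a batch of 1 delivers about 0.9 changes per month while a batch of 100 delivers about 9, so the fixed cost alone pushes the team toward big batches even before anyone argues about priorities.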
Either I don't understand the update to your comment, or you don't understand the point of that example in mine. It was illustrating the submission topic: normalization of deviance. Sure, you should think about where things should be, but if a customer comes in and says, "Swap these two items" and you can't provide a working version with that single change for months, then things have gone off the rails somewhere. I put it in quotes to reflect the kind of statement I've heard from teams I worked with. To them a long effort for a trivial change is normal, when it should be considered deviance.
That can extend the time, yes. I had a caveat about that but apparently edited it out before submitting. But that only explains or justifies the delays if value is added. I work in aerospace, and these were avionics and related systems. The bad ones did not have quality as a reason, though the good ones did. The bad processes were swamped with manual testing (non-comprehensive and error prone) or really tedious CM-related activities (also manual and error prone).
You can do a lot with good test automation, even in avionics. That cuts down a ton of the time and usually improves quality.
I'll also note, don't take my "deployed" too literally. I used that term because so many people here are working on server-based applications where that makes sense. Think "out the door". The exercise can only go as far as the team/org's reach. Deployed for avionics would mean more like, "At the flight test team". After that, it's up to someone else to schedule it and get it returned with issues or fielded.
Going beyond the team's reach without including those people (and thus making them part of the team, after a fashion) is guesswork and opens up the blame game. "It's all flight test's fault it takes a year to get out to the customer." Well, it takes you 9 months to get it to flight test and them 3 months to get it done. So why does it take you 9 months? If you have a good reason (complex system, lots to test), then that's valid. If it's a simpler system, 9 months to get it to flight test is probably not justifiable.
I totally agree that "process" on its own doesn't justify long turnarounds. If you had one part of your code that was taking 2/3 of the run-time, it should get plenty of scrutiny, and the same is true of processes.