Hacker News

The assumption you're making is that teams in a polyrepo will spend a vast amount of engineering effort maintaining a stable interface. Paraphrasing Linus: “we never break userspace.”

In practice, internal teams don't have that kind of bandwidth. They need to change their implementations to fix bugs, add optimizations, and ship critical features, and can't afford to backport patches to the four versions floating around the codebase.

Polyrepos work for open source precisely because open source libraries generally don't have strong coupling between implementers and users. The exact opposite is true for internal libraries.




> In practice internal teams don’t have this type of bandwidth

You don't need bandwidth to maintain backward compatibility in a polyrepo. As you said yourself, you need loose coupling.

When you do break backward compatibility, the bandwidth required to address it is the same in mono- and polyrepos (with some exceptions that favor polyrepos).
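To make "loose coupling" concrete, here's a minimal sketch (function names and parameters are hypothetical, not from any real library) of the kind of backward-compatible change that costs a library team almost nothing, precisely because callers only depend on the old surface:

```python
# Hypothetical internal library evolving without breaking callers.
# v2 adds a `timeout` option; v1 callers are untouched.

def fetch_user(user_id, timeout=30):
    """New entry point. The default preserves the old behavior."""
    return {"id": user_id, "timeout": timeout}

def get_user(user_id):
    """Old v1 entry point, kept as a thin shim over the new one.

    Existing callers keep compiling and running; the team is free to
    change the implementation behind fetch_user() at its own pace.
    """
    return fetch_user(user_id)
```

With this style of change no dependent repo needs to be updated at all, which is the case the comment above is describing: the bandwidth cost only appears once you break the contract.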

The big difference, though, is whose bandwidth we're going to spend. Correct me if I'm wrong, but my understanding is that at Google it's the responsibility of the dependency's owners to update dependents. E.g. if the compiler team breaks the compiler, they are also responsible for fixing all of the code it compiles.

So you're not developing your package at your own pace; you're limited by the company's pace. The more popular a compiler is, the slower it will be developed. You're slowing down innovation for the sake of predictability. To some degree you can just throw money at the problem, which is why big companies are the only ones who can afford it.

> can’t afford backporting patches to the 4 versions floating around the codebase

Backporting happens in open source because you don't control all of your users' dependencies. Someone can be locked into a specific version of your package through another dependency, and you have no way of forcing them to upgrade. But if we're talking about internal teams, upgrading is always an option, so you don't have to backport (though you still can, and in some cases it might make business sense).
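A toy sketch of that locking effect (package names and version constraints are entirely made up): a user of "yourlib" can be stuck on an old version through a transitive pin they don't control, which is what forces the open-source maintainer to backport fixes.

```python
# Toy dependency graph: "app" depends on yourlib and otherlib,
# but otherlib pins yourlib to an exact old version.
deps = {
    "app":      {"yourlib": ">=1.0,<3.0", "otherlib": "==2.1"},
    "otherlib": {"yourlib": "==1.4"},  # transitive pin the app can't lift
}

def pinned_versions(root, graph):
    """Collect exact (==) pins reachable from `root`."""
    pins = {}
    for dep, constraint in graph.get(root, {}).items():
        if constraint.startswith("=="):
            pins[dep] = constraint
        pins.update(pinned_versions(dep, graph))
    return pins

# "app" can't move to yourlib 2.x until otherlib relaxes its pin,
# so a fix for yourlib 1.x has to be a backport.
print(pinned_versions("app", deps))
```

Internally, none of this applies: the same team (or company) controls every edge of the graph, so the pin can simply be removed and everyone upgraded.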

> open source libraries generally don’t have a strong coupling between implementers and users. That’s the exact opposite for internal libraries.

I disagree. There are always plenty of opportunities for good boundaries in internal libraries.

Though I'll grant you: if you draw bad boundaries, a polyrepo will have the problems you're describing. But that's the difference between the two: a monorepo is slow and predictable, a polyrepo is fast and risky. You can reduce polyrepo risk by hiring better engineers; you can speed up a monorepo (to a degree) by hiring more engineers.

When there's competition, slow and predictable always loses. That's partially why I believe Google can't develop good products in-house: pretty much all of their popular products (other than search) are acquisitions.



