You can mitigate a bit of this by only ignoring blobs over a certain size, e.g. `--filter=blob:limit=256k`, which should allow most ordinary text files through.
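A partial clone with that size filter would look something like this (the URL is a placeholder, not from the original thread):

```
# Clone full history, but skip downloading blobs larger than 256 KiB.
# Ordinary text files come down as usual; large blobs are fetched on
# demand from the remote when a checkout, diff, or blame needs them.
git clone --filter=blob:limit=256k https://example.com/some/repo.git
```

Note the on-demand fetch is exactly why offline history inspection can break: any operation that needs a filtered-out blob has to reach the network.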
In the end it's the same as LFS, though, in that examining old commits without a network is a bummer. No free lunch here short of something a bit more complex like git-annex.
Closer, but we're still relying on a proxy for the developer's intent.
The gitattributes file provides a version-controlled and review-controlled mechanism to decide exactly which objects get special treatment and which ones don't. Since it's part of the repository itself, you don't have to remind new developers to pass unusual arguments to git at clone time to avoid a performance disaster.
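As a sketch, a `.gitattributes` that routes large binary formats through LFS might look like this (the patterns here are illustrative, not taken from any particular repository):

```
# Hypothetical patterns: route these binary formats through Git LFS.
# Everything not matched here stays in the repository as normal blobs.
*.psd filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
```

Because this file is committed like any other, changes to which paths get special treatment go through ordinary code review rather than depending on each developer's clone-time flags.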
> In the end it's the same as LFS though in that without a network examining old commits without a network is a bummer.
Except for a crucial detail: The tools I use to examine history are the log, diff[tool], and blame. All of those tools continue to function normally on an LFS-enabled offline clone. IIUC, `--filter=blob:none` doesn't work at all, and `--filter=blob:limit=256k` is a proxy which almost, but doesn't quite work.