Yeah, a database that size is likely to be a challenge unless the system it's running on has scads of memory.
One of my projects (DBHub.io) is putting effort into handling larger SQLite databases (~10GB), mainly by using bare metal hosts with lots of memory, e.g. 64GB, 128GB, etc.
Putting the same data into PostgreSQL, or even MySQL, would likely be much more efficient memory-wise. :)
We have 200 GB databases in Dolt format that are totally queryable. They don't work well for querying over the web, though - you need a local copy to query them effectively. Making web queries as fast as local ones is an ongoing project.
Uh, personal question here. Where does your ~10G number come from? I pretty much run my life on the Apple Notes app. My Notes database is about 12G and now I’m scared.
Oh, it's just that the vast majority of SQLite databases we see are pretty tiny, e.g. a few MB at most.
So resource usage is pretty much not a consideration when working with them.
Once people start working with multi-GB databases, though, things can start going poorly if the system they're running on has a small amount of memory (say 4GB or 8GB).
Not "crash and burn" poorly (so far). More like "tries to read 10GB of data into 4GB of RAM" poorly, i.e. dog slow, not lightning quick.
If your machine has a bunch of RAM in it, you're likely fine. Though I'd be making damn sure there are tested and working backups of it, just to be safe. :)
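On the backup front, SQLite has an online backup API that can copy the file safely even while it's in use, and Python's sqlite3 module exposes it directly. A minimal sketch, assuming made-up filenames:

```python
import sqlite3

# Hypothetical filenames - adjust for whatever database you're protecting.
src = sqlite3.connect("notes.db")
dst = sqlite3.connect("notes-backup.db")

# The online backup API copies the database page by page,
# safely, even if something else is writing to it at the time.
with dst:
    src.backup(dst)

# The "tested and working" part: verify the copy is readable and intact.
result = dst.execute("PRAGMA integrity_check").fetchone()[0]
print("backup integrity:", result)  # prints "ok" if the copy checks out

src.close()
dst.close()
```

Newer SQLite versions (3.27+) also have VACUUM INTO if you'd rather do it straight from the sqlite3 shell.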
I missed this answer the first time around. Thanks so much for the reply. I always get machines with the maximum amount of RAM precisely for this reason.