No, you don't need to download the indexes beforehand. The download always starts from the first piece (piece 0), which holds the database header and root pages. From there it effectively performs a binary search down the B-tree, downloading only the minimum set of pieces needed to satisfy the query.
SQLite is already optimized to minimize disk seeks, so it knows to do the bare minimum number of reads - which translates well even to much slower I/O like this (network access is far slower than disk access).
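To make that concrete, here's a rough sketch of the mapping involved. The constants and the `piecesForPage` helper are hypothetical, not from any real implementation; the point is just that one SQLite page read only ever touches one or two torrent pieces, and a B-tree lookup touches O(log n) pages:

```javascript
// Hypothetical sketch: mapping a SQLite page read to the torrent
// pieces that must be prioritized/downloaded to serve it.
const PIECE_SIZE = 256 * 1024; // assumed torrent piece size
const PAGE_SIZE = 4096;        // SQLite's default page size

// Return the indices of the pieces covering one SQLite page.
function piecesForPage(pageNumber) {
  const start = (pageNumber - 1) * PAGE_SIZE; // SQLite pages are 1-based
  const end = start + PAGE_SIZE - 1;
  const first = Math.floor(start / PIECE_SIZE);
  const last = Math.floor(end / PIECE_SIZE);
  const pieces = [];
  for (let p = first; p <= last; p++) pieces.push(p);
  return pieces;
}

console.log(piecesForPage(1));    // → [0]  (header + root page live in piece 0)
console.log(piecesForPage(1000)); // → [15] (a deep page still needs only one piece)
```

So a query that descends a few B-tree levels pulls a handful of pieces, no matter how large the overall database is.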
Right, and it also ships HTML/CSS/JS so that site owners can program their site's interactions as SQL queries, which are then translated into piece prioritization against the underlying torrent.
Imagine sharing wikipedia dumps, along with an index.html page. TorrentPeek will render the index.html and expose a `sqlTorrentQuery(yourQuery)` method so that site owners can program the site around that.
You'd be able to do stuff like:

  <form onSubmit={e => {
    e.preventDefault();
    sqlTorrentQuery('SELECT * FROM wikipedia WHERE wikipedia MATCH ' + e.target.query.value + ' LIMIT 50;',
      result => console.log(result));
  }}> ... </form>