
Nobody said it is simple. Sure, the algorithmic complexity of the models is high, filters over filters, and likewise the resulting dump file is not editable (not without unbalancing the rest of the data, i.e. without tracking and modifying the bytes used by each token of the model; it is vectored data at the bit level, and in my case I don't see it exactly as part of a graph).

Nevertheless, the above does not exclude what one can see: a lossy compressed database (data is discarded), where the indexes are blended into the format of the data generated by the model weights. That is the main reason the weights are needed again to read the database as expected: they are used by the predictive algorithm that reconstructs the data from the query, the query forming the range of indexes that triggers the prediction direction(s).
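To make the analogy concrete, here is a toy sketch of lossy compression (my own illustration, not anything from the thread): the stored codes discard precision and are meaningless on their own; only the codes plus the decoder's parameters reconstruct an approximation, much as the commenter argues a model dump is unreadable without the weights.

```python
def compress(samples, levels=16):
    """Quantize floats into `levels` buckets. Returns the integer
    codes plus the (min, step) parameters needed to decode them.
    Precision below one step is discarded -- that's the lossy part."""
    lo, hi = min(samples), max(samples)
    step = (hi - lo) / (levels - 1)
    codes = [round((s - lo) / step) for s in samples]
    return codes, (lo, step)

def decompress(codes, params):
    """Reconstruct approximate data; requires both codes AND params."""
    lo, step = params
    return [lo + c * step for c in codes]

data = [0.03, 0.42, 0.87, 0.15, 0.99]
codes, params = compress(data)
approx = decompress(codes, params)
# The codes alone are opaque integers; codes + params together yield
# an approximation of the original data, never the exact bytes.
```

The parallel is loose (a quantizer is vastly simpler than learned weights), but it shows the structural point: the "database" and its "index" are only usable through the decoding parameters.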



* Where it reads

> Nevertheless, the above does not exclude what one can see,

it should be read as

> Nevertheless, the above (the unknown format of the data making up the dump file) doesn't mean that one can't see how the pattern works,


Maybe you are so much more knowledgeable than me that this comment only appears to be word salad.


Oh, my apologies, executive chef, now I understand: you appear to insinuate that the data is not stored, that it is a handful of unaligned logic gates spontaneously generating data.


Yup, you're just an idiot.


you own the copyright!



