Yeah, it’s interesting, this got me thinking: lossless compression is just removing redundancy, right? It doesn’t introduce any ambiguity in the data.
So feeding AI compressed data might let it be more efficient with its limited resources… I had never considered that; it’s a very interesting idea.
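To make the "removing redundancy without ambiguity" point concrete, here is a minimal sketch using Python's standard-library `zlib`: highly repetitive data shrinks dramatically, and decompression recovers the original bytes exactly, so no information is lost.

```python
import zlib

# Highly redundant input: the same phrase repeated 100 times (2000 bytes).
data = b"the quick brown fox " * 100

compressed = zlib.compress(data)

# Lossless round-trip: the original is reconstructed exactly, bit for bit.
assert zlib.decompress(compressed) == data

# The redundant input compresses to a small fraction of its original size.
print(len(data), len(compressed))
```

The compressed form is smaller precisely because the repetition carried no new information; only the non-redundant part has to be stored.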
You’re on the right track, with one distinction: lossless compression removes redundancy without discarding anything, while lossy and learned compression go further, extracting the salient (latent) features of the data and discarding the rest.
Autoencoder networks, or networks with an autoencoder-like structure (e.g. U-Net), employ essentially that kind of compression internally: they squeeze the input through a bottleneck to extract latent features.
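The bottleneck idea can be sketched in a few lines of numpy. This is a toy linear autoencoder (not a real deep-learning library setup): the data is 8-dimensional but actually lies on a 2-dimensional subspace, so an encode-to-2 bottleneck can reconstruct it almost perfectly. The dimensions, learning rate, and step count are illustrative choices, not from the discussion above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 500 points in 8 dims that really live on a 2-dim subspace,
# so a 2-unit bottleneck suffices to capture the latent structure.
latent_true = rng.normal(size=(500, 2))
mixing = rng.normal(size=(2, 8))
X = latent_true @ mixing

# Minimal linear autoencoder: encode 8 -> 2, decode 2 -> 8.
W_enc = rng.normal(scale=0.1, size=(8, 2))
W_dec = rng.normal(scale=0.1, size=(2, 8))

lr = 0.01
for _ in range(2000):
    Z = X @ W_enc        # compressed latent codes (the bottleneck)
    X_hat = Z @ W_dec    # reconstruction from the codes
    err = X_hat - X
    # Plain gradient descent on mean squared reconstruction error
    grad_dec = Z.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

mse = float(np.mean((X @ W_enc @ W_dec - X) ** 2))
print(mse)  # reconstruction error after training
```

Training drives the reconstruction error toward zero even though every point passes through only 2 numbers, which is the sense in which the network "compresses" internally: the bottleneck forces it to keep the latent features and drop the redundancy.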