>There are eight OpenELM models in total – four pre-trained and four instruction-tuned – spanning parameter sizes from 270 million to 3 billion.
2024 is shaping up to be the year of open-source models from some big players. Great steps for privacy.
Yes, although I think elm came first, then pine. (edit: 86 for elm, 92 for pine, pretty much how I remembered it, although 86 was a couple of years too early for me)
I didn't have Apple releasing the weights to their AI model on my 2024 bingo card, but I am glad to see it.