Hacker News | om8's comments

Have a similar project. Also written in Rust, runs in the browser using WebAssembly.

In-browser demo: https://galqiwi.github.io/aqlm-rs

Source code: https://github.com/galqiwi/demo-aqlm-rs



See also the article mentioned there: "THE CHEAPEST FLASH MICROCONTROLLER YOU CAN BUY IS ACTUALLY AN ARM CORTEX-M0+" (2023) [1]

[1] https://jaycarlson.net/2023/02/04/the-cheapest-flash-microco...


Why? They acquired books, that’s what they do


The OP is referring to the ongoing legal struggles the IA is facing with respect to their version of an online library (with digital book lending).


Precisely. To be clear, I don't agree with a comment upthread saying the "shoutout" is what might potentially do harm to the IA in court. I think the actual act of having scraped all those books from the IA's lending system could potentially do harm to the IA in court. The publishers can now point to all the copies of the books in the wild that IA had in their lending system and argue that IA's system is not legally acceptable. It was on shaky enough ground already.


I believe this was already brought up in the court proceedings, and Brewster Kahle already addressed it in April 2024: «Trying to blow protections we have put on files, for instance, does not help us– and usually hurts».

https://old.reddit.com/r/DataHoarder/comments/1bswhdj/commen...


The IA lending books with "weak" DRM also hurts efforts to reduce DRM and reform copyright, though, and that is much more important in the long term. It was always a deal with the devil that the IA should never have made, and their now being at odds with others who preserve those books and actually make them available only makes that clearer.

It's like a food kitchen under a tyrannical regime complaining that people passing their food to rebels might get them shut down.


The shout-out is evidence of the act.


Oh, ok. Thanks, I agree


Good rationalism includes empiricism though


llama.cpp is a mess and ollama is right to move on from it


Ollama is still using llama.cpp; they are just denying that they are :)


Of course it is. GPT-5 is one of the most anticipated things in AI right now. To live up to the hype, it needs to be a reasoning model.


It’s unfortunate that llama.cpp’s code is a mess. It’s impossible to make any meaningful contributions to it.


I'm the first to admit I'm not a heavy C++ user, so I'm not a great judge of the quality looking at the code itself ... but ggml-org has 400 contributors on ggml, 1200 on llama.cpp and has kept pace with ~all major innovations in transformers over the last year and change. Clearly some people can and do make meaningful contributions.


To run GPU inference, you need a GPU. I have a demo that runs an 8B Llama on any computer with 4 GB of RAM.

https://galqiwi.github.io/aqlm-rs/about.html
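The "8B model in 4 GB of RAM" claim comes down to weight quantization arithmetic. A minimal back-of-the-envelope sketch (assuming roughly 2 bits per weight for AQLM-style quantization; the real per-weight cost depends on codebook configuration and is not taken from the project itself):

```rust
// Rough memory footprint of model weights at a given bit width.
// These numbers are illustrative assumptions, not measurements.
fn model_bytes(params: u64, bits_per_weight: f64) -> u64 {
    (params as f64 * bits_per_weight / 8.0) as u64
}

fn main() {
    let params = 8_000_000_000u64; // 8B parameters

    // fp16: ~16 GB of weights, far more than 4 GB of RAM.
    let fp16 = model_bytes(params, 16.0);
    // ~2 bits/weight quantization: ~2 GB, which fits in 4 GB.
    let quant = model_bytes(params, 2.0);

    println!("fp16:  {} GB", fp16 / 1_000_000_000);
    println!("quant: {} GB", quant / 1_000_000_000);
}
```

At 16 bits per weight the model cannot fit; at ~2 bits it leaves headroom for activations and the KV cache.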


Any computer with a display has a GPU.


Sure, but integrated graphics usually lacks the VRAM for LLM inference.


Which means inference would run at approximately the same speed as the suggested CPU inference engine (just with the compute offloaded), since an integrated GPU shares the same system memory bandwidth.


I'm currently in Russia trying to get a US visa for my CS PhD. Because I do CS, I got into a thing called administrative processing. For 95% of people, it takes days -- weeks, tops. Because of the colour of my passport, it has already lasted for 3 months. I know people who are waiting for 2 years to pass it.

Why do you think I shouldn't have access to this website in Russia?


I believe he knows

