
Is this accessible via a local endpoint? I'd want to try calling it from other apps, for example.


It is! You can reach it at http://localhost:11434. More documentation to come, but to generate text there's an /api/generate endpoint: https://github.com/jmorganca/ollama#rest-api
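
For example, a minimal call from TypeScript might look like the sketch below. The model name ("llama2") is just a placeholder for whatever model you've pulled, and I'm assuming the endpoint accepts "stream": false so you get a single JSON object back instead of newline-delimited chunks:

    // Sketch: generate text via the local Ollama endpoint.
    async function generate(prompt: string): Promise<string> {
      const res = await fetch("http://localhost:11434/api/generate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        // "llama2" is a placeholder; use whichever model you've pulled.
        body: JSON.stringify({ model: "llama2", prompt, stream: false }),
      });
      const json = await res.json();
      return json.response; // the generated text
    }

    generate("Why is the sky blue?").then(console.log);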


I write the simplest incarnation of an idea into an app and just share it with colleagues, with no expectations. I'd say my hit rate is about 1/5, which I consider very high. It's that high because my audience is my coworkers and I have a good idea of what they want. It's also that low because I just cobble together stuff that barely works, with the ultimate intent of seeing whether I'd get more questions/requests out of it.


Any IP ownership issues with this? E.g., your employer claiming you did it on their time?


> Software development is a labor-intensive process, meaning that the number of software engineers can determine a carmaker's competitiveness.

The ebb and flow of the layoff cycle...


I was just using tesseract.js and the repo looks active. Tesseract is still crap, but it's the free crap, so I'll just put up with it. Converting to grayscale first seems to improve the OCR. I'm sure there are tons of other techniques to improve the results.
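
For what it's worth, the grayscale-then-recognize flow looks roughly like the sketch below. I'm assuming tesseract.js v5's createWorker(lang) API and the sharp library for the grayscale step; "input.png" is just a placeholder path:

    // Sketch: grayscale preprocessing before OCR with tesseract.js.
    import { createWorker } from "tesseract.js";
    import sharp from "sharp";

    async function ocr(path: string): Promise<string> {
      // Convert to grayscale first; this alone seems to help accuracy.
      const gray = await sharp(path).grayscale().toBuffer();

      const worker = await createWorker("eng");
      const { data } = await worker.recognize(gray);
      await worker.terminate();
      return data.text;
    }

    ocr("input.png").then(console.log);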


Anthropic?! Regardless of what it is, congrats!


brainchain.ai

The general idea is to gain insights into supply chains using LLMs and other machine learning models for specific applications.


Any reading material about this?


Surfing Uncertainty is the book you’re looking for!


Sounds like GPT-speak to me...


I think CoT would still be useful for guiding the LLM toward customized context. Most of these examples refer to vanilla arithmetic/logic-reasoning CoT.


I think with LangChain the LLM picks from the tools/APIs you provide it.

With Jarvis, the LLM picks from the available HF models.
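
Roughly, the pattern in both cases is: describe the available tools (or models) to the LLM, ask it to pick one, then dispatch. A toy sketch of that loop (this is not LangChain's actual API, and the LLM call is stubbed out; wire it to a real endpoint like the Ollama one above):

    // Toy sketch of the tool-selection pattern, not LangChain's API.
    type Tool = { description: string; run: (input: string) => string };

    const tools: Record<string, Tool> = {
      search: { description: "Look something up on the web", run: (q) => `results for ${q}` },
      calculator: { description: "Evaluate simple arithmetic", run: (e) => String(eval(e)) }, // toy only
    };

    // Placeholder for a real LLM call.
    async function askLLM(prompt: string): Promise<string> {
      return JSON.stringify({ tool: "calculator", input: "2 + 2" }); // canned reply
    }

    async function agentStep(task: string): Promise<string> {
      // Present the tool menu, let the LLM name one, then dispatch to it.
      const menu = Object.entries(tools)
        .map(([name, t]) => `${name}: ${t.description}`)
        .join("\n");
      const reply = await askLLM(
        `Task: ${task}\nTools:\n${menu}\nReply as JSON {"tool": "...", "input": "..."}`
      );
      const { tool, input } = JSON.parse(reply);
      return tools[tool].run(input);
    }

    agentStep("What is 2 + 2?").then(console.log);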


Owning up to the "wrong" is good in my book.


You mean the "full accountability" in recent mass layoff notices? =/


That's just some weird moral compass.

It's almost totally irrelevant if people own up to being wrong, particularly about predictions.

I can't think of a benefit, really. You can learn from mistakes without owning up to them, and I think that's the best use of mistakes.


No, it’s not. Being willing to admit you were wrong is foundational if you ever plan on building on ideas. This was a galaxy brained take if I’ve ever seen one.


It's absolutely not weird. Saying "I was wrong" is a signal that you can change your mind when given new evidence. If you don't signal this to other people, they will be really confused by your very contradictory opinions.


I own up because it helps me grow personally and professionally, and if I’m not growing, what am I even doing?


I think this is a terrible take. It is so intensely important that one can admit they were wrong when new information comes to light.


Privately, sure. I don't think admitting it out loud makes better people.


The opposite of that is sticking to your statements, which is stubborn and foolhardy. Owning up to it is courageous.

Which actually leads me to respect politicians who do that, and instead ridicule people who post old videos of Joe Biden or Obama or Hillary Clinton endorsing marriage only for heterosexual couples. A virtuous person is also open to continually adapting their convictions based on present-day evidence and arguments - what is science otherwise?

