Chatbox: Cross-platform desktop client for ChatGPT, Claude and other LLMs (github.com/bin-huang)
110 points by thunderbong 10 days ago | 23 comments





I would avoid using Chatbox. If I remember correctly, past releases were built from code that is not in the repo, and there were also unaddressed issues with the license and positive VirusTotal detections.

There are alternatives out there that are less shady. One example is jan.ai


I don't know if anything's actually changed here, but the issues discussing that and mentioning the GPL are gone. Not "closed, fixed", but deleted. The author claimed they delay the source release to prevent other projects from copying them, but argued with anyone who pointed out that this isn't GPL-compliant. It does not smell good.

The code currently says it's 0.10.4 (the repo is for the "community edition"), but the website changelog says 1.9.2.


Side question: for users without accounts, did ChatGPT start gating repeat visitors to its chat site behind some kind of supercookie or server-detected unique identifier?

I used to be able to throw a quick query at their site; now it just pushes me to a signup page (even after clearing cookies, going incognito, etc.).

So much for this announcement: https://openai.com/index/start-using-chatgpt-instantly/


I've been using it for more than half a year now and am very happy with it. The update process for new versions is a mess though, with the update notification usually popping up four or more times.

I've been a long-time user of Chatbox as my Android client of choice. The recent web search update has been fantastic!

Hi, congrats on your product! At the top of the website you say, "Whether it's documents, images, or code, just send your files to Chatbox." At the bottom you state, "Everything stays on your device, giving you full control and peace of mind."

So which is it?


It's local, but I can see how the wording is ambiguous (I'm not the author, so I can't vouch that there are no secret shenanigans).

There is very little I find more frustrating than software that could so easily be nice to use but isn't.

It would be great to hear your view on Chatbox in that regard.

It needs some usability testing - I suspect even five minutes of watching a non-technical user try to use it would be very illuminating. The UI is unintuitive throughout. Consider: one of the most frequent actions in this type of app is starting a new conversation, and here it's represented by a little button near the bottom of the sidebar where you might look for 'Settings' - apparently collapsing the sidebar is much more important. Tiny text, low-contrast text, confusing collapsing/expanding sections, lack of whitespace and/or colour to differentiate the two halves of a message pair, confusing hover actions for copy, no floating or bottom-of-code-block copy button... I'm pretty sure configuring API keys was weird for some reason as well, but I can't remember what it was.

How does it compare to OpenWebUI?

LibreChat is probably the one you want to compare to - Claude is not supported by OpenWebUI, and the shim will cost you dearly in terms of tokens.

Oh yes, for Claude I use LiteLLM as a proxy so I can use it with OpenWebUI.

I'll try LibreChat too (never heard of it before), but I wonder if it has the same capabilities, like voice and Python tools. And Ollama support (95% of my AI interactions run locally).



I think that's probably the shim I was referring to - it has a hardcoded context length, but either it's implemented incorrectly, Anthropic ignores it, or maybe it's on OpenWebUI to manage the window and it just doesn't? Not sure. I found it kept getting slow, so I was starting new conversations to work around that. Eventually I got suspicious and checked - I'd burned through almost $100 within a few hours.

LibreChat isn't as nice in some areas, but it's much more efficient in this regard.


I do exactly this: I use LiteLLM to bridge it. In fact I use LiteLLM to bridge OpenAI and Groq too. Even though OpenWebUI supports them directly, with LiteLLM I can better control which models I see; otherwise my model list gets cluttered. I set this up back when OpenWebUI only supported one OpenAI endpoint, but I kept using it because it's just quite handy.
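
For anyone wanting to try this: I run the LiteLLM proxy with "litellm --config config.yaml" and point OpenWebUI's OpenAI API endpoint at it. Roughly, what the bridge does per request is translate an OpenAI-style chat call into Anthropic's API - the sketch below shows that same translation via LiteLLM's Python SDK, with a placeholder model id and key (not my actual config):

    # Sketch only: LiteLLM maps an OpenAI-style chat call onto Anthropic's API.
    # The proxy does this translation per request, so OpenWebUI can treat
    # Claude as just another OpenAI-compatible model.
    import os
    import litellm

    os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."  # placeholder key

    response = litellm.completion(
        model="anthropic/claude-3-5-sonnet-20240620",  # illustrative model id
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
    )
    print(response.choices[0].message.content)

The config file is basically just the list of models you want exposed, which is how I keep the model picker in OpenWebUI from getting cluttered.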

And no, it doesn't cost extra credits, isn't ignored, and doesn't have a hardcoded context length. It works perfectly.


Sorry, I was mistaken about having used LiteLLM - this is the one I was using: https://github.com/wangxj03/ai-cookbook.git, and you can see the hardcoded bits here: https://github.com/wangxj03/ai-cookbook/blob/main/anthropic-... It doesn't actually restrict anything though, and like I said, I noticed it was getting slow once it was sending 100-200K-token requests.

Also, it's pretty easy to find unresolved bugs about OpenWebUI not handling context-length parameters correctly - I believe I actually read something from the author saying that this parameter is effectively disabled (maybe for non-local LLMs?).
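
Purely to illustrate what "managing the window" would mean here - this is not code from that repo or from OpenWebUI, just a sketch of the kind of trimming a proxy would have to do before forwarding a long conversation (the 4-chars-per-token estimate and the budget are arbitrary placeholders):

    # Illustrative sketch: drop the oldest turns until the history fits a
    # rough token budget, so a proxy never forwards a 100-200K-token request.
    def trim_history(messages, max_tokens=20_000):
        def est_tokens(msg):
            # crude estimate: roughly 4 characters per token
            return max(1, len(str(msg.get("content", ""))) // 4)

        trimmed = list(messages)
        # keep a leading system prompt (if any) and always keep the latest turn
        start = 1 if trimmed and trimmed[0].get("role") == "system" else 0
        while (len(trimmed) > start + 1
               and sum(est_tokens(m) for m in trimmed) > max_tokens):
            trimmed.pop(start)  # drop the oldest non-system message
        return trimmed

Without something like this (or the API-side parameter actually taking effect), every follow-up re-sends the whole conversation, which is exactly how the cost blows up.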


File uploads don't work, and the Linux AppImage won't start.

Is this like LM Studio?

Comparable, but unlike LM Studio, it's open source.


Missed opportunity not naming this ChatterBox.

Very cool


