Hacker News

Hi! One of the creators of Chorus here. Really cool to hear how everyone is using it. We made this as an experiment because it felt silly to constantly be switching between the ChatGPT, Claude, and LMStudio desktop app. It's also nice to be able to run models with custom system prompts in one place (I have Claude with a summary of how CBT works that I find pretty helpful).

It's a Tauri 2.0 desktop app (not Electron!), so it uses the Mac's native browser view with a Rust backend. That also keeps the DMG relatively small (~25mb, and we can get it much smaller once we get rid of some bloat).

Right now Chorus is proxying API calls through our server, so it's free to use. We didn't add bring-your-own-API-key to this version because it was a bit quicker to ship. This was kind of an experimental winter break project, so we didn't think too hard about it. We'll likely have to fix that (and add bring-your-own-key? or a paid version?) as more of you use it :)

Definitely planning on adding support for local models too. Happy to answer any other questions, and any feedback is super helpful (and motivating!) for us.

UPDATE: Just added the option to bring your own API keys! It should be rolling out over the next hour or so.




Curious to check it out, but a quick question — does it have autocomplete (GitHub Copilot-style) in the chat window? IMO one of the biggest missing features in most chat apps is autocomplete. Typing messages in these chat apps quickly becomes tedious, and autocompletions help a lot with this. I'm regularly shocked that it's almost year 3 of LLMs (depending on how you count) and none of the big vendors have thought of adding this feature.

Another mind-numbingly obvious feature: hitting enter should just create a newline, and cmd-enter should submit. Or at least make it configurable.

(EDITED for clarity)
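The configurable submit behavior being asked for could be sketched as a small pure function (the names `SubmitMode` and `shouldSubmit` are illustrative assumptions, not from any real app):

```typescript
// Hypothetical sketch: decide whether a keypress submits the message
// or falls through to inserting a newline.

type SubmitMode = "enter" | "cmd-enter";

interface KeyPress {
  key: string;
  metaKey: boolean; // cmd on macOS
  shiftKey: boolean;
}

function shouldSubmit(e: KeyPress, mode: SubmitMode): boolean {
  if (e.key !== "Enter") return false;
  if (mode === "cmd-enter") return e.metaKey; // cmd-enter submits; plain enter = newline
  return !e.shiftKey && !e.metaKey;           // enter submits; shift-enter = newline
}
```

In a real input you'd call this from the keydown handler and `preventDefault()` only when it returns true, so the textarea's default newline behavior is untouched otherwise.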


I don't think this would be good UX. Maybe once you've already typed ~20 chars or so. If it were that good at predicting from the first keystroke, it would already have included the info you're asking for in the previous response. It could also work for short commands like "expand" or "make it concise", but I can see it being distracting when the prediction is wrong.
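The gating idea above could be sketched roughly like this (the 20-char cutoff, the command list, and the function name are all illustrative assumptions):

```typescript
// Hypothetical sketch: only fire an autocomplete request once the draft
// is long enough, or when it prefix-matches a known short command.

const MIN_CHARS = 20;
const SHORT_COMMANDS = ["expand", "make it concise"];

function shouldTriggerAutocomplete(draft: string): boolean {
  if (draft.length >= MIN_CHARS) return true;
  // Short-command escape hatch: "exp" can still suggest "expand".
  return draft.length > 0 && SHORT_COMMANDS.some((cmd) => cmd.startsWith(draft));
}
```

In practice you'd also debounce this so a request only fires after the user pauses typing.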

> Typing messages in these chat apps quickly becomes tedious and autocompletions help a lot with this.

If you're on a Mac, you can use dictation: focus the text input, double-tap the Control key, and just speak.


In the Zed editor, GitHub Copilot autocomplete is enabled in the chat assistant, and it's incredibly useful when I'm iterating on code generations.

The autocomplete is so good that even for non-coding interactions I tend to just use the Zed chat assistant panel (which can be configured to use a different LLM via a dropdown).

More generally in multi-turn conversations with an LLM you’re often refining things that were said before, and a context-aware autocomplete is very useful. It should at least be configurable.

The Mac's default Dictation is OK for non-technical things, but for anything code-related it would suck, e.g. if I'm referring to identifiers like MyCustomClass.


Enter does continue the chat! And shift-enter for a new line.

My Mac now has built-in Copilot-style completions (maybe only since upgrading to Sequoia?). They're not amazing, but they're decent.

https://support.apple.com/guide/mac-help/typing-suggestions-...


Sorry, I meant hitting enter should NOT submit the chat. It should continue taking my input, and when I'm ready to submit I'd like to hit cmd-enter.


I personally agree, but I would assume most people are on the "enter to submit" train nowadays.

Most of my messaging happens on Discord or Element/Matrix, and sometimes Slack, where this is the norm. I don't even think about shift-enter for a carriage return nowadays.


There are a lot of basic features missing from the flagship llm services/apps.

Two or so years ago I built a localhost web app that lets me trivially fork convos, edit upstream messages (even bot messages), and generate an audio companion for each bot message so I can listen to it while on the move.

I figured these features would quickly appear in ChatGPT’s interface but nope. Why can’t you fork or star/pin convos?


The only editor I’ve seen that has both these features is Zed.


If it's using Tauri, why is it Mac only?


Only because I haven't tested it on Windows/Linux yet (started working on this last week!). But it should theoretically be easy to package for other OSes.



