
I've got a couple I've done and it's been really enjoyable.

I think the real value in using local models is exposing them to personal/unique information that only you have, thus getting novel and unique outcomes that no public model could provide.

1. Project 1 - Self Knowledge

- Download/extract all of my emails and populate a vector database, like Chroma[0]

- For each prompt, search the vector store and return the top N matches

- Provide both the prompt and the search results to the LLM, instructing it to use them as context or in the answer itself
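The three steps above form a basic retrieve-then-augment loop. Here's a minimal, self-contained sketch of that loop using a toy bag-of-words retriever in place of Chroma and real embeddings; the sample emails, `top_n`, and `build_prompt` are all illustrative stand-ins, and in practice you'd swap the toy retriever for Chroma's `collection.query()` and send the final prompt to your local LLM.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real setup would use Chroma's
    # default embedding function or a sentence-transformer model.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_n(query, docs, n=3):
    # Step 2: search the "vector store" for the N closest emails.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:n]

def build_prompt(query, matches):
    # Step 3: hand both the question and the retrieved context to the LLM.
    context = "\n---\n".join(matches)
    return (f"Use the following excerpts from my email as context.\n"
            f"{context}\n\nQuestion: {query}")

# Illustrative stand-ins for an exported mailbox.
emails = [
    "Flight to Lisbon confirmed for March 12, seat 14A.",
    "Dentist appointment moved to Thursday at 3pm.",
    "Invoice for the Lisbon trip hotel attached.",
]
matches = top_n("When is my trip to Lisbon?", emails, n=2)
print(build_prompt("When is my trip to Lisbon?", matches))
```

The bag-of-words cosine is only there to make the sketch runnable; semantic retrieval is exactly what the embedding model and vector database buy you over this.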

2. Project 2 - Chat with a Friend

- I exported the chat and text history between me and a good friend who passed away

- I created a vector store of our chat history in chunks, each consisting of 6 back-and-forth interactions

- When I "chat" with the LLM, a search is first conducted for matching chunks from the vector store, which are then used as "style" and knowledge context for the response

Optional: You can use SillyTavern[1] for a more "rich" chat experience
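The chunking step above can be sketched as follows. This is a guess at one reasonable way to do it, assuming the exported history is a chronological list of (speaker, text) pairs and that one "back-and-forth" means two consecutive messages; `chunk_history` and the chunk formatting are my own illustrative choices, not the commenter's actual code.

```python
def chunk_history(messages, exchanges_per_chunk=6):
    """Group a chat log into chunks of N back-and-forth exchanges.

    `messages` is a list of (speaker, text) pairs in chronological
    order; one "back-and-forth" here means two consecutive messages,
    so 6 exchanges = 12 messages per chunk.
    """
    msgs_per_chunk = exchanges_per_chunk * 2
    chunks = []
    for i in range(0, len(messages), msgs_per_chunk):
        window = messages[i:i + msgs_per_chunk]
        # Flatten each window into one document for the vector store,
        # keeping speaker labels so the LLM can pick up each voice.
        chunks.append("\n".join(f"{who}: {text}" for who, text in window))
    return chunks

# Illustrative stand-in for an exported chat log: 30 alternating messages.
history = [("me", f"message {i}") if i % 2 == 0 else ("friend", f"message {i}")
           for i in range(30)]
chunks = chunk_history(history)
```

Each chunk then gets embedded and added to the vector store; at chat time, the current message is used to retrieve the closest chunks, which are prepended to the prompt as style and knowledge context.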

The above lets me chat, at least superficially, with my friend. It's nice for simple interactions and banter; I've found it to be a positive and reflective experience.

0. https://www.trychroma.com/

1. https://sillytavernai.com/



thank you tons. this is a great start and I'm gonna use these for learning :)

u rock


Pleasure is all mine :D Have fun and please reach out if you get stuck!



