Show HN: I made an app to use local AI as daily driver
473 by xyc | 202 comments
Hi Hackers, excited to share a macOS app I've been working on: https://recurse.chat/ for chatting with local AI. While it's amazing that you can run AI models locally quite easily these days (through llama.cpp / llamafile / ollama / the llm CLI, etc.), I missed a feature-complete chat interface. Tools like LMStudio are super powerful, but they come with a learning curve. I'd like to hit a middle ground of simplicity and customizability for advanced users. Here's what sets RecurseChat apart from similar apps:

- UX designed for using local AI as a daily driver: zero-config setup, multi-modal chat, chatting with multiple models in the same session, and linking your own gguf file.
- Import ChatGPT history. This is probably my favorite feature. Import hundreds of messages, search them, and even continue previous chats offline using local AI.
- Full text search. Search across hundreds of messages and see results instantly.
- Private and capable of working completely offline.

Thanks to the amazing work of @ggerganov on llama.cpp, which made this possible. If there is anything you wish existed in an ideal local AI app, I'd love to hear about it.
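The "import ChatGPT history" feature implies parsing ChatGPT's exported conversations.json, whose conversations store messages as a graph of nodes under a "mapping" key. A minimal sketch of flattening that structure into searchable (title, role, text) rows, assuming the export schema as it existed at the time of writing (the format is unversioned, and the `flatten` helper is hypothetical, not the app's actual importer):

```python
import json

# Tiny inline stand-in for a ChatGPT conversations.json export
# (assumed schema: each conversation has a "mapping" of nodes; the
# root node carries no message).
sample = json.dumps([
    {
        "title": "gguf question",
        "mapping": {
            "a": {"message": None, "parent": None, "children": ["b"]},
            "b": {"message": {"author": {"role": "user"},
                              "content": {"parts": ["What is a gguf file?"]}},
                  "parent": "a", "children": ["c"]},
            "c": {"message": {"author": {"role": "assistant"},
                              "content": {"parts": ["A model container format used by llama.cpp."]}},
                  "parent": "b", "children": []},
        },
    }
])

def flatten(conversations_json: str):
    """Yield (title, role, text) for every non-empty message node."""
    for convo in json.loads(conversations_json):
        for node in convo["mapping"].values():
            msg = node.get("message")
            if not msg:
                continue  # skip root/system placeholder nodes
            parts = msg.get("content", {}).get("parts") or []
            text = " ".join(p for p in parts if isinstance(p, str)).strip()
            if text:
                yield convo["title"], msg["author"]["role"], text

messages = list(flatten(sample))
```

Once flattened like this, each row can be indexed for search or replayed as context to a local model.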
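For the instant full-text search over hundreds of messages, one common way to get this on macOS with no server is SQLite's FTS5 extension. A minimal sketch under that assumption (the post does not say what storage engine RecurseChat actually uses):

```python
import sqlite3

# In-memory database with an FTS5 virtual table over chat messages.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE messages USING fts5(role, body)")
conn.executemany(
    "INSERT INTO messages (role, body) VALUES (?, ?)",
    [
        ("user", "How do I link my own gguf file?"),
        ("assistant", "Point the app at the gguf path in model settings."),
        ("user", "Does full text search work offline?"),
    ],
)

# MATCH runs a full-text query; bm25() orders hits by relevance.
rows = conn.execute(
    "SELECT role, body FROM messages WHERE messages MATCH ? "
    "ORDER BY bm25(messages)",
    ("gguf",),
).fetchall()
for role, body in rows:
    print(f"{role}: {body}")
```

FTS5 keeps queries fast at this scale because it indexes tokens at insert time rather than scanning message bodies per search.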
New exponent functions that make SiLU and SoftMax 2x faster, at full accuracy 379 by weinzierl | 72 comments
Boards are dangerous to founder/CEOs 574 by tosh | 264 comments
Samsung plans $17B chip plant in Taylor, Texas 515 by kungfudoi | 370 comments
Stepping Back from Speaking 502 by alfredbez | 124 comments