Show HN: Onit – open-source ChatGPT Desktop with local mode, Claude, Gemini

Hey Hacker News, it’s Tim Lenardo and I’m launching v1 of Onit today!

Onit is ChatGPT Desktop, but with local mode and support for other model providers (Anthropic, GoogleAI, etc.). It's also like Cursor Chat, but everywhere on your computer, not just in your IDE!

Onit is open-source! You can download a pre-built version from our website: www.getonit.ai

Or build directly from the source code: https://github.com/synth-inc/onit

We built this because we believe:

- Universal Access: AI assistants should be accessible from anywhere on your computer, not just in the browser or in specific apps.
- Provider Freedom: Consumers should have the choice between providers (Anthropic, OpenAI, etc.), not be locked into a single one (ChatGPT Desktop only has OpenAI models).
- Local First: AI is more useful with access to your data, but that doesn't count for much if you have to upload personal files to an untrusted server. Onit will always provide options for local processing; no personal data leaves your computer without approval.
- Customizability: Onit is your assistant. You should be able to configure it to your liking.
- Extensibility: Onit should allow the community to build and share extensions, making it more useful for everyone.

The features for V1 include:

- Local mode: chat with any model running locally on Ollama! No internet connection required.
- Multi-provider support: top models from OpenAI, Anthropic, xAI, and GoogleAI.
- File upload: add images or files for context (bonus: drag & drop works too!).
- History: revisit prior chats through the history view or with a simple up/down arrow shortcut.
- Customizable shortcut: you pick the hotkey that launches the chat window (Command+Zero by default); a rough sketch of how a global hotkey like this can work is below.
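
For anyone curious how a launcher hotkey of this kind can be wired up on macOS, here is a minimal Swift sketch, not Onit's actual code: it uses a global NSEvent monitor, assumes an ANSI keyboard where key code 29 is the "0" key, and (like any global monitor) requires the Accessibility permission.

    import AppKit

    // Minimal sketch, not Onit's implementation: fire a callback when a
    // global Command+0 keypress is seen. Key code 29 (kVK_ANSI_0) assumes
    // an ANSI keyboard layout; global monitors need Accessibility access.
    final class HotkeyListener {
        private var monitor: Any?

        func start(onTrigger: @escaping () -> Void) {
            monitor = NSEvent.addGlobalMonitorForEvents(matching: .keyDown) { event in
                if event.modifierFlags.contains(.command) && event.keyCode == 29 {
                    onTrigger()   // e.g. show or focus the chat window
                }
            }
        }

        func stop() {
            if let monitor { NSEvent.removeMonitor(monitor) }
            monitor = nil
        }
    }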

Anticipated questions:

What data are you collecting? Onit V1 does not have a server. Local requests are handled locally, and remote requests are sent to model providers directly from the client. We collect crash reports through Firebase and a single "chat sent" event through PostHog analytics. We don't store your prompts or responses.
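
To illustrate what "sent directly from the client" means, here is a hedged Swift sketch of a remote request going straight from the app to a provider's API, using OpenAI's public chat completions endpoint as the example. It is not Onit's code, just the shape of a client call with no intermediary server; the model name is illustrative.

    import Foundation

    // Sketch only: the request goes from the app to the provider's API with
    // no server in between. Endpoint and payload follow OpenAI's public
    // chat completions API; "gpt-4o" is just an example model name.
    func sendChat(prompt: String, apiKey: String) async throws -> Data {
        var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
        request.httpMethod = "POST"
        request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        let body: [String: Any] = [
            "model": "gpt-4o",
            "messages": [["role": "user", "content": prompt]]
        ]
        request.httpBody = try JSONSerialization.data(withJSONObject: body)
        let (data, _) = try await URLSession.shared.data(for: request)
        return data   // raw JSON; the reply text is at choices[0].message.content
    }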

How does Onit support local mode? To use local mode, run Ollama. You can get Ollama here: https://ollama.com/. Onit gets the list of your local models through Ollama’s API.
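
As a rough illustration of that integration (not Onit's actual code): Ollama serves a local HTTP API on port 11434, and GET /api/tags returns the models installed on the machine, which is how a client can populate its local-model picker without touching the network.

    import Foundation

    // Sketch: list locally installed Ollama models via GET /api/tags.
    // Field names follow Ollama's API; only "name" is decoded here.
    struct OllamaTags: Decodable {
        struct Model: Decodable { let name: String }
        let models: [Model]
    }

    func listLocalModels() async throws -> [String] {
        let url = URL(string: "http://localhost:11434/api/tags")!
        let (data, _) = try await URLSession.shared.data(from: url)
        return try JSONDecoder().decode(OllamaTags.self, from: data).models.map(\.name)
    }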

Which models do you support? For remote models, Onit V1 supports Anthropic, OpenAI, xAI and GoogleAI. Default models include o1, o1-mini, GPT-4o, Claude 3.5 Sonnet, Claude 3.5 Haiku, Gemini 2.0, Grok 2, and Grok 2 Vision. For local mode, Onit supports any model you can run locally on Ollama!

What license is Onit under? We’re releasing V1 under a Creative Commons Non-Commercial license. We believe the transparency of open source is critical. We also want to make sure individuals can customize Onit to their needs (please submit PRs!). However, we don’t want people to sell the code as their own.

Where is the monetization? We’re not monetizing V1. In the future we may add paid premium features. Local chat will, of course, always remain free. If you disagree with a monetized feature, you can always build from source!

Why not Linux or Windows? Gotta start somewhere! If the reception is positive, we’ll work hard to add further support.

Who are we? We are Synth, Inc., a small team of developers in San Francisco building at the frontier of AI progress. Other projects include Checkbin (www.checkbin.dev) and Alias (deprecated: www.alias.inc).

We’d love to hear from you! Feel free to reach out at contact@getonit dot ai.

Future roadmap includes:

- Autocontext: automatically pull context from your computer, rather than having to repeatedly upload.
- Local RAG: let users index and create context from their files without uploading anything.
- Local typeahead: i.e., Cursor Tab, but everywhere.
- Additional support: Linux/Windows, Mistral/DeepSeek, etc.
- (Maybe) bundle Ollama to avoid a double download.
- And lots more!


Comments URL: https://news.ycombinator.com/item?id=42817438

Points: 8

# Comments: 1


Created 1d | Jan 24, 2025, 22:50:06

