One of the most frequent questions people face when running LLMs locally is: "I have xx RAM and yy GPU. Can I run zz LLM model?" I have vibe coded a simple application to help you with just that.
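The underlying arithmetic is roughly: parameter count times bytes per weight, plus some headroom for the KV cache and activations. A minimal sketch of that rule of thumb is below; the function name and the 20% overhead factor are illustrative assumptions, not the app's actual logic.

```python
def estimate_model_gb(params_billions: float,
                      bits_per_weight: int = 4,
                      overhead: float = 1.2) -> float:
    """Back-of-envelope memory estimate for a quantized LLM.

    params_billions: model size in billions of parameters
    bits_per_weight: quantization level (4-bit, 8-bit, 16-bit, ...)
    overhead: multiplier for KV cache / activations (assumed ~20%)
    """
    bytes_total = params_billions * 1e9 * (bits_per_weight / 8)
    return bytes_total * overhead / 1e9  # gigabytes

# e.g. a 7B model at 4-bit quantization needs roughly 4.2 GB
print(round(estimate_model_gb(7, 4), 1))
```

Comparing that estimate against your available RAM/VRAM gives a quick yes/no answer, which is the question the tool automates.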
Comments URL: https://news.ycombinator.com/item?id=43304436
Points: 21
# Comments: 26
Created: 9 Mar 2025, 00:40:09



Hi HN,
I'm Will. Together with my co-founder George, I've built Zuni (https://zuni.app), a browser extension that adds contextual AI capabilities to your browse
Article URL: https://www.poshenloh.com/e/
Comments URL: https://news.ycombinator.com/item?