Show HN: Can I run this LLM? (locally)

One of the most frequent questions when running LLMs locally is: I have xx RAM and yy GPU, can I run the zz model? I vibe-coded a simple application to help you with just that.
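The check behind a question like this usually reduces to a back-of-envelope memory estimate: weights take roughly (parameters × bits per weight ÷ 8) bytes, plus some headroom for the KV cache and activations. The sketch below is an assumption about how such a tool typically works, not the linked app's actual logic; the 20% overhead factor is a hypothetical constant.

```python
# Rough feasibility check for running an LLM locally.
# The overhead factor is an assumed ~20% for KV cache and activations.

def estimate_vram_gb(params_billion: float, bits_per_weight: int,
                     overhead: float = 1.2) -> float:
    """Estimate memory needed: weight size at the given quantization, plus overhead."""
    weight_gb = params_billion * bits_per_weight / 8  # 1B params at 8 bits ≈ 1 GB
    return weight_gb * overhead

def can_run(params_billion: float, bits_per_weight: int, available_gb: float) -> bool:
    """True if the estimated footprint fits in the available RAM/VRAM."""
    return estimate_vram_gb(params_billion, bits_per_weight) <= available_gb

# Example: a 7B model at 4-bit quantization on an 8 GB GPU
print(can_run(7, 4, 8))   # 7 * 4/8 * 1.2 = 4.2 GB -> True
# A 70B model at 16-bit on the same GPU clearly does not fit
print(can_run(70, 16, 8))  # 70 * 16/8 * 1.2 = 168 GB -> False
```

Real tools refine this with per-architecture KV-cache sizing and context length, but the fit/no-fit answer comes from the same arithmetic.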


Comments URL: https://news.ycombinator.com/item?id=43304436

Points: 21

# Comments: 26

https://can-i-run-this-llm-blue.vercel.app/

Created 1mo ago | Mar 9, 2025, 00:40:09
