Show HN: Can I run this LLM? (locally)

One of the most frequent questions you run into when running LLMs locally is: "I have xx RAM and a yy GPU; can I run the zz model?" I vibe-coded a simple application to help you answer exactly that.
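
The post doesn't spell out the app's actual heuristic, but as a rough sketch of the kind of check such a tool might perform (assuming memory needed is roughly parameter count times bytes per weight, plus some overhead for KV cache and activations), a minimal version could look like this:

    # Rough sketch, not the app's actual logic: estimate whether a model fits
    # in available memory, assuming needed_gb ~= params (billions) * bytes per
    # weight * ~1.2 overhead for KV cache and activations.

    BYTES_PER_WEIGHT = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

    def can_run(params_b: float, quant: str, vram_gb: float, ram_gb: float) -> bool:
        needed_gb = params_b * BYTES_PER_WEIGHT[quant] * 1.2
        return needed_gb <= vram_gb + ram_gb

    # Example: a 7B model quantized to 4-bit on 8 GB VRAM + 16 GB RAM
    print(can_run(7, "int4", vram_gb=8, ram_gb=16))  # True (~4.2 GB needed)

The real app presumably accounts for more detail (context length, offloading between GPU and CPU, quantization formats), but the basic fit check follows this shape.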


Comments URL: https://news.ycombinator.com/item?id=43304436

Points: 21

# Comments: 26

https://can-i-run-this-llm-blue.vercel.app/

Created: Mar 9, 2025, 12:40:09 AM

