Show HN: Can I run this LLM? (locally)

One of the most frequent questions people face when running LLMs locally is: "I have xx RAM and yy GPU, can I run the zz model?" I have vibe coded a simple application to help you with just that.
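The post doesn't describe the app's internal logic, but the usual back-of-envelope check is: weight memory ≈ parameter count × bytes per parameter (set by the quantization level), plus some overhead for the KV cache and activations, compared against available RAM + VRAM. A minimal sketch of that heuristic (the function name, 20% overhead factor, and exact formula are illustrative assumptions, not the app's actual code):

```python
def can_run(params_billion: float, quant_bits: int,
            ram_gb: float, vram_gb: float) -> bool:
    """Rough feasibility check: do the quantized weights (plus
    ~20% overhead for KV cache/activations) fit in RAM + VRAM?"""
    weights_gb = params_billion * quant_bits / 8  # bytes/param = bits/8
    needed_gb = weights_gb * 1.2                  # assumed overhead factor
    return needed_gb <= ram_gb + vram_gb

# A 7B model at 4-bit quantization needs roughly 4.2 GB,
# so it fits on a machine with 16 GB RAM and no GPU:
print(can_run(7, 4, 16, 0))   # True
# A 70B model at 16-bit needs ~168 GB and does not:
print(can_run(70, 16, 16, 8))  # False
```

Real tools refine this with context-length-dependent KV-cache sizing and per-quantization overheads, but the comparison above captures the core idea.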


Comments URL: https://news.ycombinator.com/item?id=43304436

Points: 21

# Comments: 26

https://can-i-run-this-llm-blue.vercel.app/

Created: Mar 9, 2025
