One of the most frequent questions people face when running LLMs locally is: "I have xx RAM and yy GPU; can I run the zz model?" I vibe-coded a simple application to help with exactly that.
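The linked app's internals aren't shown here, but the usual back-of-envelope check it automates can be sketched as follows: model weights at a given quantization, plus some headroom for the KV cache and runtime buffers. The function name, the quantization table, and the 20% overhead factor are all illustrative assumptions, not the app's actual logic.

```python
# Rough rule of thumb: memory ≈ parameter count × bytes per weight,
# inflated by ~20% for KV cache and runtime buffers (assumed overhead).
BYTES_PER_PARAM = {"fp16": 2.0, "q8": 1.0, "q4": 0.5}  # common quantization levels

def estimated_gb(params_billions: float, quant: str = "q4", overhead: float = 1.2) -> float:
    """Very rough GB of RAM/VRAM needed to load and run the model."""
    return params_billions * BYTES_PER_PARAM[quant] * overhead

# e.g. a 7B model quantized to 4-bit:
print(round(estimated_gb(7, "q4"), 1))  # → 4.2
```

Under these assumptions, a 7B model at 4-bit needs roughly 4 GB, which is why such models run comfortably on 8 GB machines while fp16 versions do not.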
Comments URL: https://news.ycombinator.com/item?id=43304436
Points: 21
# Comments: 26
Created: Mar 9, 2025, 12:40:09 AM
Other posts in this group

Article URL: https://ieeexplore.ieee.org/document/1671509
Comments URL: ht

Article URL: https://github.com/matthewp/views-the-hard-way

Article URL: https://hypertext.tv/
Comments URL: https://news.ycombinator.com/item?id=43732805
Hey HN,
If you're like me, you have more startup ideas than free time, but turning those ideas into actual products often feels impossible. Coming up with a name, validating the idea, making a l