Show HN: Can I run this LLM? (locally)

One of the most frequent questions people face when running LLMs locally is: I have xx RAM and a yy GPU — can I run the zz model? I vibe-coded a simple application to help you with just that.
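The usual back-of-the-envelope check (an assumption on my part, not necessarily what this app does) is: weight memory is roughly parameter count times bytes per parameter for the chosen quantization, plus some headroom for the KV cache and runtime buffers, compared against combined RAM + VRAM. A minimal sketch:

```python
# Rough fit heuristic: weights ~= params * bytes/param, plus ~20% headroom
# for KV cache and runtime buffers. The quantization names and the 20%
# figure are illustrative assumptions, not the app's actual method.

BYTES_PER_PARAM = {"fp16": 2.0, "q8_0": 1.0, "q4_k_m": 0.5}

def can_run(params_billion: float, quant: str, ram_gb: float, vram_gb: float) -> bool:
    weights_gb = params_billion * BYTES_PER_PARAM[quant]  # 1B params ~ 1 GB at 8-bit
    needed_gb = weights_gb * 1.2                          # ~20% headroom
    return needed_gb <= ram_gb + vram_gb
```

By this estimate a 7B model at 4-bit quantization needs around 4.2 GB, which fits comfortably on a 16 GB machine, while a 70B model at fp16 (~168 GB) does not.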


Comments URL: https://news.ycombinator.com/item?id=43304436

Points: 21

# Comments: 26

https://can-i-run-this-llm-blue.vercel.app/

Created 10d | 9 Mar 2025, 00:40:09

