I initially started building this for my own projects. The main goal was to define a unified "blueprint" that lets me develop multi-platform apps from the same code, without code generation. I wanted to be able to build the same functionality on web, desktop, CLI, server, mobile, whatever...
I've been able to achieve this by relying on TypeScript, a 4-layer architecture (UseCase => App => Product => Target) and dependency injection.
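To give a feel for the idea, here is a minimal sketch of what a layered, injected setup like this could look like. All names here are made up for illustration; this is not the framework's actual API.

    // UseCase layer: pure business logic, no platform dependencies.
    interface GreetUseCase {
      execute(name: string): string;
    }

    const greet: GreetUseCase = {
      execute: (name) => `Hello, ${name}!`,
    };

    // App layer: groups use cases behind a single app interface.
    interface App {
      greet: GreetUseCase;
    }

    const app: App = { greet };

    // Target layer: a platform-specific adapter receives the app
    // via dependency injection and decides how to expose it.
    function runCli(theApp: App, args: string[]): void {
      console.log(theApp.greet.execute(args[0] ?? "world"));
    }

    runCli(app, process.argv.slice(2));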
This mechanism lets me use whatever tech stack I want, provided the right adapters exist. For instance, I have pre-built ones: Node Express (server), Next.js (server), Node Hono (server, in alpha), Node parseArgs (CLI), Node Stricli (CLI), react-web-pure (web with no CSS), React Native (mobile), a Node MCP server (Anthropic MCP, in alpha), etc. A sketch of what a server adapter could look like follows below.
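Reusing the hypothetical App interface from the sketch above, an Express adapter could look roughly like this (again, illustrative code, not the framework's real adapter):

    import express from "express";

    // Target adapter for Express: maps HTTP routes onto use-case
    // calls, keeping the business logic untouched.
    function runExpress(theApp: App, port: number): void {
      const server = express();
      server.get("/greet/:name", (req, res) => {
        res.send(theApp.greet.execute(req.params.name));
      });
      server.listen(port, () => console.log(`Listening on ${port}`));
    }

    runExpress(app, 3000);

The CLI adapter and the Express adapter never know about each other; they only know the App they are handed.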
The same goes for data storage: Postgres, SQLite, a plain text file, whatever.
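Storage works the same way in spirit: use cases depend on a port, and each backend gets an adapter. Here is a hypothetical sketch, with made-up names:

    import { promises as fs } from "fs";

    // Hypothetical storage port: use cases depend only on this
    // interface, never on a concrete database.
    interface UserStore {
      save(id: string, data: string): Promise<void>;
      load(id: string): Promise<string | undefined>;
    }

    // In-memory adapter, handy for tests.
    class MemoryUserStore implements UserStore {
      private users = new Map<string, string>();
      async save(id: string, data: string): Promise<void> {
        this.users.set(id, data);
      }
      async load(id: string): Promise<string | undefined> {
        return this.users.get(id);
      }
    }

    // Plain-text-file adapter using Node's fs/promises.
    class FileUserStore implements UserStore {
      constructor(private dir: string) {}
      async save(id: string, data: string): Promise<void> {
        await fs.writeFile(`${this.dir}/${id}.txt`, data, "utf8");
      }
      async load(id: string): Promise<string | undefined> {
        try {
          return await fs.readFile(`${this.dir}/${id}.txt`, "utf8");
        } catch {
          return undefined;
        }
      }
    }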
It also comes with automated testing and automated documentation.
Did I reinvent the wheel? Probably, in some respects. Is it too much abstraction? Probably as well. But I like the idea of modularity and portability.
That's why it's not made for everyone, nor for every type of project.
If you like testing new stuff, give it a try and feel free to ping me if needed; I'd love to help.
I'm aware the documentation is not state of the art yet. I wanted to focus on the "Getting Started" guide to give a quick overview instead of going too deeply into the details.
Best,
Comments URL: https://news.ycombinator.com/item?id=42804961