Hey HN,
Spark event logs run into hundreds of MBs and offer a wealth of insight into your workloads, but making sense of them has always been prohibitively tedious. We recently built a lightweight tool that automatically parses Spark event logs and surfaces targeted insights to help you optimize your data jobs.
Whether you’re chasing down a bottleneck or balancing performance against cost, the profiler has you covered with real-time configuration recommendations, data skew analysis, and more.
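For context on what parsing a Spark event log involves: the log is newline-delimited JSON, one SparkListener event per line, so even a crude skew check needs nothing beyond the standard library. The sketch below is purely illustrative and not how our profiler actually works — the file path and the 5x threshold are made up — but it shows the kind of signal you can pull out of SparkListenerTaskEnd events:

    import json
    from collections import defaultdict
    from statistics import median

    def task_durations_by_stage(path):
        """Collect per-stage task durations (ms) from a Spark event log."""
        durations = defaultdict(list)
        with open(path) as f:
            for line in f:
                try:
                    event = json.loads(line)
                except json.JSONDecodeError:
                    continue  # skip truncated or partial lines
                if event.get("Event") != "SparkListenerTaskEnd":
                    continue
                info = event.get("Task Info", {})
                launch, finish = info.get("Launch Time"), info.get("Finish Time")
                if launch and finish:
                    durations[event.get("Stage ID")].append(finish - launch)
        return durations

    def skewed_stages(durations, threshold=5.0):
        """Flag stages where the slowest task far exceeds the median task."""
        flagged = {}
        for stage, times in durations.items():
            if len(times) < 2:
                continue
            ratio = max(times) / max(median(times), 1)
            if ratio >= threshold:
                flagged[stage] = ratio
        return flagged

    if __name__ == "__main__":
        # hypothetical event log path, just for illustration
        durations = task_durations_by_stage("application_1234_eventlog")
        for stage, ratio in sorted(skewed_stages(durations).items()):
            print(f"Stage {stage}: slowest task is {ratio:.1f}x the median -- possible data skew")

The real profiler goes well beyond this kind of heuristic, but the same event stream is where the configuration and skew recommendations come from.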
Curious how it works? Check out this quick Loom video for a walk-through: https://www.loom.com/share/07348eb54f6b440da93f96753937792a?...
We’d love your feedback — check it out at https://app.datasre.ai and let us know what you think!
Comments URL: https://news.ycombinator.com/item?id=43637721