Open Source LLMOps Stack

Some background: I work on Langfuse and we've been collaborating with LiteLLM.

(LiteLLM is a Python library and proxy/gateway that handles cost management, virtual keys, caching, and rate limiting for OpenAI and other LLM APIs. Langfuse manages LLM tracing, evaluation, prompt management, and experiments.)

We’ve each been building our open-source projects since early 2023 and learned that many devs, and platform teams especially, use the two together, so we created an integrated “OSS LLMOps stack.”

This is a fully self-hostable, technology-agnostic setup that lets you (1) use LLMs via a standardized interface without adding complexity to the application; (2) keep LLM tracing, evaluation, and prompt management in-house for compliance; and (3) track cost and usage via a single interface, with virtual API keys to attribute costs.
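To make (1) and (3) concrete, here is a minimal sketch of calling the LiteLLM proxy through the standard OpenAI SDK; the proxy URL and the virtual key are placeholders for whatever you configure in your own deployment:

    # Any OpenAI-compatible client works against the LiteLLM proxy.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:4000",  # your LiteLLM proxy, not api.openai.com
        api_key="sk-my-virtual-key",       # proxy-issued virtual key for cost attribution
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # any model name configured on the proxy
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)

Because the application only ever talks to the proxy, swapping the underlying provider becomes a proxy-side config change.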

It also enables direct transfer of LLM traces from the LiteLLM proxy to Langfuse. This simplifies the rollout of LLMOps practices (observability and evaluations) across multiple projects, since you don't need to instrument each application individually.
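As an illustration, this is roughly what the Langfuse callback looks like when using the LiteLLM Python SDK directly; on the proxy, the equivalent is a success_callback entry in its config file. It assumes your Langfuse keys (and host, if self-hosted) are set as environment variables:

    import litellm

    # Forward successful and failed LLM calls to Langfuse as traces.
    litellm.success_callback = ["langfuse"]
    litellm.failure_callback = ["langfuse"]

    response = litellm.completion(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Hello!"}],
    )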

Additionally, the LiteLLM proxy can fetch and cache prompts from Langfuse's prompt management system, using them as templates for requests made through the proxy.
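For reference, fetching a managed prompt with the Langfuse Python SDK looks like the sketch below; the proxy integration does the equivalent fetch and caching server-side. The prompt name "support-reply" and its {{tone}} variable are hypothetical examples:

    from langfuse import Langfuse

    langfuse = Langfuse()  # reads LANGFUSE_* credentials from the environment

    # Fetch the prompt (cached client-side after the first request)
    # and fill in its template variables.
    prompt = langfuse.get_prompt("support-reply")  # hypothetical prompt name
    compiled = prompt.compile(tone="friendly")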

Both of these workflows can function without the integration, but are easier to manage with it!

We’d love your feedback!


Comments URL: https://news.ycombinator.com/item?id=43182241


https://oss-llmops-stack.com

Created Feb 28, 2025, 22:10

