
Nice work, but sorry, I don't feel comfortable either proxying my LLM calls through a third party (unless that third party is an LLM gateway like LiteLLM or Arch) or storing my prompts in a SaaS. For tracing, the OTel libraries are more than sufficient for my use case.
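
Roughly what that looks like on my side, a minimal sketch with the OpenTelemetry Python SDK; the model name, attribute values, and the console exporter are placeholders (you'd swap in an OTLP exporter for a real backend):

    from opentelemetry import trace
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

    # Set up a tracer provider; ConsoleSpanExporter is just for illustration
    provider = TracerProvider()
    provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
    trace.set_tracer_provider(provider)
    tracer = trace.get_tracer("llm-app")

    # Wrap an LLM call in a span; the gen_ai.* attributes follow the
    # (still incubating) GenAI semantic conventions
    with tracer.start_as_current_span("chat-completion") as span:
        span.set_attribute("gen_ai.request.model", "gpt-4o-mini")  # placeholder model
        # response = client.chat.completions.create(...)           # your LLM call here
        span.set_attribute("gen_ai.usage.output_tokens", 42)       # placeholder value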



If you already use an OSS gateway, some (e.g. LiteLLM) can natively forward logs to Langfuse: https://docs.litellm.ai/docs/proxy/logging#langfuse
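
The linked page covers the proxy (YAML config); on the SDK side the equivalent looks roughly like this sketch, with placeholder credentials and model:

    import os
    import litellm

    # Langfuse credentials (placeholders)
    os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
    os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."

    # Forward successful LLM calls to Langfuse
    litellm.success_callback = ["langfuse"]

    response = litellm.completion(
        model="gpt-4o-mini",  # placeholder model
        messages=[{"role": "user", "content": "Hello"}],
    )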

We are looking into adding an OTel collector as the OTel semantic conventions around LLMs mature. For now, many features that are key to LLMOps are difficult to make work with OTel instrumentation because the space is moving quickly. The main thread on this is here: https://github.com/orgs/langfuse/discussions/2509



