Changelog
Latest release updates from the Langfuse team. Check out our Roadmap to see what's next.
New API: GET /metrics/daily
Retrieve aggregated daily usage metrics via a unified endpoint. Filter by application type, user, or tags for tailored data retrieval.
February 19, 2024
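As a rough sketch, a request to this endpoint can be assembled like the following. The host and filter values are placeholders, and the exact parameter names are assumptions based on this announcement rather than a full API reference:

```python
import os
from urllib.parse import urlencode

# Placeholder host; real deployments read this from configuration.
BASE_URL = os.environ.get("LANGFUSE_HOST", "https://cloud.langfuse.com")

def daily_metrics_url(user_id=None, tags=None):
    """Build the URL for GET /api/public/metrics/daily with optional filters."""
    params = {}
    if user_id:
        params["userId"] = user_id  # assumed parameter name
    if tags:
        params["tags"] = tags       # assumed parameter name
    query = urlencode(params, doseq=True)
    url = f"{BASE_URL}/api/public/metrics/daily"
    return f"{url}?{query}" if query else url

# The request itself would authenticate with your project's API keys, e.g.:
# requests.get(url, auth=(public_key, secret_key))
print(daily_metrics_url(user_id="user-42", tags=["production"]))
```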
SDK-level prompt caching
The latest release of the Python and JS/TS Langfuse SDK includes a new prompt caching feature that improves the reliability and performance of your applications.
February 5, 2024
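Conceptually, SDK-level prompt caching means a prompt fetched once is served from memory while fresh, refetched after a TTL, and kept as a stale fallback if the API is unreachable. The sketch below illustrates that idea only; it is not the SDK's actual implementation:

```python
import time

class PromptCache:
    """Illustrative TTL cache for prompts (not the real SDK code)."""

    def __init__(self, fetch, ttl_seconds=60):
        self.fetch = fetch          # callable that loads a prompt from the API
        self.ttl = ttl_seconds
        self._store = {}            # name -> (prompt, fetched_at)

    def get(self, name):
        entry = self._store.get(name)
        if entry and time.monotonic() - entry[1] < self.ttl:
            return entry[0]         # fresh cache hit, no network call
        try:
            prompt = self.fetch(name)
        except Exception:
            if entry:
                return entry[0]     # fetch failed: serve the stale copy
            raise
        self._store[name] = (prompt, time.monotonic())
        return prompt

calls = []
def fake_fetch(name):               # stands in for the Langfuse API
    calls.append(name)
    return f"prompt body for {name}"

cache = PromptCache(fake_fetch, ttl_seconds=60)
cache.get("movie-critic")
cache.get("movie-critic")           # served from cache, no second fetch
print(len(calls))                   # -> 1
```

The fallback-to-stale behavior is what makes caching a reliability feature, not just a performance one: a transient API outage no longer breaks prompt retrieval.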
Custom model prices (Langfuse v2.0)
The new major version tracks usage and costs for many more models and, for the first time, lets you add custom models and prices.
January 29, 2024
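The cost calculation behind custom model prices can be sketched as below. The model name and per-1k-token prices are made-up examples, not Langfuse's actual pricing data:

```python
# Hypothetical custom price table:
# model name -> (USD per 1k input tokens, USD per 1k output tokens)
CUSTOM_PRICES = {
    "my-finetuned-model": (0.0015, 0.002),
}

def usage_cost_usd(model, input_tokens, output_tokens):
    """Compute the USD cost of one generation from its token counts."""
    input_price, output_price = CUSTOM_PRICES[model]
    return (input_tokens / 1000) * input_price + (output_tokens / 1000) * output_price

cost = usage_cost_usd("my-finetuned-model", input_tokens=2000, output_tokens=500)
print(round(cost, 6))  # -> 0.004
```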
Trace Tagging
Categorize and manage your traces with tags in Langfuse.
January 16, 2024
Prompt Management
Manage, version and deploy prompts from within Langfuse.
January 3, 2024
Customizable sorting for trace table
Finally, you can choose how to sort your trace table to find the slowest, fastest, or oldest traces with ease.
January 2, 2024
SDKs v2.0.0
We are excited to announce the release of v2.0.0 for the Python and JS/TS SDKs. Langfuse has come a long way this year since initially launching v1 of the SDKs, and it was time to group some necessary breaking changes into one release. The release includes simpler interfaces in Python, better defaults for the Langchain integration, and a performance upgrade for the JS/TS SDK. Read on for more details.
December 28, 2023
Sessions
Group multiple traces into a session and replay interactions between users and your LLM based application.
December 13, 2023
Rename and transfer projects
In the project settings you can now rename and transfer your project.
December 12, 2023
Change column visibility
All tables in Langfuse include lots of columns. You can now customize them to your workflow by hiding the ones you don't need.
December 4, 2023
More robust Docker image
Removed the cron job used for telemetry to increase the stability of the Docker image.
November 19, 2023
OpenAI SDK integration now supports v1.x and v0.x
OpenAI has released a major SDK update. The latest Langfuse integration now supports both the new and old versions, ensuring backwards compatibility with other packages.
November 16, 2023
Improved UI for OpenAI Chat Messages
The JSON viewer is flexible but not super simple to read. Starting with OpenAI Chat Messages, you can now view payloads in a pretty format to read and debug even faster.
November 15, 2023
Semantic releases
Langfuse now uses semantic versioning to improve the quality of releases.
November 15, 2023
Support for Langchain Expression Language (LCEL)
LangChain Expression Language, or LCEL, is a declarative way to easily compose chains together. Langfuse natively supports the new expression language.
November 13, 2023
Day 1 support for GPT-4 Turbo
Accurate token counts and costs for GPT-4 Turbo
November 6, 2023
SSO enforcement
Optional SSO enforcement at the organization level for advanced security.
November 3, 2023
Simplified Docker deployment
You can now upgrade to the latest version of Langfuse by pulling the latest container. Database migrations are applied automatically on startup. Optionally, one-click deploy via Railway.
October 31, 2023
RAGAS cookbook to evaluate RAG pipelines
RAGAS is an open-source package focused on evaluating RAG pipelines. With this new cookbook, you can use RAGAS evals on your production data in Langfuse.
October 30, 2023
OpenAI SDK Integration
Drop-in replacement for OpenAI's Python SDK to get full visibility by changing only one line of code.
October 25, 2023
Python SDK now supports Pydantic v1 and v2
The Python SDK uses Pydantic to validate the data you send to Langfuse at runtime before it is asynchronously sent to the Langfuse API. Previously, the Python SDK required Pydantic v1, leading to incompatibilities with projects using Pydantic v2.
October 25, 2023
Ingestion APIs now 2x faster
We've removed a performance bottleneck in the API authentication which affected all API requests.
October 19, 2023
Dashboards out of alpha
New core dashboard with analytics on token usage, latencies, and scores/metrics.
October 9, 2023
Improved support of complex inputs and outputs (e.g. OpenAI functions)
If you pass JSON as a completion, it is rendered in the Langfuse UI.
October 7, 2023
Navigate quickly between traces
Browse traces and users super-⚡️. Jump between traces with keyboard shortcuts (k & j).
October 5, 2023
Complex filters
Use the new filter builder to filter across all sorts of attributes. State is persisted in the URL, so you can share your filters with others.
October 3, 2023
New JSON viewer
Inputs and outputs of spans and generations can be complex, now it is much easier to read them. Click to copy any value.
September 27, 2023
Datasets (beta)
Collect sets of inputs and expected outputs in Langfuse to evaluate your LLM app. Use evaluations to benchmark different experiments.
September 25, 2023
Export generations (for fine-tuning)
Filter generations and export them as CSV, JSON, or OpenAI-JSONL.
September 18, 2023
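For the fine-tuning use case, the export serializes one JSON object per line. The sketch below shows what such a file can look like using OpenAI's chat fine-tuning format; the generations are invented examples standing in for filtered Langfuse data, and the exact export schema may differ:

```python
import json

# Invented input/output pairs standing in for exported generations.
generations = [
    {"input": "What is Langfuse?", "output": "An open-source LLM engineering platform."},
    {"input": "What is a trace?", "output": "A record of one execution of your app."},
]

def to_openai_jsonl(items):
    """Serialize input/output pairs as chat-format fine-tuning lines."""
    lines = []
    for item in items:
        record = {"messages": [
            {"role": "user", "content": item["input"]},
            {"role": "assistant", "content": item["output"]},
        ]}
        lines.append(json.dumps(record))
    return "\n".join(lines)

jsonl = to_openai_jsonl(generations)
print(jsonl.count("\n") + 1)  # -> 2 lines, one training example each
```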
Model-based evaluation
Run model-based evaluations on your production data in Langfuse using the Python SDK.
September 15, 2023
Share traces via public link
You can now make your traces public to discuss them with your team or share them more publicly.
September 14, 2023
Public access to analytics alpha
All users on cloud.langfuse.com have access to analytics dashboards. More analytics are coming soon to the core Langfuse project.
September 12, 2023
Usage in USD
In addition to token counts, usage can now be displayed in USD. Available for many popular models.
September 1, 2023
Track version and releases
Compare metrics across releases of your application or version of your chain/prompt. Releases are automatically tracked for major hosting providers.
August 31, 2023
Langchain integration (JavaScript)
The Langchain integration is now also available for JavaScript.
August 30, 2023
🐳 Prebuilt Docker images
Self-hosting is now easier than ever with our prebuilt Docker images.
August 16, 2023
Public access to Q&A chatbot project
We've built the Q&A chatbot for the Langfuse docs, now everyone can access its production traces. Create an account on Langfuse Cloud and play with the Q&A chatbot to see your own traces.
August 14, 2023
GET API for usage tracking
Get usage data from Langfuse API for use in your own application and dashboards. Optionally broken down by user or use case.
August 7, 2023
Add manual scores
Do 'human-in-the-loop' evaluation by adding manual scores right in the Langfuse UI.
July 31, 2023
Langchain integration (Python)
LLM applications/chains/agents running on Langchain can now be traced via the native integration.
July 27, 2023
User management
Share projects with coworkers and manage permissions.
July 27, 2023
Nested traces
New user interface to browse complex traces in a nested way.
July 24, 2023
Analytics Alpha
For early users on Langfuse Cloud, we are releasing an alpha version of our analytics dashboards.
July 20, 2023
Tokenization for exact token counts
For OpenAI and Anthropic models, token counts are now automatically calculated. This is helpful for streaming endpoints that do not return usage data.
July 20, 2023
Launch
After working on the initial version for 2 months, we are excited to launch Langfuse today!
July 19, 2023