Visibility

Log prompt-response pairs using our SDK to get complete visibility into your LLM touchpoints. Trace through and debug your retrievals and generations with ease.
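As a rough sketch of what logging a prompt-response pair through an SDK might look like (the function name, record schema, and fields below are hypothetical, not the actual SDK API):

```python
import json
import time
import uuid

def log_inference(prompt, response, model, metadata=None):
    """Build a structured log record for one prompt-response pair.
    Hypothetical schema: a real SDK would ship this record to a
    logging endpoint instead of returning it as a JSON string."""
    record = {
        "id": str(uuid.uuid4()),      # unique ID for tracing/debugging
        "timestamp": time.time(),
        "model": model,
        "prompt": prompt,
        "response": response,
        "metadata": metadata or {},   # e.g. customer ID, retrieval context
    }
    return json.dumps(record)

entry = json.loads(log_inference(
    "What is RAG?",
    "Retrieval-augmented generation combines search with an LLM...",
    model="gpt-4o-mini",
    metadata={"customer_id": "acme"},
))
```

Capturing a unique ID and metadata per record is what later makes it possible to trace an individual generation back through its retrieval context.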

Online Evaluations

Run automated evaluations on your live traffic to continuously score response quality in production.

Usage Analytics

Track LLM inference metrics like cost, token usage, response time, and more.
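The kind of roll-up this produces can be sketched in a few lines; the field names (`cost`, `tokens`, `latency_ms`) are assumptions for illustration, not the actual export schema:

```python
def summarize_inferences(inferences):
    """Aggregate per-request metrics into totals and averages.
    Assumes each record carries cost, tokens, and latency_ms
    (hypothetical field names)."""
    n = len(inferences)
    return {
        "count": n,
        "total_cost": round(sum(i["cost"] for i in inferences), 6),
        "total_tokens": sum(i["tokens"] for i in inferences),
        "avg_latency_ms": sum(i["latency_ms"] for i in inferences) / n,
    }

summary = summarize_inferences([
    {"cost": 0.002, "tokens": 150, "latency_ms": 420},
    {"cost": 0.004, "tokens": 300, "latency_ms": 580},
])
```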

Log User Feedback

Track user feedback like clicks, ratings, and more.
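A minimal sketch of attaching feedback events to logged inferences, assuming a simple in-memory store keyed by inference ID (the store shape and event fields are illustrative, not the SDK's actual interface):

```python
def record_feedback(store, inference_id, kind, value):
    """Attach one feedback event (e.g. a click, or a 1-5 rating)
    to a previously logged inference."""
    store.setdefault(inference_id, []).append({"kind": kind, "value": value})
    return store

def average_rating(store, inference_id):
    """Mean of all 'rating' events for a given inference,
    or None if it has no ratings yet."""
    ratings = [f["value"] for f in store.get(inference_id, [])
               if f["kind"] == "rating"]
    return sum(ratings) / len(ratings) if ratings else None

store = {}
record_feedback(store, "inf-1", "click", 1)
record_feedback(store, "inf-1", "rating", 4)
record_feedback(store, "inf-1", "rating", 5)
```

Keying feedback by inference ID is what lets ratings and clicks be joined back to the exact prompt and response that produced them.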

Query Topic Classification

Automatically classify user queries into topics to get detailed insights into popular subjects and AI performance per topic.
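In production this kind of classification is typically done by a model; as a stand-in, here is a keyword-based sketch that shows the shape of the mapping from query to topic (topic names and keywords are invented for illustration):

```python
# Hypothetical topic taxonomy; a real classifier would be
# model-driven rather than keyword-driven.
TOPIC_KEYWORDS = {
    "billing": ["invoice", "refund", "charge"],
    "auth": ["login", "password", "2fa"],
}

def classify_query(query):
    """Assign a user query to the first topic whose keywords match,
    falling back to 'other'."""
    q = query.lower()
    for topic, keywords in TOPIC_KEYWORDS.items():
        if any(k in q for k in keywords):
            return topic
    return "other"
```

Once every query carries a topic label, per-topic metrics (volume, cost, feedback) fall out of a simple group-by.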

Compare Metrics

Segment and compare metrics across different dimensions like prompt, model, topic, and customer ID.
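Segmentation like this reduces to averaging a metric per value of a chosen dimension; a minimal sketch, assuming flat per-inference records with whatever dimension fields you log:

```python
from collections import defaultdict

def segment_metric(inferences, dimension, metric):
    """Average a metric per value of one dimension,
    e.g. mean cost per model or per customer_id."""
    groups = defaultdict(list)
    for inf in inferences:
        groups[inf[dimension]].append(inf[metric])
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

data = [
    {"model": "a", "cost": 0.01},
    {"model": "a", "cost": 0.03},
    {"model": "b", "cost": 0.05},
]
by_model = segment_metric(data, "model", "cost")
```

The same function works for any dimension in the record, so comparing by prompt, topic, or customer ID is just a different `dimension` argument.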

Data Exports

Export your inferences to CSV or JSON formats for external analysis and reporting.
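Both export formats can be produced from the same records with the standard library; a sketch of the serialization step (the record fields are placeholders, not the actual export schema):

```python
import csv
import io
import json

def export_inferences(inferences, fmt="json"):
    """Serialize a list of inference records to a JSON or CSV string
    for external analysis and reporting."""
    if fmt == "json":
        return json.dumps(inferences, indent=2)
    # CSV: use the first record's keys as the header row.
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(inferences[0].keys()))
    writer.writeheader()
    writer.writerows(inferences)
    return buf.getvalue()

rows = [{"model": "a", "tokens": 10}, {"model": "b", "tokens": 20}]
as_json = export_inferences(rows)
as_csv = export_inferences(rows, fmt="csv")
```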