Logging LLM Inferences
LiteLLM
LiteLLM (GitHub) is a popular open-source library that lets you use any LLM as a drop-in replacement for GPT. It supports Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, Hugging Face, Replicate, and more (100+ LLMs).
Integration
If you're already using LiteLLM, you just need to add an Athina API key and callback handler to instantly log your responses across all providers to Athina.
Full Example
Additional information in metadata
You can send some additional information by using the metadata
field in completion. This can be useful for sending metadata about the request, such as the customer_id, prompt_slug, or any other information you want to track.
Following are the allowed fields in metadata, their types, and their descriptions:
environment: Optional[str]
- Environment your app is running in (ex: production, staging, etc). This is useful for segmenting inference calls by environment.

prompt_slug: Optional[str]
- Identifier for the prompt used for inference. This is useful for segmenting inference calls by prompt.

customer_id: Optional[str]
- This is your customer ID. This is useful for segmenting inference calls by customer.

customer_user_id: Optional[str]
- This is the end user ID. This is useful for segmenting inference calls by the end user.

session_id: Optional[str]
- This is the session or conversation ID. This is used for grouping different inferences into a conversation or chain.

external_reference_id: Optional[str]
- This is useful if you want to associate your own internal identifier with the inference logged to Athina.

context: Optional[Union[dict, str]]
- This is the context used as information for the prompt. For RAG applications, this is the "retrieved" data. You may log context as a string or as an object (dictionary).

expected_response: Optional[str]
- This is the reference response to compare against for evaluation purposes.

user_query: Optional[str]
- This is the user's query. For conversational applications, this is the user's last message.