To log your LangChain LLM calls to Athina, add the Athina `CallbackHandler` to your `LLMChain` callbacks.
1. Install Athina Logging SDK
pip install athina-logger
2. Import Athina classes and set API key
3. Instantiate the `CallbackHandler` with Athina metadata properties
- `prompt_slug`: An identifier for the prompt being used.
- `user_query`: The query the user sent to the LLM.
- `environment`: The environment the LLM is running in, for example `production` or `development`.
- `session_id`: A session ID used to group multiple LLM calls together.
- `customer_id`: The ID of the customer using the LLM.
- `customer_user_id`: The ID of the end user using the LLM.
- `external_reference_id`: An external reference ID to associate with the LLM call.
- `custom_attributes`: Any key-value data you want to associate with the LLM call.
- `kwargs`: Any additional key-value data to associate with the LLM calls in a chain. This key-value data is stored as context on the Athina server.
4. Add `CallbackHandler` to `LLMChain` callbacks
Supported Models (Without Streaming)
text-davinci-003, gpt-3.5-turbo, gpt-3.5-turbo-0613, gpt-3.5-turbo-16k, gpt-3.5-turbo-16k-0613, gpt-3.5-turbo-1106, gpt-4, gpt-4-0613, gpt-4-32k, gpt-4-32k-0613, gpt-4-1106-preview, meta-llama/Llama-2-13b, meta-llama/Llama-2-13b-chat, meta-llama/Llama-2-13b-chat-hf, meta-llama/Llama-2-13b-hf, meta-llama/Llama-2-70b, meta-llama/Llama-2-70b-chat, meta-llama/Llama-2-70b-chat-hf, meta-llama/Llama-2-70b-hf, meta-llama/Llama-2-7b, meta-llama/Llama-2-7b-chat, meta-llama/Llama-2-7b-chat-hf, meta-llama/Llama-2-7b-hf, claude-2
Supported Models (With Streaming)
text-davinci-003, gpt-3.5-turbo, gpt-3.5-turbo-0613, gpt-3.5-turbo-16k, gpt-3.5-turbo-16k-0613, gpt-3.5-turbo-1106, gpt-4, gpt-4-0613, gpt-4-32k, gpt-4-32k-0613, gpt-4-1106-preview