Langchain
If you’re using Langchain, you can log your data to Athina with just a few lines of code. All you need to do is add the `CallbackHandler` to your `LLMChain` callbacks.
Install Athina Logging SDK
```shell
pip install athina-logger
```
Import Athina classes and set API key
Instantiate the `CallbackHandler` with Athina metadata properties
- `prompt_slug`: An identifier for the prompt being used.
- `user_query`: The query the user sent to the LLM.
- `environment`: The environment the LLM is running in, for example `production` or `development`.
- `session_id`: The session ID of the LLM call. This is used to group multiple LLM calls together.
- `customer_id`: The ID of the customer using the LLM.
- `customer_user_id`: The ID of the end user using the LLM.
- `external_reference_id`: An external reference ID to associate with the LLM call.
- `custom_attributes`: Any key-value data you want to associate with the LLM call.
- `kwargs`: Any key-value data you want to associate with the LLM calls in a chain. This key-value data is stored as context on the Athina server.
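The metadata above maps to keyword arguments on the handler. A sketch of what that set of properties looks like — every value here is illustrative, not a required format:

```python
# Illustrative Athina metadata; all values below are hypothetical examples
athina_metadata = {
    "prompt_slug": "customer-support-v1",
    "user_query": "How do I reset my password?",
    "environment": "production",
    "session_id": "session-123",
    "customer_id": "cust-42",
    "customer_user_id": "user-7",
    "external_reference_id": "ticket-9001",
    "custom_attributes": {"plan": "pro"},
}

# The handler would then be instantiated as CallbackHandler(**athina_metadata)
```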
Add `CallbackHandler` to `LLMChain` callbacks
Supported Models (Without Streaming)
text-davinci-003
gpt-3.5-turbo
gpt-3.5-turbo-0613
gpt-3.5-turbo-16k
gpt-3.5-turbo-16k-0613
gpt-3.5-turbo-1106
gpt-4
gpt-4-0613
gpt-4-32k
gpt-4-32k-0613
gpt-4-1106-preview
meta-llama/Llama-2-13b
meta-llama/Llama-2-13b-chat
meta-llama/Llama-2-13b-chat-hf
meta-llama/Llama-2-13b-hf
meta-llama/Llama-2-70b
meta-llama/Llama-2-70b-chat
meta-llama/Llama-2-70b-chat-hf
meta-llama/Llama-2-70b-hf
meta-llama/Llama-2-7b
meta-llama/Llama-2-7b-chat
meta-llama/Llama-2-7b-chat-hf
meta-llama/Llama-2-7b-hf
claude-2
Supported Models (With Streaming)
text-davinci-003
gpt-3.5-turbo
gpt-3.5-turbo-0613
gpt-3.5-turbo-16k
gpt-3.5-turbo-16k-0613
gpt-3.5-turbo-1106
gpt-4
gpt-4-0613
gpt-4-32k
gpt-4-32k-0613
gpt-4-1106-preview
Not using Python?
Reach out to us at hello@athina.ai - we’re happy to add support for other stacks as well if we hear from you.