Logging LLM Inferences
OpenAI Assistant
To log your OpenAI Assistant chats, log each message using the API logging method.
When you log to Athina, each chat message should be logged with a separate POST request.
You can group messages into a single conversation by using the session_id field, which is equivalent to the thread ID from OpenAI.
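As a minimal sketch, this is how you might pull a thread's messages with the OpenAI Python SDK before logging them; the thread ID shown is a placeholder, and it is the value you would pass as the session_id:

```python
from openai import OpenAI

client = OpenAI()
thread_id = "thread_abc123"  # placeholder OpenAI Assistant thread ID

# Each message in the thread gets logged to Athina in its own POST request,
# all sharing session_id=thread_id so they appear as one conversation.
messages = client.beta.threads.messages.list(thread_id=thread_id)
for message in messages:
    # Assumes the first content block is text; adjust for other content types.
    text = message.content[0].text.value
    print(message.role, text)
```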
- Method: POST
- Endpoint: https://log.athina.ai/api/v1/log/inference
- Headers:
  - athina-api-key: YOUR_ATHINA_API_KEY
  - Content-Type: application/json
Request Body
Here’s a simple logging request:
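The sketch below uses Python's requests library. Apart from prompt and session_id, which are described above, the field names (response and language_model_id, for instance) are illustrative assumptions rather than the exact Athina schema, so check them against the API reference:

```python
import requests

ATHINA_LOG_URL = "https://log.athina.ai/api/v1/log/inference"

headers = {
    "athina-api-key": "YOUR_ATHINA_API_KEY",
    "Content-Type": "application/json",
}

# Illustrative payload: "prompt" and "session_id" are described above;
# "response" and "language_model_id" are assumed field names.
payload = {
    "prompt": [
        {"role": "user", "content": "What is the capital of France?"}
    ],
    "response": "The capital of France is Paris.",
    "language_model_id": "gpt-4o",
    "session_id": "thread_abc123",  # OpenAI Assistant thread ID
}

resp = requests.post(ATHINA_LOG_URL, json=payload, headers=headers)
resp.raise_for_status()
print(resp.status_code)
```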
Note: The OpenAI Assistants API doesn’t expose the actual prompt that was sent to the model, so the prompt you log may not be the complete prompt that was actually sent.
However, the prompt field is not currently used for evaluations, so this will not affect your ability to run evals.