Every inference logged to Athina returns a prompt_run_id.
If you store this prompt_run_id, you can update the log with additional information as new events occur in your application.
For example, you may log user_feedback to track user satisfaction with a response.
You can do this by sending a PATCH request that references the prompt_run_id of the original inference log.
- Method: PATCH
- Endpoint: https://log.athina.ai/api/v1/log/inference/{prompt_run_id}
- Headers:
  - athina-api-key: YOUR_ATHINA_API_KEY
  - Content-Type: application/json
Request Body
{
  // ...fields you want to update
  "user_feedback": 1
}
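As an illustration, here is a minimal Python sketch of such an update using the requests library; the prompt_run_id value and the feedback value are placeholders you would replace with your own stored data:

```python
import requests

ATHINA_API_KEY = "YOUR_ATHINA_API_KEY"

# The prompt_run_id returned when the original inference was logged (placeholder value).
prompt_run_id = "your-stored-prompt-run-id"

response = requests.patch(
    f"https://log.athina.ai/api/v1/log/inference/{prompt_run_id}",
    headers={
        "athina-api-key": ATHINA_API_KEY,
        "Content-Type": "application/json",
    },
    # Fields to update on the existing log, e.g. user feedback.
    json={"user_feedback": 1},
)
response.raise_for_status()
```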
For the full list of fields you can update, see the Logging Attributes page.
Allowed fields to update
You cannot update the prompt_run_id or created_at fields.