If you’re using OpenAI chat completions in Python, you can get set up in just 2 minutes with this:

```sh
pip install athina-logger
```

```python
import openai
```
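For orientation, here is a rough sketch of what a fully wired-up call might look like. The `AthinaApiKey` and `InferenceLogger` names, their import paths, and the keyword arguments to `log_inference` are assumptions about the athina-logger interface rather than verbatim from this page; check the SDK reference for the exact signatures.

```python
import os
import openai

# Assumed import paths and class names -- verify against the athina-logger docs.
from athina_logger.api_key import AthinaApiKey
from athina_logger.inference_logger import InferenceLogger

# Authenticate the Athina logger and the OpenAI client.
AthinaApiKey.set_api_key(os.getenv("ATHINA_API_KEY"))
client = openai.OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

messages = [{"role": "user", "content": "What is machine learning?"}]

# The request still goes directly to OpenAI; Athina is not a proxy.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=messages,
)

# Async logging call to Athina, made after the OpenAI response has returned.
# Keyword argument names below are assumed for illustration.
InferenceLogger.log_inference(
    prompt_slug="getting_started_example",
    prompt=messages,
    language_model_id="gpt-4o-mini",
    response=response.model_dump(),
)
```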
The SDK works with both `stream=True` and `stream=False` for OpenAI chat completions. OpenAI doesn’t provide usage statistics such as prompt and completion tokens when streaming. However, we overcome this limitation by computing these counts with the help of the tiktoken package, which works with all of OpenAI’s GPT models.
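As a rough illustration of how token usage can be reconstructed for a streamed response, the snippet below counts tokens with tiktoken directly; the model name and the way the streamed chunks are gathered are illustrative, not the SDK’s internal implementation.

```python
import tiktoken

def count_tokens(text: str, model: str = "gpt-4o-mini") -> int:
    """Count tokens using the tokenizer OpenAI uses for the given model."""
    try:
        encoding = tiktoken.encoding_for_model(model)
    except KeyError:
        # Unknown model name: fall back to a general-purpose encoding.
        encoding = tiktoken.get_encoding("cl100k_base")
    return len(encoding.encode(text))

# Example: reconstruct usage for a streamed completion whose chunks
# have already been collected into a list of content strings.
streamed_chunks = ["Machine learning is ", "a field of AI ", "focused on learning from data."]
usage = {
    "prompt_tokens": count_tokens("What is machine learning?"),
    "completion_tokens": count_tokens("".join(streamed_chunks)),
}
print(usage)
```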
**Is this SDK going to make a proxy request to OpenAI through Athina?**

No. Your request still goes directly to `openai`; the SDK from Athina just makes an async logging request to Athina (separate from your OpenAI request) after you get back the response from `openai`.
**Will this SDK increase my latency?**

No. The logging request is asynchronous and happens only after the OpenAI response has already been returned, so it is not in the critical path of your call.
**What is AthinaMeta?**

`AthinaMeta` fields are used for segmentation of your data on the dashboard. All of these fields are optional, but highly recommended. You can view the full list of logging attributes here.
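To make the idea concrete, here is a hedged sketch of attaching segmentation metadata to a logged inference. The import paths, the `AthinaMeta` constructor fields, and the `athina_meta` keyword are assumptions for illustration; the attribute list linked above is the authoritative reference for the exact names.

```python
# Assumed import paths -- verify against the athina-logger package.
from athina_logger.athina_meta import AthinaMeta
from athina_logger.inference_logger import InferenceLogger

# Field names below are illustrative; every AthinaMeta field is optional.
meta = AthinaMeta(
    customer_id="acme-corp",           # which customer the request belongs to
    customer_user_id="user-123",       # the end user within that customer
    session_id="session-456",          # groups related requests into one conversation
    environment="production",          # e.g. production vs. staging
    custom_attributes={"plan": "enterprise"},
)

# Attach the metadata when logging the inference (argument name assumed).
InferenceLogger.log_inference(
    prompt_slug="customer_support_bot",
    prompt=[{"role": "user", "content": "How do I reset my password?"}],
    language_model_id="gpt-4o-mini",
    response={"choices": [{"message": {"role": "assistant", "content": "..."}}]},  # placeholder response
    athina_meta=meta,
)
```

Segmenting logs this way lets the dashboard slice metrics by customer, user, session, or environment instead of showing one undifferentiated stream of inferences.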