Logging
Can I log inferences from any model?
Yes! You can use any model of your choice, including custom models. All you need to do is change the `language_model_id` you are sending while logging.
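For example, here is a minimal sketch of logging an inference for a custom model via the Athina Python logging SDK. The import paths, the `log_inference` signature, and the field names other than `language_model_id` are assumptions for illustration; see the API / SDK Reference for the exact interface.

```python
# Minimal sketch, assuming the athina-logger Python package.
# Import paths and parameters below are illustrative assumptions.
from athina_logger.api_key import AthinaApiKey
from athina_logger.inference_logger import InferenceLogger

# Authenticate once with your Athina API key.
AthinaApiKey.set_api_key("YOUR_ATHINA_API_KEY")

# Log a completion from any model, including a custom one, by setting
# language_model_id to whatever identifier you want to track.
InferenceLogger.log_inference(
    language_model_id="my-custom-model-v1",  # any model id, not limited to known providers
    prompt="What is the capital of France?",
    response="Paris",
    prompt_tokens=7,
    completion_tokens=1,
)
```

Because `language_model_id` is just a string you control, inferences from fine-tuned or self-hosted models can be logged and filtered in the dashboard the same way as those from hosted providers.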