Evaluating prompt responses is a crucial part of the prompt development process: it helps you understand how well your prompt is performing and identify areas for improvement. In Athina's Playground, you can evaluate prompt responses using any of our preset evals, or using a custom eval. Here's a demo video.