View Annotated Data
Learn how to inspect completed annotations and monitor dataset coverage.
Once your annotation project is active, Athina provides tools to monitor label distribution, agreement, and project progress in real time.
Navigate to the Annotation Project
1. Go to the Annotations tab from the sidebar.
2. Select your project.
3. Use the tabs to explore:
   - Subsets: Filter by specific dataset segments.
   - Annotators: Monitor per-user progress and agreement.
   - Settings: View or update the view configuration or guidelines.
Annotators Tab – Progress and Agreement
The Annotators tab summarizes team-level annotation status. You can view:

- Overall Project Progress: Percentage of datapoints completed, based on the required number of annotations.
- Overall Agreement Rate: A metric that tracks label consistency across annotators.
- Score Distributions for each label (e.g. “Relevancy”, “testA”).
- Inter-Annotator Agreement Matrix: Pairwise agreement comparisons between annotators.

Use these metrics to spot potential labeling inconsistencies and train annotators accordingly.
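Athina computes these metrics for you in the UI. For intuition, the sketch below shows one common way a pairwise agreement rate can be derived from raw labels; the data layout and function are illustrative only, not Athina's internal implementation.

```python
from itertools import combinations

# Illustrative only: each annotator's labels keyed by datapoint ID.
# This is NOT Athina's internal data model, just a sketch of the idea.
annotations = {
    "alice": {"dp1": "relevant", "dp2": "irrelevant", "dp3": "relevant"},
    "bob":   {"dp1": "relevant", "dp2": "relevant",   "dp3": "relevant"},
}

def pairwise_agreement(a: dict, b: dict) -> float:
    """Fraction of datapoints labeled by both annotators where the labels match."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    matches = sum(a[dp] == b[dp] for dp in shared)
    return matches / len(shared)

# Compare every pair of annotators, as in an inter-annotator agreement matrix.
for (name_a, labels_a), (name_b, labels_b) in combinations(annotations.items(), 2):
    rate = pairwise_agreement(labels_a, labels_b)
    print(f"{name_a} vs {name_b}: {rate:.0%} agreement")
```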
Open the Annotation View
Click Open annotation view to start annotating or reviewing entries.

The view layout is determined by the selected Annotation View Configuration in the project. Fields can include markdown-rendered text, structured inputs, and contextual sections.
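To make that concrete, here is a purely hypothetical sketch of how such a view configuration might be modeled. The field names, label types, and structure are assumptions for illustration only, not Athina's actual configuration format.

```python
# Hypothetical example only -- NOT Athina's actual configuration schema.
# It just illustrates the kinds of fields an annotation view can expose.
annotation_view_config = {
    "display_fields": [
        {"name": "query", "render": "markdown"},                  # markdown-rendered text
        {"name": "response", "render": "markdown"},
        {"name": "retrieved_context", "render": "collapsible"},   # contextual section
    ],
    "labels": [
        {"name": "Relevancy", "type": "dropdown", "options": ["relevant", "irrelevant"]},
        {"name": "Quality", "type": "slider", "min": 1, "max": 5},
        {"name": "Notes", "type": "text_area"},
    ],
}
```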
Annotation Interface
- Navigation: Use Next, Previous, or Skip to move between datapoints.
- Labeling: Annotators fill in labels as defined in the configuration (e.g. dropdowns, sliders, text areas).
- Instructions: Click Instructions at the top right to reference the guidelines at any time.

There is no toggling between different states; annotators label entries one by one using the navigation buttons.
Filtering and Data Access
While reviewing in the main dashboard:

- Use the Subsets and Annotators tabs to filter views.
- Click View results in dataset to switch to the dataset’s Sheet view and inspect raw and annotated data.
Keep annotation view configurations minimal and focused to streamline annotator flow and reduce errors.
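From the dataset's Sheet view you can also export the annotated rows (see Export / Download Datasets) and inspect coverage offline. A minimal sketch, assuming the export is a CSV and that the annotation label lives in a column named relevancy; the file name and column name are placeholders that depend on your project.

```python
import pandas as pd

# Assumes an exported CSV of the annotated dataset; "annotated_dataset.csv"
# and the "relevancy" column are placeholders for your project's export.
df = pd.read_csv("annotated_dataset.csv")

# Label distribution across annotated rows
print(df["relevancy"].value_counts(dropna=False))

# Rough coverage: fraction of rows that have at least one annotation
coverage = df["relevancy"].notna().mean()
print(f"Annotated coverage: {coverage:.0%}")
```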