The dataset progress metric shows the percentage of datapoints that have received annotations from at least the minimum required number of annotators (as configured in the annotation project).
For example, if your project requires at least 2 annotators per datapoint, the progress shows what percentage of datapoints have been annotated by 2 or more annotators.
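The progress computation described above can be sketched in a few lines. This is a minimal illustration, not the platform's actual implementation; the function name, the `(datapoint_id, annotator_id)` input shape, and the example data are all assumptions.

```python
def dataset_progress(annotations, total_datapoints, min_annotators=2):
    """Percentage of datapoints annotated by at least `min_annotators` annotators.

    `annotations` is a list of (datapoint_id, annotator_id) pairs
    (hypothetical input format for illustration).
    """
    annotators_per_dp = {}
    for dp_id, annotator_id in annotations:
        annotators_per_dp.setdefault(dp_id, set()).add(annotator_id)
    done = sum(1 for s in annotators_per_dp.values() if len(s) >= min_annotators)
    return 100.0 * done / total_datapoints

# Example: 4 datapoints, project requires 2 annotators per datapoint.
# Datapoint 1 has 2 annotators, 2 has 1, 3 has 3, 4 has none → 2 of 4 done.
anns = [(1, "a"), (1, "b"), (2, "a"), (3, "a"), (3, "b"), (3, "c")]
print(dataset_progress(anns, total_datapoints=4))  # → 50.0
```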
This metric measures the average agreement between annotators who annotated the same datapoints across the dataset: for each pair of annotators who worked on the same datapoint, their annotations are compared, and the metric reports the percentage of those comparisons that match.
A high agreement rate (above 80%) typically indicates clear annotation guidelines and consistent understanding among annotators.
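One plausible way to compute the pairwise agreement rate described above is a micro-average over all annotator pairs on shared datapoints. This is a sketch under assumptions: the actual product may weight pairs or datapoints differently, and the function name and `{datapoint_id: {annotator_id: label}}` input shape are invented for illustration.

```python
from itertools import combinations

def pairwise_agreement(labels):
    """Average pairwise agreement rate, in percent.

    `labels` maps datapoint_id -> {annotator_id: label}
    (hypothetical input format). For every pair of annotators who
    labelled the same datapoint, count a match when their labels are
    equal, then average over all comparisons.
    """
    matches = comparisons = 0
    for per_dp in labels.values():
        for (_, l1), (_, l2) in combinations(sorted(per_dp.items()), 2):
            comparisons += 1
            matches += (l1 == l2)
    return 100.0 * matches / comparisons if comparisons else 0.0

labels = {
    1: {"a": "cat", "b": "cat"},                # 1 pair, 1 match
    2: {"a": "dog", "b": "cat", "c": "dog"},    # 3 pairs, 1 match
}
print(pairwise_agreement(labels))  # → 50.0
```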
For the selected subset, this shows the average percentage of matching annotations when comparing pairs of annotators who worked on the same datapoints within that subset.
Comparing agreement rates across different subsets can help identify parts of your dataset that may have ambiguous examples or unclear annotation guidelines.
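The subset metric is the same pairwise comparison restricted to datapoints in the selected subset. A minimal sketch, assuming a hypothetical `subset_ids` filter and the same invented `{datapoint_id: {annotator_id: label}}` input shape:

```python
from itertools import combinations

def subset_agreement(labels, subset_ids):
    """Pairwise agreement rate (percent) over datapoints in `subset_ids` only.

    `labels` maps datapoint_id -> {annotator_id: label}
    (hypothetical input format for illustration).
    """
    matches = comparisons = 0
    for dp_id, per_dp in labels.items():
        if dp_id not in subset_ids:
            continue
        for (_, l1), (_, l2) in combinations(sorted(per_dp.items()), 2):
            comparisons += 1
            matches += (l1 == l2)
    return 100.0 * matches / comparisons if comparisons else 0.0

labels = {
    1: {"a": "cat", "b": "cat"},
    2: {"a": "dog", "b": "cat", "c": "dog"},
}
print(subset_agreement(labels, {1}))  # → 100.0
```

Comparing the returned rates for different subsets (here, datapoint 2 scores much lower than datapoint 1) is what surfaces the ambiguous regions mentioned above.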
The annotator agreement rate measures the average percentage of the selected annotator's annotations that match those of other annotators on the same datapoints.
An annotator with a significantly lower agreement rate than others may need additional training or clarification on the annotation guidelines.
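The per-annotator rate above restricts the comparison to pairs involving one selected annotator. A minimal sketch under the same assumptions as before (invented function name and `{datapoint_id: {annotator_id: label}}` input shape; the product's exact weighting may differ):

```python
def annotator_agreement(labels, annotator):
    """Percent of comparisons where `annotator`'s label matches a co-annotator's.

    `labels` maps datapoint_id -> {annotator_id: label}
    (hypothetical input format for illustration).
    """
    matches = comparisons = 0
    for per_dp in labels.values():
        if annotator not in per_dp:
            continue
        mine = per_dp[annotator]
        for other, label in per_dp.items():
            if other == annotator:
                continue
            comparisons += 1
            matches += (label == mine)
    return 100.0 * matches / comparisons if comparisons else 0.0

labels = {
    1: {"a": "cat", "b": "cat"},
    2: {"a": "dog", "b": "cat", "c": "dog"},
}
# "b" agrees in 1 of 3 comparisons; "a" agrees in 2 of 3.
print(annotator_agreement(labels, "b"))
```

An annotator whose rate sits well below the others' (here "b") is the kind of outlier the guidance above suggests following up on.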