Review

Reviewing labeled data is a collaborative quality assurance technique. Reaching state-of-the-art model performance often takes a dataset with hundreds of thousands to millions of quality labels, in which case reviewing a significant percentage of those labels may require a team of reviewers working simultaneously. The Labelbox review tooling keeps this collaborative review step streamlined and transparent.

Review Queue

The review queue automatically distributes labeled images and tracks progress.

The review queue follows two work distribution rules to streamline collaboration (illustrated in the sketch after this list):

1. Only images that have been labeled, categorized, or skipped enter the review queue, so labeling and reviewing operations can happen concurrently.
2. All images in a reviewer's queue are unique, so reviewers' work does not overlap.
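A minimal sketch of how these two rules might play out, assuming hypothetical image statuses and a simple round-robin assignment; this is an illustration, not Labelbox's actual implementation:

```python
from collections import defaultdict
from itertools import cycle

# Hypothetical statuses; only finished work is eligible for review (rule 1).
ELIGIBLE_STATUSES = {"labeled", "categorized", "skipped"}

def build_review_queues(images, reviewers):
    """Distribute eligible images across reviewers round-robin.

    `images` is a list of (image_id, status) tuples; `reviewers` is a
    list of reviewer names. Each eligible image is assigned to exactly
    one reviewer, so no two queues overlap (rule 2).
    """
    queues = defaultdict(list)
    next_reviewer = cycle(reviewers)
    for image_id, status in images:
        if status not in ELIGIBLE_STATUSES:
            continue  # rule 1: unfinished images stay out of review
        queues[next(next_reviewer)].append(image_id)  # rule 2: unique assignment
    return dict(queues)

images = [("img-1", "labeled"), ("img-2", "in_progress"),
          ("img-3", "skipped"), ("img-4", "categorized")]
print(build_review_queues(images, ["alice", "bob"]))
# {'alice': ['img-1', 'img-4'], 'bob': ['img-3']}
```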

Set Up Review

Turn on review in Settings > Quality and choose the percentage of your project's images that you would like reviewed.
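For intuition, one way a review percentage could be applied is to sample each labeled image independently with that probability. This is an illustrative assumption, not a description of Labelbox's actual sampling logic:

```python
import random

def sample_for_review(image_ids, review_fraction, seed=0):
    """Select roughly `review_fraction` of labeled images for review.

    Illustrative only: each image is included independently with
    probability `review_fraction`.
    """
    rng = random.Random(seed)
    return [i for i in image_ids if rng.random() < review_fraction]

# With 10,000 labeled images and a 25% setting, expect roughly 2,500
# images to enter the review queue.
selected = sample_for_review([f"img-{n}" for n in range(10_000)], 0.25)
print(len(selected))
```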

Review Progress Metrics

The project overview in Labelbox shows the progress of your overall project across label and review operations. The review section tracks the number of images still to be reviewed, along with the counts of thumbs-up and thumbs-down reviews.

A label's review status is derived from its votes: Accepted labels received more upvotes than downvotes, Declined labels received more downvotes than upvotes, Ambiguous labels have equal upvotes and downvotes that cancel each other out, and Unreviewed labels have no reviews at all.
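The mapping from vote counts to status can be written out directly. The status names mirror the docs; the function itself is a hypothetical reconstruction, not a Labelbox API:

```python
def review_status(upvotes, downvotes):
    """Map a label's vote counts to its review status."""
    if upvotes == 0 and downvotes == 0:
        return "Unreviewed"
    if upvotes > downvotes:
        return "Accepted"
    if downvotes > upvotes:
        return "Declined"
    return "Ambiguous"  # equal up- and downvotes cancel out

assert review_status(0, 0) == "Unreviewed"
assert review_status(3, 1) == "Accepted"
assert review_status(1, 2) == "Declined"
assert review_status(2, 2) == "Ambiguous"
```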

Review Mode

The labeling and review queues are entered via separate tabs (Start Labeling and Review Labels, respectively). Each mode's interface is optimized for efficiency in that activity.

