Intro to the project dashboard
The Overview tab provides you with several important metrics to help you understand the progress of your labeling project.
The Progress table shows the quantity of labels submitted, remaining, skipped, and the total percentage completed. You can also toggle between Overall (all users contributing to this project) and Mine.
The Labels Created chart shows the quantity of labels over time. You can toggle between daily, weekly, and monthly in the upper right corner of the chart.
Training data quality
The Training data quality section contains a Reviews table, where you can see the total number of un-reviewed, accepted, declined, and ambiguous labels. For more information, see Review labels. "Coverage" refers to the proportion of total labeled assets that are set to be reviewed.
On the right side of the Project overview tab, you can toggle between three tables and charts.
- The Object count table shows the total count of each object and its percentage of the total object count. For example, if 1 of 13 total objects in a labeled dataset is “Bird”, “Bird” constitutes 8% of the objects in the dataset.
- The Classification Answers chart shows the number of each classification answer in the dataset.
- The Labels by Collaborator chart shows the number of labels completed by each member.
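The percentage shown in the Object count table is simple arithmetic: each object's count divided by the total number of objects. A minimal sketch (the object names and counts are hypothetical):

```python
from collections import Counter

# Hypothetical object annotations from a labeled dataset (13 objects total)
objects = ["Bird"] + ["Tree"] * 7 + ["Car"] * 5

counts = Counter(objects)
total = sum(counts.values())
percentages = {name: round(100 * n / total) for name, n in counts.items()}

print(percentages["Bird"])  # 1 of 13 objects -> 8 (percent)
```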
From the Labels tab, you can see activity of labeled images, label predictions on unlabeled images, and the queue of unlabeled images.
In the Activity table, you will see a complete list of all submitted labels in a project. You can choose from a dropdown list of filters to narrow down your search results. The activity section is also where you can access open review by clicking on any of the labels in the list. For more information on how to use open review and queue-based review, see Review labels.
The duration column reflects the total time each image spends getting labeled and reviewed. The timer will aggregate time across all users who worked on the label.
When an image undergoes labeling, the timer starts when the image is fully loaded and stops when the user clicks "skip" or "submit". The same logic applies when loading autosaved labels. To ensure only active time is captured, the timer automatically pauses when the user is inactive in the UI for 30 seconds and resumes when the user interacts with the keyboard or mouse or refreshes the page. If the user goes back to a previous label in the queue, the timer resumes 3 seconds after returning to the previously submitted or skipped label, and that time is added to the duration spent labeling.
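The pause/resume behavior above can be modeled as a small accumulator that counts at most 30 seconds between consecutive user events. This is an illustrative sketch, not the product's implementation; only the 30-second threshold comes from the text, and the class and method names are hypothetical:

```python
IDLE_THRESHOLD = 30.0  # seconds of inactivity before the timer pauses

class LabelTimer:
    """Accumulates active labeling time, pausing after 30 s of inactivity."""

    def __init__(self):
        self.total = 0.0
        self.last_event = None  # timestamp of the last user interaction

    def interact(self, t):
        """Record a keyboard/mouse event at time t (seconds)."""
        if self.last_event is not None:
            # Count at most IDLE_THRESHOLD seconds since the last event;
            # anything beyond that is treated as idle and not accumulated.
            self.total += min(t - self.last_event, IDLE_THRESHOLD)
        self.last_event = t

    def stop(self, t):
        """User clicked "skip" or "submit"; returns total active seconds."""
        self.interact(t)
        self.last_event = None
        return self.total

timer = LabelTimer()
timer.interact(0)       # image fully loaded
timer.interact(10)      # active labeling
timer.interact(100)     # 90 s gap: only 30 s counted as active
print(timer.stop(105))  # 10 + 30 + 5 = 45.0
```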
When an image undergoes review, the timer starts when the label loads and stops when the user moves on to the next label in the review queue.
The Queue table shows the labeling queue, which consists of the following:
- Unlabeled assets
- Assets that had labels but were deleted because they need to be relabeled
Assets in this queue are distributed among the labelers in the organization unless an asset is specifically reserved (indicated by the “Reserved by” field). A reserved asset becomes unreserved if it is not labeled within 90 minutes of being reserved.
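The 90-minute reservation window can be sketched as a simple expiry check. The 90-minute TTL comes from the text; the function name and fields are hypothetical:

```python
from datetime import datetime, timedelta

RESERVATION_TTL = timedelta(minutes=90)  # from the docs: 90-minute window

def is_reserved(reserved_at, now):
    """An asset stays reserved for 90 minutes after being reserved."""
    return reserved_at is not None and now - reserved_at < RESERVATION_TTL

start = datetime(2024, 1, 1, 12, 0)
print(is_reserved(start, start + timedelta(minutes=30)))  # True
print(is_reserved(start, start + timedelta(minutes=91)))  # False: back in queue
```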
Benchmarks are not yet available in the newest version of the Image Editor. However, for users still using the legacy Image Editor, see the documentation for Benchmarks.
The Performance tab allows you to monitor the labeling performance of each member within a project.
The Labels Created chart shows the quantity of labels over time. You may switch between daily, weekly, and monthly on the upper right of the chart.
Time per Label
The Time per Label chart shows the median time per Label over time. You may switch between daily, weekly, and monthly on the upper right of the chart. The time per Label is the median time the user spent generating each Label. The time starts when the data is loaded and ends when the Label is submitted. Note that if the browser tab is left open in the labeling environment, the time continues to accrue for that given asset.
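Computing a median time per Label grouped by day, as the chart does in daily mode, looks roughly like this (the duration records are hypothetical):

```python
from collections import defaultdict
from statistics import median

# Hypothetical (date, seconds) records of per-label durations
durations = [
    ("2024-05-01", 42), ("2024-05-01", 58), ("2024-05-01", 61),
    ("2024-05-02", 30), ("2024-05-02", 90),
]

by_day = defaultdict(list)
for day, secs in durations:
    by_day[day].append(secs)

daily_median = {day: median(v) for day, v in by_day.items()}
print(daily_median)  # {'2024-05-01': 58, '2024-05-02': 60.0}
```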
In the Export tab, you can generate JSON or CSV exports of your labels.
For more details on export formats, read our docs on exporting labels.
From the Settings tab, you can attach or remove datasets, modify the configuration of the label editor, manage members, adjust the percentage of labels to be reviewed, and delete a project.
The Datasets section contains a complete list of datasets you can attach to and detach from your project. To add or remove data rows from a dataset, click on the dataset and select which data rows to add or remove. When you add additional assets to a dataset, they are automatically added to the labeling queue. When a dataset is detached from a project, all labels created against that dataset remain in the project, and all of its unlabeled data is removed from the queue.
For instructions on importing data, read our Data import docs.
In the Label Editor section, you can make modifications to your Image Editor configuration.
From the Tools tab of the editor configuration, you can add or edit the configuration for objects, object classes, and classification questions. Note: In the legacy image editor, changing existing classes in the labeling interface will not update the classes of the existing labels.
You can also attach labeler instructions by clicking on the Instructions tab. Additional instructions can be helpful if you have a team of labelers working on a more complex labeling project.
In the Members section, you can add individual members, change member role, and set up an external workforce. For more information about modifying organization and project-based roles, see Add/remove a member.
Here you can adjust the percentage of your project’s images that you would like to be reviewed. For more information, see Review labels.
In addition to the Projects dashboard, this is another place where you can delete a project. Deleting a project also deletes all the labels that have been submitted for that project.