Intro to the project dashboard

Updated by Alex Cota

Overview tab

The Overview tab provides you with several important metrics to help you understand the progress of your labeling project.

Progress table

The Progress table shows the quantity of labels submitted, remaining, and skipped, and the total percentage completed. You can also toggle between Overall (all users contributing to this project) and Mine (only your own labels).

Labels Created

The Labels Created chart shows the quantity of labels over time. You can toggle between daily, weekly, and monthly in the upper right corner of the chart.

Training data quality

The Training data quality section contains a Reviews table, where you can see the total number of un-reviewed, accepted, declined, and ambiguous labels. For more information, see Review labels. "Coverage" refers to the proportion of total labeled assets to be reviewed.

On the right side of the Overview tab, you can toggle between three tables and charts.

  • The Object count table shows the count of each object and its percentage of the total object count. For example, if 1 out of 13 total objects in a labeled dataset is “Bird”, “Bird” makes up roughly 8% of the objects in the dataset (see the sketch after this list).
  • The Classification Answers chart shows the number of each classification answer in the dataset.
  • The Labels by Collaborator chart shows the label count completed by each member.
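
The percentage shown for each object is simply its count divided by the total number of objects across all labels. A minimal sketch of that arithmetic (the function name and input shape are illustrative, not part of Labelbox):

```python
from collections import Counter

def object_count_table(object_names):
    """Compute each object's count and share of the total object count.

    `object_names` is a flat list of object names pulled from a project's
    labels, e.g. one entry per bounding box or polygon.
    """
    counts = Counter(object_names)
    total = sum(counts.values())
    return {
        name: {"count": n, "percent": round(100 * n / total)}
        for name, n in counts.most_common()
    }

# 1 "Bird" out of 13 total objects rounds to 8% of the dataset.
objects = ["Bird"] + ["Car"] * 7 + ["Tree"] * 5
print(object_count_table(objects)["Bird"])  # {'count': 1, 'percent': 8}
```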

Labels tab

From the Labels tab, you can see labeling activity, label predictions on unlabeled images, and the queue of unlabeled images.

Activity

In the Activity table, you will see a complete list of all submitted labels in a project. You can choose from a dropdown list of filters to narrow your search results. The Activity section is also where you can access open review by clicking on any of the labels in the list. For more information on how to use open review and queue-based review, see Review labels.

Timer (duration)

Labelbox keeps track of label and review time and displays them in two separate columns within the Activity table for each data row.

The Label time column indicates the total time the creator of the label spends viewing or editing an unsubmitted label in the labeling interface. The timer starts when the image is fully loaded and stops when the user clicks "skip", "submit", or exits out of the labeling interface. To ensure idle time is not captured, the timer automatically pauses when the user is inactive on the UI for 30 seconds and resumes when the user interacts with the keyboard or mouse or refreshes the page. If the user goes back to a previous label in the queue, the timer resumes after 3 seconds and the time is added to Label time for that data row.

The Review time column indicates the total time that users other than the label's creator spend viewing, editing, or reviewing the submitted label in review mode. When an image undergoes review, the timer starts when the label loads and stops when the user moves on to the next label in the review queue.
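
Both columns boil down to an idle-aware stopwatch. The sketch below is a simplified model of the label-time behavior described above, not Labelbox's actual implementation; the class and method names are hypothetical, and only the 30-second idle pause from the description is assumed:

```python
import time

IDLE_PAUSE_SECONDS = 30  # pause after 30s without keyboard/mouse activity

class LabelTimer:
    """Accumulates active labeling time, pausing during idle periods."""

    def __init__(self):
        self.total = 0.0           # seconds of active labeling time
        self._active_since = None  # start of the current active span
        self._last_activity = None

    def start(self):
        # Called once the image has fully loaded.
        self._active_since = self._last_activity = time.monotonic()

    def on_activity(self):
        # Called on keyboard/mouse input or a page refresh.
        now = time.monotonic()
        if self._active_since is None:
            self._active_since = now  # resume after an idle pause
        elif now - self._last_activity >= IDLE_PAUSE_SECONDS:
            # The timer paused 30s after the last activity; bank the
            # active span (including the 30s grace) and resume from now.
            self.total += self._last_activity + IDLE_PAUSE_SECONDS - self._active_since
            self._active_since = now
        self._last_activity = now

    def stop(self):
        # Called on "skip", "submit", or leaving the labeling interface.
        if self._active_since is not None:
            end = min(time.monotonic(), self._last_activity + IDLE_PAUSE_SECONDS)
            self.total += end - self._active_since
            self._active_since = None
```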

Queue

The Queue table shows the labeling queue, which consists of the following:

  • Unlabeled assets
  • Assets whose labels were deleted because they need to be relabeled

Assets in this queue are distributed among the labelers in the organization unless an asset is specifically reserved (indicated by the “Reserved by” field). A reserved asset becomes unreserved if it is not labeled within 90 minutes of being reserved.
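
The 90-minute rule means a reservation is only honored while it is fresh. A minimal sketch of that check, assuming hypothetical `reserved_by`/`reserved_at` fields on an asset record:

```python
from datetime import datetime, timedelta, timezone

RESERVATION_TTL = timedelta(minutes=90)

def effective_reserver(reserved_by, reserved_at, now=None):
    """Return who currently holds an asset, or None if the reservation lapsed."""
    now = now or datetime.now(timezone.utc)
    if reserved_by is None or now - reserved_at >= RESERVATION_TTL:
        return None  # back in the shared queue
    return reserved_by

# Reserved 95 minutes ago: past the 90-minute window, so it has lapsed.
reserved_at = datetime.now(timezone.utc) - timedelta(minutes=95)
print(effective_reserver("alex@example.com", reserved_at))  # None
```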

Performance tab

This is where you can view the average metrics across all labelers or drill down into individual performance for Label time or Review time.

There are two sub-tabs: one for Labels and one for Reviews.

Average metrics across all labelers

On the right, you’ll see some automatically generated metrics. Click on a metric and select a date range to populate the chart.

| Metric | Labels sub-tab definition | Reviews sub-tab definition |
| --- | --- | --- |
| Avg Count | Average number of labels created. | Average number of labels reviewed. |
| Avg Total Time | Average label time across all labels. | Average review time across all labels. |
| Avg Time per | Average time labeling per label. | Average time reviewing per label. |
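
The three metrics are related arithmetically: "Time per" is total time divided by label count. One plausible reading of the definitions, sketched with illustrative data (the function and field names are not Labelbox's):

```python
from collections import defaultdict

def performance_metrics(labels):
    """Aggregate per-labeler stats from (labeler, label_time_seconds) pairs."""
    per_labeler = defaultdict(lambda: {"count": 0, "total_time": 0.0})
    for labeler, seconds in labels:
        per_labeler[labeler]["count"] += 1
        per_labeler[labeler]["total_time"] += seconds

    n = len(per_labeler)
    avg_count = sum(s["count"] for s in per_labeler.values()) / n
    avg_total_time = sum(s["total_time"] for s in per_labeler.values()) / n
    # "Time per" is total time divided by label count.
    avg_time_per = avg_total_time / avg_count
    return {"Avg Count": avg_count,
            "Avg Total Time": avg_total_time,
            "Avg Time per": avg_time_per}

labels = [("ana", 40), ("ana", 50), ("bo", 30)]
print(performance_metrics(labels))
# {'Avg Count': 1.5, 'Avg Total Time': 60.0, 'Avg Time per': 40.0}
```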

Individual performance

The second chart is a per-user chart that displays the following individualized metrics when you click a user’s name. Click a metric to populate the chart.

| Metric | Labels sub-tab | Reviews sub-tab |
| --- | --- | --- |
| Count | Number of labels created. | Number of labels reviewed. |
| Total Time | Total time spent labeling in the project. | Total time spent reviewing in the project. |
| Time per | Average labeling time per label. | Average reviewing time per label. |

Export tab

In the Export tab, you can generate JSON or CSV exports of your labels.

For more details on export formats, read our docs on exporting labels.
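
As a rough illustration of working with a JSON export, the snippet below counts labels per creator. Field names vary by export version and editor, so the "Created By" key here is an assumption:

```python
import json
from collections import Counter

# JSON exports are a list of label records, one per labeled asset.
with open("export.json") as f:
    rows = json.load(f)

labels_per_user = Counter(row.get("Created By", "unknown") for row in rows)
print(f"{len(rows)} labels exported")
for user, count in labels_per_user.most_common():
    print(f"{user}: {count}")
```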

Settings tab

From the Settings tab, you can attach or remove datasets, modify the configuration of the Label editor, manage members, adjust the percentage of labels to be reviewed, and delete a project.

Datasets

The Datasets section lists every dataset you can attach to or detach from your project. To add or remove data rows, click a dataset and select the data rows to add or remove. When you add assets to an attached dataset, they are automatically added to the labeling queue. When a dataset is detached from a project, all labels created against that dataset remain in the project, and all of its unlabeled data is removed from the queue.

For instructions on importing data, read our Data import docs.

Label editor

In the Label editor section, you can modify your Label editor configuration.

From the Tools tab of the "Configure editor" window, you can add to or edit your project's ontology.

You can also attach labeler instructions by clicking on the Instructions tab. Additional instructions can be helpful when a team of labelers is working on a more complex labeling project.
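
As a hypothetical illustration of what an ontology captures, the structure below pairs tools (for objects) with classifications (for questions). The exact JSON schema depends on your editor version, so treat the keys as illustrative:

```python
# A hypothetical ontology definition; the exact schema may differ,
# but an ontology names the drawing tools and the questions labelers answer.
ontology = {
    "tools": [
        {"tool": "rectangle", "name": "Bird", "color": "#FF8800"},
        {"tool": "polygon", "name": "Tree", "color": "#00AA55"},
    ],
    "classifications": [
        {
            "name": "weather",
            "instructions": "What is the weather in this image?",
            "type": "radio",
            "options": [{"value": "sunny"}, {"value": "overcast"}],
        }
    ],
}
```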

Members

In the Members section, you can add individual members, change member roles, and set up an external workforce. For more information about modifying organization and project-based roles, see Add/remove a member.

Quality

Here you can adjust the percentage of your project’s images that you would like to be reviewed. For more information, see Review labels.

Benchmarks is a QA tool for comparing Labels on an asset to a "gold standard".

Consensus is a QA tool for comparing a Label on an asset to all other Labels on that asset.
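
Labelbox computes its agreement scores internally; as a rough illustration of comparing one Label against the others on an asset, the sketch below averages bounding-box intersection-over-union (IoU). The scoring method and function names are illustrative, not Labelbox's actual algorithm:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) bounding boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def consensus_score(label, other_labels):
    """Average agreement of one label against all other labels on the asset."""
    return sum(iou(label, other) for other in other_labels) / len(other_labels)

# Two labelers drew nearly the same box; a third disagreed entirely.
print(consensus_score((0, 0, 10, 10), [(0, 0, 10, 11), (20, 20, 30, 30)]))  # ~0.45
```

For Benchmarks, the same kind of comparison would run against the designated "gold standard" label instead of peer labels.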

Automation

This is where you can toggle on/off Model-assisted labeling for your organization.

Danger zone

In addition to the Projects dashboard, this is another place where you can delete a project. Deleting a project also deletes all the labels that have been submitted for that project.
