Covering everything you need to know to build AI products faster.
Get started with active learning
Discover how to get started with active learning using three techniques that consistently help ML teams quickly identify the data that will most improve model performance.
How to scale up your labeling operations while maintaining quality
Many ML teams are eager to label all of their data at once, but doing so can increase both time and cost. Learn how to build an iterative approach to your labeling operations that maintains quality while scaling.
How to maintain quality and cost with advanced analytics
Delays in quality management, or a lack of insight into labeling quality, can slow model development. Learn how Enterprise teams can maintain quality and control cost with the project performance dashboard and advanced analytics.
How to customize your annotation review process
Custom workflows can optimize how labeled data is reviewed across multiple tasks and reviewers. Workflows is a new feature that gives teams the flexibility to tailor their review process for faster iteration cycles.
How to search, surface, and prioritize data within a project
The Data Rows tab is the central hub for all data rows in a project. View, manage, and filter data rows to better prioritize data for labeling and accelerate model development.
How to prepare and submit a batch for labeling
High-quality training data is crucial to the success of any ML project. Rather than queueing an entire dataset for labeling, queueing data rows in batches gives teams greater control and flexibility over the prioritization of a project's labeling queue.
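Below is a minimal sketch of batch submission with the Labelbox Python SDK. The API key, project ID, dataset ID, batch name, and priority value are placeholders or assumptions; verify exact method signatures against the SDK docs for your version.

```python
import labelbox as lb

# Connect to Labelbox (replace with your own API key).
client = lb.Client(api_key="<YOUR_API_KEY>")

# Placeholder IDs -- substitute the project and dataset you are working with.
project = client.get_project("<PROJECT_ID>")
dataset = client.get_dataset("<DATASET_ID>")

# Select a slice of data rows instead of queueing the entire dataset.
data_row_ids = [dr.uid for dr in dataset.data_rows()][:100]

# Submit the slice as a batch; priority controls where it lands in the
# labeling queue (lower numbers served first -- an assumption, check the docs).
batch = project.create_batch(
    name="initial-100-rows",   # hypothetical batch name
    data_rows=data_row_ids,
    priority=1,
)
print(f"Queued {len(data_row_ids)} data rows in batch '{batch.name}'")
```

Queueing a small, targeted slice like this lets a team label it, review the results, and only then decide what to queue next.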
A new way to queue & review
A migration guide for the switch to Batch-based queueing, Workflows, and the Data Rows tab.
SDK Changes: A new way to queue & review
What is changing? Labelbox is deprecating QueueMode, so all projects will be required to use Batch-based queueing by the end of Q1 2023. We will release a new SDK version that automatically sets up all new projects with this functionality. Rollout: Free / EDU / Starter customers can expect the changes on November 21st; Pro and Enterprise customers can expect the changes on a rolling basis starting December 14th.
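As a rough migration sketch (assuming the Python SDK; the names and IDs below are placeholders), new projects can simply omit QueueMode at creation time and queue work through batches instead of attached datasets:

```python
import labelbox as lb

client = lb.Client(api_key="<YOUR_API_KEY>")

# With Batch-based queueing as the default, there is no need to pass a
# queue_mode argument when creating a project.
project = client.create_project(
    name="my-batch-project",        # hypothetical project name
    media_type=lb.MediaType.Image,  # assumption: an image-labeling project
)

# Queue data rows by creating a batch rather than attaching a whole dataset,
# as in the batch example above.
dataset = client.get_dataset("<DATASET_ID>")
project.create_batch(
    name="first-batch",
    data_rows=[dr.uid for dr in dataset.data_rows()][:50],
)
```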
How to find and fix label errors
Learn how you can use Labelbox Model to visually compare your ground truths and predictions to identify and fix label errors.
How to curate and version your training datasets and hyperparameters
Learn how you can use Model to configure, track, and compare essential training hyperparameters alongside training data and data splits. Easily reproduce model experiments, observe differences across runs, and share best practices with your team.
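As an illustration, here is a minimal sketch of logging an experiment with the Python SDK. The model name, run name, ontology ID, data row IDs, and hyperparameter values are all placeholders, and the update_config helper on model runs is an assumption about recent SDK versions; check the docs for the version you use.

```python
import labelbox as lb

client = lb.Client(api_key="<YOUR_API_KEY>")

# Hypothetical model and model run; the ontology ID is a placeholder.
model = client.create_model(name="churn-classifier", ontology_id="<ONTOLOGY_ID>")
model_run = model.create_model_run(name="run-001")

# Version the training data alongside the run by attaching the data rows
# used in this experiment (placeholder IDs).
model_run.upsert_data_rows(["<DATA_ROW_ID_1>", "<DATA_ROW_ID_2>"])

# Log the hyperparameters for this run so experiments can be compared and
# reproduced later (update_config is an assumed helper -- verify in your SDK).
model_run.update_config({
    "learning_rate": 3e-4,
    "batch_size": 32,
    "epochs": 20,
})
```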
Get started for free or see how Labelbox can fit your specific needs by requesting a demo