Noisy Labels for Instance Segmentation (COCO-format) Collection

This collection groups COCO-N, VIPER, and VIPER-N: instance segmentation datasets with label noise (as well as clean GTA5 data).
This dataset repo packages the VIPER images together with clean COCO instance segmentation annotations, as used in:
If you are looking for the noisy benchmark labels (annotations-only), see:
All datasets are grouped in this collection:
The repository is laid out as:

- `images/train/...` (VIPER train images)
- `images/val/...` (VIPER val images)
- `coco/annotations/instances_train2017.json`
- `coco/annotations/instances_val2017.json`
- `reports/gallery/*/index.html`

Download everything and resolve the paths:

```python
from huggingface_hub import snapshot_download

viper_root = snapshot_download("kimhi/viper", repo_type="dataset")
images_root = f"{viper_root}/images"
ann_train = f"{viper_root}/coco/annotations/instances_train2017.json"
ann_val = f"{viper_root}/coco/annotations/instances_val2017.json"
print(images_root)
print(ann_val)
```
Load the annotations with `pycocotools`:

```python
from pycocotools.coco import COCO

coco = COCO(ann_val)

# Inspect the first image record and its instance annotations.
img_id = coco.getImgIds()[0]
img = coco.loadImgs([img_id])[0]
print(img)

ann_ids = coco.getAnnIds(imgIds=[img_id])
anns = coco.loadAnns(ann_ids)
print("#instances in image:", len(anns))
```
Download both repos and swap the annotation JSONs:
```python
from huggingface_hub import snapshot_download

viper_root = snapshot_download("kimhi/viper", repo_type="dataset")
viper_n_root = snapshot_download("kimhi/viper-n", repo_type="dataset")

images_root = f"{viper_root}/images"
ann_val_noisy = f"{viper_n_root}/benchmark/annotations/instances_val2017.json"
```
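To make the swap concrete, here is a self-contained sketch using two hypothetical miniature COCO dictionaries (not the real VIPER contents): the clean and noisy files share the same `images` and `categories` entries and differ only in `annotations`, so a trainer can keep pointing at the same image root while switching which JSON it reads.

```python
import json
import tempfile

# Hypothetical miniature "clean" COCO file (illustrative only):
clean = {
    "images": [{"id": 1, "file_name": "000001.png", "width": 1920, "height": 1080}],
    "annotations": [{"id": 1, "image_id": 1, "category_id": 1,
                     "bbox": [10, 10, 100, 50], "area": 5000.0, "iscrowd": 0}],
    "categories": [{"id": 1, "name": "car"}, {"id": 2, "name": "truck"}],
}
# "Noisy" version: same images/categories, perturbed class label and box.
noisy = {
    "images": clean["images"],
    "annotations": [{"id": 1, "image_id": 1, "category_id": 2,  # flipped label
                     "bbox": [14, 6, 104, 47], "area": 4888.0, "iscrowd": 0}],
    "categories": clean["categories"],
}

# Write both to disk the way the two repos store them.
paths = {}
for name, blob in [("clean", clean), ("noisy", noisy)]:
    with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
        json.dump(blob, f)
        paths[name] = f.name

# "Swapping the annotation JSONs" = changing only which file the loader reads.
a = json.load(open(paths["clean"]))
b = json.load(open(paths["noisy"]))
assert a["images"] == b["images"]            # identical image lists
assert a["annotations"] != b["annotations"]  # labels differ
```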
See the paper repo for scripts/recipes to generate/apply noisy labels to other COCO-format instance segmentation datasets:
Hugging Face’s built-in dataset viewer does not currently render COCO instance-segmentation JSONs directly.
You can still browse images in the Files tab, and use pycocotools/Detectron2/MMDetection to visualize masks.
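To see how a mask is actually stored before reaching for those libraries: COCO crowd annotations use uncompressed RLE, which is simply alternating run lengths of background (0) and foreground (1) pixels in column-major order. A dependency-free sketch of that decoding step (pycocotools' `annToMask` handles this, plus polygons and compressed RLE):

```python
def decode_uncompressed_rle(counts, height, width):
    """Expand alternating 0/1 run lengths into a 2-D binary mask.

    COCO stores the runs in column-major (Fortran) order, starting with 0s.
    """
    flat, val = [], 0
    for run in counts:
        flat.extend([val] * run)
        val = 1 - val  # runs alternate between background and foreground
    # Column-major layout: pixel (row r, col c) lives at flat[c * height + r].
    return [[flat[c * height + r] for c in range(width)] for r in range(height)]

# Toy 2x3 mask: 2 background pixels, then 4 foreground pixels.
mask = decode_uncompressed_rle([2, 4], height=2, width=3)
# mask == [[0, 1, 1], [0, 1, 1]]
```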
```bibtex
@misc{kimhi2025noisyannotationssemanticsegmentation,
  title={Noisy Annotations in Semantic Segmentation},
  author={Moshe Kimhi and Omer Kerem and Eden Grad and Ehud Rivlin and Chaim Baskin},
  year={2025},
  eprint={2406.10891},
  archivePrefix={arXiv},
}
```
License: CC BY-NC 4.0 (Attribution-NonCommercial 4.0 International).