Nima Ghorbani committed
Commit 20b5ea6 · 1 Parent(s): 2fe587e

Use absolute resolve/main/ URLs for embedded videos in README

HF's README renderer was serving the LFS pointer instead of the video
blob when given relative paths inside <video src=...>. Switching to the
explicit /resolve/main/ URLs (the same pattern the banner image uses)
makes the videos play inline on the dataset card.

Also swap the demo videos' inline style for the standard width="500"
attribute, which HF's HTML sanitizer keeps intact.
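The URL rewrite the commit performs by hand is mechanical enough to sketch as a script. The helper below is a hypothetical illustration, not part of the commit: the function name `absolutize_video_srcs` and the default revision are my assumptions, and it only rewrites relative `<video src="...">` attributes, leaving already-absolute URLs untouched.

```python
import re

# Dataset repo whose README embeds the videos (taken from the diff below).
REPO_URL = "https://huggingface.co/datasets/cubert-gmbh/XMR_Demo_Object_Tracking"

def absolutize_video_srcs(readme: str, repo_url: str = REPO_URL,
                          revision: str = "main") -> str:
    """Rewrite relative <video src="..."> paths to /resolve/<revision>/ URLs.

    Hypothetical helper mirroring the commit's change: HF serves the raw
    LFS blob (not the pointer file) at /resolve/ URLs, so the videos play
    inline on the dataset card.
    """
    # Negative lookahead skips src values that already carry a scheme.
    pattern = re.compile(r'(<video\s+src=")(?!https?://)([^"]+)(")')
    return pattern.sub(
        lambda m: f"{m.group(1)}{repo_url}/resolve/{revision}/{m.group(2)}{m.group(3)}",
        readme,
    )
```

For example, `absolutize_video_srcs('<video src="assets/clip.mp4" controls></video>')` yields the same `/resolve/main/assets/clip.mp4` form the diff introduces.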

Files changed (1):
  README.md (+4 −4)
README.md CHANGED
@@ -33,7 +33,7 @@ Video spectroscopy beyond the visible spectrum, applied to object tracking in a
 
 Object tracking is a general computer-vision problem — the "object" could be a vehicle, a container, a piece of equipment, or an animal. In this demo the target class is humans: a crowded public scene with people in near-identical clothing is the hardest case for a standard RGB tracker and the clearest showcase for what hyperspectral features add. The same pipeline and the same spectroscopic arguments carry over to any other object class whose materials differ beyond the visible range.
 
-<video src="assets/UC1_invisible_ink_tracking.mp4" controls muted loop playsinline width="100%"></video>
+<video src="https://huggingface.co/datasets/cubert-gmbh/XMR_Demo_Object_Tracking/resolve/main/assets/UC1_invisible_ink_tracking.mp4" controls muted loop playsinline width="100%"></video>
 
 *Teaser — invisible-ink marking. One actor is sprayed with an ink that is invisible to the eye but bright in a specific spectral band; a Spectral Angle Mapper then tracks that signature with no model, no training, and no drift. The zoomed-in crop tracks the centre of the ink blob.*
 
@@ -78,7 +78,7 @@ The full tracking pipeline is built in [**Cuvis.AI**](https://docs.cuvis.ai) —
 ### Passive tracking
 
 <p align="center">
-<video src="measurements/cu3s/2026_04_15_16_28_10/Auto_000_gt_vs_pred.mp4" controls muted loop playsinline style="max-width:500px;width:100%;"></video>
+<video src="https://huggingface.co/datasets/cubert-gmbh/XMR_Demo_Object_Tracking/resolve/main/measurements/cu3s/2026_04_15_16_28_10/Auto_000_gt_vs_pred.mp4" controls muted loop playsinline width="500"></video>
 </p>
 
 *Ground truth vs prediction during the passive phase — the three actors walk in single file with mutual occlusions; the hyperspectral tracker preserves the correct ID lock on T.*
@@ -86,7 +86,7 @@ The full tracking pipeline is built in [**Cuvis.AI**](https://docs.cuvis.ai) —
 ### Active tracking
 
 <p align="center">
-<video src="measurements/cu3s/2026_04_20_16_28_54/Auto_000_combined.mp4" controls muted loop playsinline style="max-width:500px;width:100%;"></video>
+<video src="https://huggingface.co/datasets/cubert-gmbh/XMR_Demo_Object_Tracking/resolve/main/measurements/cu3s/2026_04_20_16_28_54/Auto_000_combined.mp4" controls muted loop playsinline width="500"></video>
 </p>
 
 *Combined view after T sprays D1 with the spectral ink — the marker becomes a deterministic, training-free signature that the tracker locks onto.*
@@ -94,7 +94,7 @@ The full tracking pipeline is built in [**Cuvis.AI**](https://docs.cuvis.ai) —
 ### All humans (baseline)
 
 <p align="center">
-<video src="measurements/cu3s/2026_04_15_16_28_10/Auto_000-all-humans.mp4" controls muted loop playsinline style="max-width:500px;width:100%;"></video>
+<video src="https://huggingface.co/datasets/cubert-gmbh/XMR_Demo_Object_Tracking/resolve/main/measurements/cu3s/2026_04_15_16_28_10/Auto_000-all-humans.mp4" controls muted loop playsinline width="500"></video>
 </p>
 
 *Untargeted run with every detected person tracked — the same pipeline configured for class-level tracking instead of single-target lock.*