# RepCountA (Lance)

This repository contains the RepCountA/LLSP split data in Lance format.
## Dataset Contents

- This package stores the train/validation/test splits as Lance datasets under `data/*.lance`.
- Each row includes metadata and an embedded `video_blob`.
## Schema

- `video_id` - stem id (without extension)
- `source_name` - original file name from the annotation CSV
- `split` - one of `train`, `validation`, `test`
- `action_type` - action category
- `count` - repetition count annotation
- `cycle_bounds_json` - JSON array of boundary frame markers from L1..L302
- `video_ext` - file extension
- `video_size_bytes` - media file size
- `video_blob` - raw media bytes
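The schema above can be consumed with only the standard library. This sketch uses a hypothetical row dict (the field names match the schema; the values are made up): it materializes `video_blob` to a file so ordinary video tools can open it, and parses `cycle_bounds_json`. Pairing the markers into (start, end) cycles is an assumption based on the L1..L302 annotation columns, not a documented guarantee.

```python
import json
import os
import tempfile

# Hypothetical row; real rows come from a Lance split.
row = {
    "video_id": "stu1_10",
    "video_ext": "mp4",
    "video_size_bytes": 4,
    "video_blob": b"\x00\x00\x00\x18",   # stand-in for real media bytes
    "cycle_bounds_json": "[12, 48, 50, 91]",
}

# Write the embedded media to disk under its original extension.
out_path = os.path.join(tempfile.mkdtemp(),
                        row["video_id"] + "." + row["video_ext"])
with open(out_path, "wb") as f:
    f.write(row["video_blob"])
assert os.path.getsize(out_path) == row["video_size_bytes"]

# Assumed pairing: even indices are cycle starts, odd indices are ends.
bounds = json.loads(row["cycle_bounds_json"])
cycles = list(zip(bounds[0::2], bounds[1::2]))
print(cycles)  # -> [(12, 48), (50, 91)]
```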
## Notes

This package is built from the local archive `RepCountA.tar.gz` (LLSP structure).
## Countix Compatibility Subset

For internal compatibility use, this repository provides a Countix-compatible subset alias based on the full RepCountA package:

- `countix_compat_all` = union of `train` + `validation` + `test` (all rows, no filtering)
- Total rows: 1041
- Manifest: `subsets/countix_compat_all.tsv`
- Video IDs only: `subsets/countix_compat_video_ids.tsv`
- QA-ready TSV: `subsets/countix_compat_qa.tsv` (includes `video_id`, `count`, `question`, `answer`, `lance_uri`)
- Validated QA TSV (recommended): `subsets/countix_compat_qa_valid.tsv`
- Invalid QA rows report: `subsets/countix_compat_qa_invalid.tsv`
- QA rows with a missing target count: 1 (`stu6_44`)
Note: this is a compatibility alias for workflow integration and is not the official Countix benchmark release.
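A short sketch of consuming the QA TSVs with the standard `csv` module. The column names follow the list above; the two sample rows (including the question text) are invented for illustration, and the missing-count filter mirrors the validation step that separates `countix_compat_qa_valid.tsv` from the invalid-rows report.

```python
import csv
import io

# Invented sample standing in for subsets/countix_compat_qa.tsv.
sample_tsv = (
    "video_id\tcount\tquestion\tanswer\tlance_uri\n"
    "stu1_10\t4\tHow many repetitions are shown?\t4\tdata/train.lance\n"
    "stu6_44\t\tHow many repetitions are shown?\t\tdata/test.lance\n"
)

rows = list(csv.DictReader(io.StringIO(sample_tsv), delimiter="\t"))

# Drop rows whose target count is missing (the report above flags
# exactly one such row, stu6_44).
valid = [r for r in rows if r["count"].strip()]
print(len(rows), len(valid))  # -> 2 1
```

For the real files, replace `io.StringIO(sample_tsv)` with `open("subsets/countix_compat_qa.tsv", newline="")`.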