The full dataset viewer is not available for this dataset; only a preview of the rows is shown below.
The dataset generation failed because of a cast error
Error code: DatasetGenerationCastError
Exception: DatasetGenerationCastError
Message: An error occurred while generating the dataset
All the data files must have the same columns, but at some point there are 1 new columns ({'stats'}) and 2 missing columns ({'tasks', 'length'}).
This happened while the json dataset builder was generating data using
hf://datasets/BobShan/sim_transfer_cube_script_4_v2.1/meta/episodes_stats.jsonl (at revision ad08586736b71075fa6db9992abe0bce75ea8ac8)
Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations)
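Until the files are edited or split into separate configurations, a consumer can sidestep the cast by pointing the json builder at one metadata file at a time. A minimal sketch, assuming the standard `datasets` API and that hf:// paths are accepted in `data_files` (the paths mirror the ones in the error):

```python
from datasets import load_dataset

META = "hf://datasets/BobShan/sim_transfer_cube_script_4_v2.1/meta"

# Load each metadata file as its own dataset, so the `stats` schema of
# episodes_stats.jsonl is never cast against the tasks/length schema of
# episodes.jsonl (the exact mismatch reported above).
episodes = load_dataset("json", data_files=f"{META}/episodes.jsonl", split="train")
stats = load_dataset("json", data_files=f"{META}/episodes_stats.jsonl", split="train")

print(episodes.column_names)  # expected: ['episode_index', 'tasks', 'length']
print(stats.column_names)     # expected: ['episode_index', 'stats']
```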
Traceback:

Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1831, in _prepare_split_single
    writer.write_table(table)
  File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 714, in write_table
    pa_table = table_cast(pa_table, self._schema)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/datasets/table.py", line 2272, in table_cast
    return cast_table_to_schema(table, schema)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/datasets/table.py", line 2218, in cast_table_to_schema
    raise CastError(
datasets.table.CastError: Couldn't cast
episode_index: int64
stats: struct<action: struct<min: list<item: double>, max: list<item: double>, mean: list<item: double>, st (... 2118 chars omitted)
  child 0, action: struct<min: list<item: double>, max: list<item: double>, mean: list<item: double>, std: list<item: d (... 33 chars omitted)
    child 0, min: list<item: double>
      child 0, item: double
    child 1, max: list<item: double>
      child 0, item: double
    child 2, mean: list<item: double>
      child 0, item: double
    child 3, std: list<item: double>
      child 0, item: double
    child 4, count: list<item: int64>
      child 0, item: int64
  child 1, observation.images.angle: struct<min: list<item: list<item: list<item: double>>>, max: list<item: list<item: list<item: double (... 129 chars omitted)
    child 0, min: list<item: list<item: list<item: double>>>
      child 0, item: list<item: list<item: double>>
        child 0, item: list<item: double>
          child 0, item: double
    child 1, max: list<item: list<item: list<item: double>>>
      child 0, item: list<item: list<item: double>>
        child 0, item: list<item: double>
          child 0, item: double
    child 2, mean: list<item: list<item: list<item: double>>>
      child 0, item: list<item: list<item: double>>
        child 0, item: list<item: double>
          child 0, item: double
    child 3, std: list<item: list<item: list<item: double>>>
...
t64
  child 9, episode_index: struct<min: list<item: int64>, max: list<item: int64>, mean: list<item: double>, std: list<item: dou (... 31 chars omitted)
    child 0, min: list<item: int64>
      child 0, item: int64
    child 1, max: list<item: int64>
      child 0, item: int64
    child 2, mean: list<item: double>
      child 0, item: double
    child 3, std: list<item: double>
      child 0, item: double
    child 4, count: list<item: int64>
      child 0, item: int64
  child 10, index: struct<min: list<item: int64>, max: list<item: int64>, mean: list<item: double>, std: list<item: dou (... 31 chars omitted)
    child 0, min: list<item: int64>
      child 0, item: int64
    child 1, max: list<item: int64>
      child 0, item: int64
    child 2, mean: list<item: double>
      child 0, item: double
    child 3, std: list<item: double>
      child 0, item: double
    child 4, count: list<item: int64>
      child 0, item: int64
  child 11, task_index: struct<min: list<item: int64>, max: list<item: int64>, mean: list<item: double>, std: list<item: dou (... 31 chars omitted)
    child 0, min: list<item: int64>
      child 0, item: int64
    child 1, max: list<item: int64>
      child 0, item: int64
    child 2, mean: list<item: double>
      child 0, item: double
    child 3, std: list<item: double>
      child 0, item: double
    child 4, count: list<item: int64>
      child 0, item: int64
to
{'episode_index': Value('int64'), 'tasks': List(Value('string')), 'length': Value('int64')}
because column names don't match
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1339, in compute_config_parquet_and_info_response
    parquet_operations = convert_to_parquet(builder)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 972, in convert_to_parquet
    builder.download_and_prepare(
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 894, in download_and_prepare
    self._download_and_prepare(
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 970, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1702, in _prepare_split
    for job_id, done, content in self._prepare_split_single(
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1833, in _prepare_split_single
    raise DatasetGenerationCastError.from_cast_error(
datasets.exceptions.DatasetGenerationCastError: An error occurred while generating the dataset
All the data files must have the same columns, but at some point there are 1 new columns ({'stats'}) and 2 missing columns ({'tasks', 'length'}).
This happened while the json dataset builder was generating data using
hf://datasets/BobShan/sim_transfer_cube_script_4_v2.1/meta/episodes_stats.jsonl (at revision ad08586736b71075fa6db9992abe0bce75ea8ac8)
Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations)
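Before editing the data files to have matching columns, the mismatch is easy to confirm by hand: compare the keys of the first record in each metadata file. A short sketch, assuming local copies of the two files (the filenames below are hypothetical and mirror the repo's meta/ layout):

```python
import json

# Compare the top-level keys of the first record in each metadata file.
for name in ("episodes.jsonl", "episodes_stats.jsonl"):
    with open(name, encoding="utf-8") as f:
        first = json.loads(f.readline())
    print(name, sorted(first))

# Per the cast error above, this should print:
#   episodes.jsonl ['episode_index', 'length', 'tasks']
#   episodes_stats.jsonl ['episode_index', 'stats']
```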
| episode_index (int64) | tasks (list) | length (int64) |
|---|---|---|
| 0 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 1 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 2 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 3 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 4 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 5 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 6 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 7 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 8 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 9 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 10 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 11 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 12 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 13 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 14 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 15 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 16 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 17 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 18 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 19 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 20 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 21 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 22 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 23 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 24 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 25 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 26 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 27 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 28 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 29 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 30 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 31 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 32 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 33 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 34 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 35 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 36 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 37 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 38 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 39 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 40 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 41 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 42 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 43 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 44 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 45 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 46 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 47 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 48 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
| 49 | ["Aloha_sim_recorded_insert_dataset."] | 400 |
The trailing rows show null `tasks` and `length` because they come from meta/episodes_stats.jsonl, whose records carry a `stats` column instead of these two:

| episode_index (int64) | tasks (list) | length (int64) |
|---|---|---|
| 0 | null | null |
| 1 | null | null |
| 2 | null | null |
| 3 | null | null |
| 4 | null | null |
| 5 | null | null |
| 6 | null | null |
| 7 | null | null |
| 8 | null | null |
| 9 | null | null |
| 10 | null | null |
| 11 | null | null |
| 12 | null | null |
| 13 | null | null |
| 14 | null | null |
| 15 | null | null |
| 16 | null | null |
| 17 | null | null |
| 18 | null | null |
| 19 | null | null |
| 20 | null | null |
| 21 | null | null |
| 22 | null | null |
| 23 | null | null |
| 24 | null | null |
| 25 | null | null |
| 26 | null | null |
| 27 | null | null |
| 28 | null | null |
| 29 | null | null |
| 30 | null | null |
| 31 | null | null |
| 32 | null | null |
| 33 | null | null |
| 34 | null | null |
| 35 | null | null |
| 36 | null | null |
| 37 | null | null |
| 38 | null | null |
| 39 | null | null |
| 40 | null | null |
| 41 | null | null |
| 42 | null | null |
| 43 | null | null |
| 44 | null | null |
| 45 | null | null |
| 46 | null | null |
| 47 | null | null |
| 48 | null | null |
| 49 | null | null |
End of preview.