⚠️ TODO: awaiting information; links to be added.
About the Dataset
- This dataset was created using LeRobot
- ~130 hours of real-world scenarios
- Main Tasks
- FlattenFold
- Single task
- Initial state: T-shirts are randomly tossed onto the table, presenting random crumpled configurations
- Manipulation task: Operate the robotic arm to unfold the garment, then fold it
- HangCloth
- Single task
- Initial state: Hanger is randomly placed, garment is randomly positioned on the table
- Manipulation task: Operate the robotic arm to thread the hanger through the garment, then hang it on the rod
- TeeShirtSort
- Garment classification and arrangement task
- Initial state: A garment is randomly drawn from the laundry basket
- Classification: Determine whether the garment is a T-shirt or a dress shirt
- Manipulation task:
- If it is a T-shirt, fold the garment
- If it is a dress shirt, expose the collar, then push it to one side of the table
Episode counts
| Task | Base (episodes) | DAgger (episodes) | Total |
|---|---|---|---|
| FlattenFold | 3,055 | 3,457 | 6,512 |
| HangCloth | 6,954 | 686 | 7,640 |
| TeeShirtSort | 5,988 | - | 5,988 |
| Total | 19,608 | 4,143 | 23,751 |
Dataset Structure
Folder hierarchy
Under each task directory, data is partitioned into two subsets: base and dagger.
- base contains the original demonstration trajectories of robotic-arm manipulation for the garment arrangement tasks.
- dagger contains on-policy recovery trajectories collected via iterative DAgger, designed to cover failure-recovery modes absent from static demonstrations (a sketch for downloading a single subset follows the folder tree below).
```
Kai0-data/
├── FlattenFold/
│ ├── base/
│ │ ├── data/
│ │ │ ├── chunk-000/
│ │ │ │ ├── episode_000000.parquet
│ │ │ │ ├── episode_000001.parquet
│ │ │ │ └── ...
│ │ │ └── ...
│ │ ├── videos/
│ │ │ ├── chunk-000/
│ │ │ │ ├── observation.images.hand_left/
│ │ │ │ │ ├── episode_000000.mp4
│ │ │ │ │ ├── episode_000001.mp4
│ │ │ │ │ └── ...
│ │ │ │ ├── observation.images.hand_right/
│ │ │ │ │ ├── episode_000000.mp4
│ │ │ │ │ ├── episode_000001.mp4
│ │ │ │ │ └── ...
│ │ │ │ ├── observation.images.top_head/
│ │ │ │ │ ├── episode_000000.mp4
│ │ │ │ │ ├── episode_000001.mp4
│ │ │ │ │ └── ...
│ │ │ │ └── ...
│ │ │ └── ...
│ │ └── meta/
│ │ ├── info.json
│ │ ├── episodes.jsonl
│ │ ├── tasks.jsonl
│ │ └── episodes_stats.jsonl
│ └── dagger/
├── HangCloth/
│ ├── base/
│ └── dagger/
├── TeeShirtSort/
│ ├── base/
│ └── dagger/
└── README.md
```
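Because each subset lives under its own task folder, a single slice of the data can be fetched on its own. A minimal sketch using huggingface_hub's `allow_patterns`, assuming the task folders sit at the repo root exactly as in the tree above:

```python
from huggingface_hub import snapshot_download

# A sketch: download only the DAgger subset of FlattenFold.
# Assumes the repo follows the folder hierarchy shown above.
snapshot_download(
    repo_id="OpenDriveLab-org/kai0",
    repo_type="dataset",
    allow_patterns=["FlattenFold/dagger/*"],
    local_dir="/where/you/want/to/save",
)
```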
Details
info.json
The basic structure of info.json:
```json
{
"codebase_version": "v2.1",
"robot_type": "agilex",
"total_episodes": ...,
"total_frames": ...,
"total_tasks": ...,
"total_videos": ...,
"total_chunks": ...,
"chunks_size": ...,
"fps": ...,
"splits": {
"train": ...
},
"data_path": "data/chunk-{episode_chunk:03d}/episode_{episode_index:06d}.parquet",
"video_path": "videos/chunk-{episode_chunk:03d}/{video_key}/episode_{episode_index:06d}.mp4",
"features": {
"observation.images.top_head": {
"dtype": "video",
"shape": [
480,
640,
3
],
"names": [
"height",
"width",
"channel"
],
"info": {
"video.height": 480,
"video.width": 640,
"video.codec": "av1",
"video.pix_fmt": "yuv420p",
"video.is_depth_map": false,
"video.fps": 30,
"video.channels": 3,
"has_audio": false
}
},
"observation.images.hand_left": {
...
},
"observation.images.hand_right": {
...
},
"observation.state": {
"dtype": "float32",
"shape": [
14
],
"names": null
},
"action": {
"dtype": "float32",
"shape": [
14
],
"names": null
},
"timestamp": {
"dtype": "float32",
"shape": [
1
],
"names": null
},
"frame_index": {
"dtype": "int64",
"shape": [
1
],
"names": null
},
"episode_index": {
"dtype": "int64",
"shape": [
1
],
"names": null
},
"index": {
"dtype": "int64",
"shape": [
1
],
"names": null
},
"task_index": {
"dtype": "int64",
"shape": [
1
],
"names": null
}
}
}
```
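The data_path and video_path templates resolve to per-episode files. A minimal sketch, assuming chunk assignment follows LeRobot's usual `episode_index // chunks_size` convention and the meta path points at your local copy:

```python
import json

# Resolve one episode's data and video files from the info.json templates.
with open("FlattenFold/base/meta/info.json") as f:
    info = json.load(f)

episode_index = 0
episode_chunk = episode_index // info["chunks_size"]

print(info["data_path"].format(
    episode_chunk=episode_chunk, episode_index=episode_index
))  # data/chunk-000/episode_000000.parquet
print(info["video_path"].format(
    episode_chunk=episode_chunk,
    video_key="observation.images.top_head",
    episode_index=episode_index,
))  # videos/chunk-000/observation.images.top_head/episode_000000.mp4
```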
Parquet file format
| Field Name | Shape | Meaning |
|---|---|---|
| observation.state | [N, 14] | Joint angles: left arm [:, :6], right arm [:, 7:13]; gripper opening: left [:, 6], right [:, 13] |
| action | [N, 14] | Joint angles: left arm [:, :6], right arm [:, 7:13]; gripper opening: left [:, 6], right [:, 13] |
| timestamp | [N, 1] | Time elapsed since the start of the episode (in seconds) |
| frame_index | [N, 1] | Index of this frame within the current episode (0-indexed) |
| episode_index | [N, 1] | Index of the episode this frame belongs to |
| index | [N, 1] | Global unique index across all frames in the dataset |
| task_index | [N, 1] | Index identifying the task type being performed |
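As a quick illustration of the layout above, a sketch that reads one episode with pandas and splits the 14-D vectors into arm-joint and gripper channels (the file path is hypothetical; adjust it to your local copy):

```python
import numpy as np
import pandas as pd

# Read one episode and slice the 14-D state vector per the table above.
df = pd.read_parquet("FlattenFold/base/data/chunk-000/episode_000000.parquet")
state = np.stack(df["observation.state"].to_numpy())  # [N, 14]

left_joints, left_gripper = state[:, :6], state[:, 6]      # left arm
right_joints, right_gripper = state[:, 7:13], state[:, 13]  # right arm
```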
tasks.jsonl
Contains task language prompts (natural language instructions) that specify the manipulation task to be performed. Each entry maps a task_index to its corresponding task description, which can be used for language-conditioned policy training.
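A minimal sketch for reading the prompts, assuming the standard LeRobot tasks.jsonl layout of one JSON object per line with task_index and task fields (the path is hypothetical):

```python
import json

# Map each task_index to its natural-language instruction.
with open("FlattenFold/base/meta/tasks.jsonl") as f:
    prompts = {e["task_index"]: e["task"] for e in (json.loads(l) for l in f if l.strip())}
```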
Download the Dataset
Python Script
```python
from huggingface_hub import hf_hub_download, snapshot_download
from datasets import load_dataset
# Download a single file
hf_hub_download(
repo_id="OpenDriveLab-org/kai0",
filename="episodes.jsonl",
subfolder="meta",
repo_type="dataset",
local_dir="where/you/want/to/save"
)
# Download a specific folder
snapshot_download(
repo_id="OpenDriveLab-org/kai0",
local_dir="/where/you/want/to/save",
repo_type="dataset",
allow_patterns=["data/*"]
)
# Load the entire dataset
dataset = load_dataset("OpenDriveLab-org/kai0")
```
Terminal (CLI)
```bash
# Download a single file
hf download OpenDriveLab-org/kai0 \
--include "meta/info.json" \
--repo-type dataset \
--local-dir "/where/you/want/to/save"
# Download a specific folder
hf download OpenDriveLab-org/kai0 \
--repo-type dataset \
--include "meta/*" \
--local-dir "/where/you/want/to/save"
# Download the entire dataset
hf download OpenDriveLab-org/kai0 \
--repo-type dataset \
--local-dir "/where/you/want/to/save"
Load the Dataset
For LeRobot version < 0.4.0
Choose the appropriate import based on your version:
| Version | Import Path |
|---|---|
| <= 0.1.0 | `from lerobot.common.datasets.lerobot_dataset import LeRobotDataset` |
| > 0.1.0 and < 0.4.0 | `from lerobot.datasets.lerobot_dataset import LeRobotDataset` |
```python
# For version <= 0.1.0
from lerobot.common.datasets.lerobot_dataset import LeRobotDataset
# For version > 0.1.0 and < 0.4.0
from lerobot.datasets.lerobot_dataset import LeRobotDataset
# Load the dataset
dataset = LeRobotDataset(repo_id='where/the/dataset/you/stored')
```
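Indexing the loaded dataset returns a single frame as a dict of tensors keyed by the features listed in info.json. A quick sanity check (shapes assume the feature definitions above):

```python
sample = dataset[0]
print(sample["observation.state"].shape)  # torch.Size([14])
print(sample["action"].shape)             # torch.Size([14])
print(sample["observation.images.top_head"].shape)  # channel-first image tensor
```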
For LeRobot version >= 0.4.0
You need to migrate the dataset from v2.1 to v3.0 first. See the official documentation: Migrate the dataset from v2.1 to v3.0
```bash
python -m lerobot.datasets.v30.convert_dataset_v21_to_v30 --repo-id=<HF_USER/DATASET_ID>
```
⚠️ TODO: awaiting information.
License and Citation
All the data and code within this repo are under [TBD]. Please consider citing our project if it helps your research.
```bibtex
@misc{,
title={},
author={},
howpublished={\url{}},
year={}
}
```