---
configs:
- config_name: raw
  data_files:
  - split: train
    path: data/raw/*.parquet
- config_name: normalized
  data_files:
  - split: train
    path: data/normalized/*.parquet
---

# 🏭 FactoryNet: A Unified Multi-Machine Industrial Dataset

## Overview
FactoryNet is a large-scale, machine-learning-ready foundation dataset for industrial robotics and manufacturing anomaly detection. Historically, industrial datasets have been heavily siloed: every manufacturer and research team uses different column names, units, and structures.

FactoryNet solves this by merging massive, high-frequency physical datasets from completely different machines into a **single, mathematically unified coordinate system**. By standardizing axes, effort signals, and kinematic feedback, this dataset allows neural networks to learn universal physical relationships across hardware boundaries.

## 📦 Dataset Composition
This repository currently contains millions of rows of high-frequency sensor data merged from three distinct open-source industrial datasets:

### 1. UMich CNC Mill Tool Wear Dataset
* **Machine:** 3-Axis CNC Mill
* **Task:** Machining wax blocks under varying feedrates and clamp pressures.
* **Anomalies:** Tool wear (Unworn vs. Worn) and visual inspection failures.
* **Original Source:** University of Michigan (via Kaggle)

### 2. AURSAD (Automated UR3e Screwdriving Anomaly Dataset)
* **Machine:** UR3e 6-Axis Collaborative Robot
* **Task:** Automated screwdriving using an OnRobot Screwdriver.
* **Anomalies:** Normal operation, damaged screws, missing screws, extra parts, and damaged threads.
* **Original Source:** Zenodo (Record 4487073)

### 3. voraus-AD (Yu-Cobot Pick-and-Place)
* **Machine:** Yu-Cobot 6-Axis Collaborative Robot
* **Task:** Industrial pick-and-place task on a conveyor belt.
* **Anomalies:** 12 diverse physical anomalies including axis wear (friction/miscommutation), gripping errors, collisions, and added axis weights.
* **Original Source:** voraus robotik (via Kaggle)

---

## 🏗️ The Unified Schema
To allow cross-machine learning, all raw variables (which previously had over 300 conflicting names) have been mapped to a standardized `FactoryNet` schema.

**Standardized Prefix Naming:**
* `setpoint_*`: The commanded target from the controller (e.g., `setpoint_pos_0`).
* `feedback_*`: The actual measured state from the sensors (e.g., `feedback_vel_1`).
* `effort_*`: The physical force/current applied (e.g., `effort_current_2`, `effort_torque_0`).
* `ctx_*`: Contextual metadata (e.g., `ctx_anomaly_label`, `ctx_busvoltage_0`).

**Standardized Axis Indexing:**
Regardless of how the original manufacturer numbered their joints (X/Y/Z or 1-6), all axes in this dataset are strictly zero-indexed (`0` through `5`).
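
As a minimal sketch of the convention, here is how vendor-specific columns map onto the schema; the raw column names below are invented for illustration, not the datasets' actual source columns:

```python
import pandas as pd

# Hypothetical raw columns from one source machine; real vendor names
# differ per dataset and are stand-ins here.
raw = pd.DataFrame({
    "X1_ActualPosition": [0.10, 0.12],   # measured X-axis position
    "X1_CommandPosition": [0.10, 0.11],  # commanded X-axis position
    "M1_CURRENT_FEEDRATE": [3.0, 3.0],   # contextual machine metadata
})

# Map vendor names onto the FactoryNet prefixes, with the X axis
# becoming axis 0 under the strict zero-indexing rule.
rename_map = {
    "X1_ActualPosition": "feedback_pos_0",
    "X1_CommandPosition": "setpoint_pos_0",
    "M1_CURRENT_FEEDRATE": "ctx_feedrate",
}
unified = raw.rename(columns=rename_map)
print(unified.columns.tolist())
# ['feedback_pos_0', 'setpoint_pos_0', 'ctx_feedrate']
```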

---

## ⚙️ Configurations
This dataset is partitioned into highly compressed Parquet files and is available in two configurations:

1. **`raw`**: The original physical values (Amps, Volts, Radians, etc.) mapped directly into the new schema. Best for physics-informed neural networks or domain-specific thresholding.
2. **`normalized`**: All continuous physical variables have been independently standardized with a Z-score scaler (`StandardScaler`) fitted to each machine's domain. Best for immediate deep learning and foundation-model training.
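
The per-machine normalization can be sketched as follows; the frame, machine labels, and column are toy stand-ins, and the z-score computed here is what a separate `StandardScaler` fitted on each machine's rows would produce:

```python
import pandas as pd

# Toy frame with two machines; values and labels are illustrative only.
df = pd.DataFrame({
    "ctx_machine": ["cnc", "cnc", "ur3e", "ur3e"],
    "feedback_vel_0": [1.0, 3.0, 10.0, 30.0],
})

# Per-machine z-score: subtract the machine's mean, divide by its
# population standard deviation (ddof=0, matching StandardScaler).
def zscore(x):
    return (x - x.mean()) / x.std(ddof=0)

df["feedback_vel_0"] = df.groupby("ctx_machine")["feedback_vel_0"].transform(zscore)
print(df["feedback_vel_0"].tolist())  # [-1.0, 1.0, -1.0, 1.0]
```

Fitting one scaler per machine domain keeps each machine's signals on a comparable scale even though their raw units and magnitudes differ wildly.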

## 🚀 How to Use (Python)
Because this dataset is partitioned into Parquet files, you can load the multi-gigabyte repository without exhausting your local RAM.

```python
from datasets import load_dataset

# Load the mathematically normalized configuration for model training
dataset = load_dataset("karimm6/FactoryNet_Dataset", "normalized")

# Convert to a Pandas DataFrame and inspect the machine distribution
df = dataset['train'].to_pandas()
print(df['machine_type'].value_counts())
```