### Key Learning Signals
For every observable, the dataset provides: `ideal_expval_*`, `noisy_expval_*`, `error_*`, `sign_ideal_*`, `sign_noisy_*`. This enables complex regression and classification tasks for noise modeling.
## QSBench-Depolarizing: Quantum Error Prediction

**You don't need a PhD in quantum physics to use this dataset.** Think of it as a classic **predictive maintenance** or **regression** problem: we have a machine (the quantum computer) that makes errors, and we give you the blueprints of the tasks it ran together with the magnitude of the errors it made.

### The ML Mission: Supervised Regression

Your goal is to predict the `error` without running expensive physics simulations. Can you train a gradient-boosting model (XGBoost/LightGBM) or a neural network to predict the output error based purely on the circuit's structural features?
### Dataset Anatomy (Features & Targets)

Use the structural features as `X` and the errors as `y`.
| Group | Column Name | What is it for ML? |
| :--- | :--- | :--- |
| **Features (X)** | `depth`, `gate_entropy`, `cx_count` | The structural complexity of the task. |
| **Features (X)** | `noise_prob`, `shots` | The environmental conditions (error probability and sampling rate). |
| **Target (y)** | `error_Z_global` | **The main target.** The continuous error value you want to predict. |
| **Target (y)** | `sign_ideal_Z`, `sign_noisy_Z` | **For classification.** Did the noise flip the final answer? (Binary target.) |
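As a sketch of that split (column names taken from the table above; the tiny DataFrame below is synthetic stand-in data, not real shard contents):

```python
import pandas as pd

# Stand-in for one loaded shard, e.g.:
#   df = pd.read_parquet("data/shards/<some_shard>.parquet")
df = pd.DataFrame({
    "depth": [4, 8, 12],
    "gate_entropy": [0.7, 1.1, 1.4],
    "cx_count": [3, 9, 15],
    "noise_prob": [0.01, 0.02, 0.05],
    "shots": [1024, 1024, 4096],
    "error_Z_global": [0.02, 0.07, 0.21],
    "sign_ideal_Z": [1, 1, -1],
    "sign_noisy_Z": [1, -1, -1],
})

feature_cols = ["depth", "gate_entropy", "cx_count", "noise_prob", "shots"]
X = df[feature_cols]
y_reg = df["error_Z_global"]                        # continuous regression target
y_clf = (df["sign_ideal_Z"] != df["sign_noisy_Z"])  # binary: did noise flip the sign?

print(X.shape, y_reg.shape, int(y_clf.sum()))
```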
### Quick Start Idea

Build a robust XGBoost regressor using `total_gates`, `depth`, and `noise_prob` to predict `error_Z_global`. What is your Mean Absolute Error (MAE)?
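A minimal sketch of that idea, using scikit-learn's `GradientBoostingRegressor` as a stand-in for XGBoost (the workflow is the same; swap in `xgboost.XGBRegressor` if you have it installed) and synthetic data in place of real shards, since no data file is bundled with this snippet:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-in for the real shards: an assumed trend where error
# grows with depth, gate count, and noise probability.
total_gates = rng.integers(5, 200, n)
depth = rng.integers(2, 50, n)
noise_prob = rng.uniform(0.001, 0.05, n)
error_Z_global = depth * noise_prob + 0.001 * total_gates + rng.normal(0, 0.01, n)

X = np.column_stack([total_gates, depth, noise_prob])
X_tr, X_te, y_tr, y_te = train_test_split(X, error_Z_global, random_state=0)

model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
mae = mean_absolute_error(y_te, model.predict(X_te))
print(f"MAE: {mae:.4f}")
```

On the real data, replace the synthetic arrays with the corresponding DataFrame columns.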
### Load the Dataset
The dataset is stored in Parquet format inside the `data/shards/` folder. You can load it directly using the Hugging Face `datasets` library: