Dataset Preview
The full dataset viewer is not available; only a preview of the rows is shown below.
The dataset generation failed
Error code:   DatasetGenerationError
Exception:    TypeError
Message:      Couldn't cast array of type
struct<architecture: string, d_in: int64, d_sae: int64, activation_fn_str: string, apply_b_dec_to_input: bool, finetuning_scaling_factor: bool, context_size: int64, model_name: string, hook_name: string, hook_layer: int64, hook_head_index: null, prepend_bos: bool, dataset_path: string, dataset_trust_remote_code: bool, normalize_activations: null, dtype: string, device: string, sae_lens_training_version: null, activation_fn_kwargs: struct<>, neuronpedia_id: string, model_from_pretrained_kwargs: struct<>, seqpos_slice: list<item: null>>
to
{'model_name': Value('string'), 'd_in': Value('int64'), 'd_sae': Value('int64'), 'hook_layer': Value('int64'), 'hook_name': Value('string'), 'is_transcoder': Value('bool'), 'input_hook_name': Value('string'), 'output_hook_name': Value('string'), 'context_size': Value('null'), 'hook_head_index': Value('null'), 'architecture': Value('string'), 'apply_b_dec_to_input': Value('null'), 'finetuning_scaling_factor': Value('null'), 'activation_fn_str': Value('string'), 'prepend_bos': Value('bool'), 'normalize_activations': Value('string'), 'dtype': Value('string'), 'device': Value('string'), 'dataset_path': Value('string'), 'dataset_trust_remote_code': Value('bool'), 'seqpos_slice': List(Value('null')), 'training_tokens': Value('int64'), 'sae_lens_training_version': Value('null'), 'neuronpedia_id': Value('null')}
Traceback:    Traceback (most recent call last):
                File "/src/services/worker/.venv/lib/python3.12/site-packages/datasets/builder.py", line 1831, in _prepare_split_single
                  writer.write_table(table)
                File "/src/services/worker/.venv/lib/python3.12/site-packages/datasets/arrow_writer.py", line 714, in write_table
                  pa_table = table_cast(pa_table, self._schema)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/src/services/worker/.venv/lib/python3.12/site-packages/datasets/table.py", line 2272, in table_cast
                  return cast_table_to_schema(table, schema)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/src/services/worker/.venv/lib/python3.12/site-packages/datasets/table.py", line 2224, in cast_table_to_schema
                  cast_array_to_feature(
                File "/src/services/worker/.venv/lib/python3.12/site-packages/datasets/table.py", line 1795, in wrapper
                  return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/src/services/worker/.venv/lib/python3.12/site-packages/datasets/table.py", line 2092, in cast_array_to_feature
                  raise TypeError(f"Couldn't cast array of type\n{_short_str(array.type)}\nto\n{_short_str(feature)}")
              TypeError: Couldn't cast array of type
              struct<architecture: string, d_in: int64, d_sae: int64, activation_fn_str: string, apply_b_dec_to_input: bool, finetuning_scaling_factor: bool, context_size: int64, model_name: string, hook_name: string, hook_layer: int64, hook_head_index: null, prepend_bos: bool, dataset_path: string, dataset_trust_remote_code: bool, normalize_activations: null, dtype: string, device: string, sae_lens_training_version: null, activation_fn_kwargs: struct<>, neuronpedia_id: string, model_from_pretrained_kwargs: struct<>, seqpos_slice: list<item: null>>
              to
              {'model_name': Value('string'), 'd_in': Value('int64'), 'd_sae': Value('int64'), 'hook_layer': Value('int64'), 'hook_name': Value('string'), 'is_transcoder': Value('bool'), 'input_hook_name': Value('string'), 'output_hook_name': Value('string'), 'context_size': Value('null'), 'hook_head_index': Value('null'), 'architecture': Value('string'), 'apply_b_dec_to_input': Value('null'), 'finetuning_scaling_factor': Value('null'), 'activation_fn_str': Value('string'), 'prepend_bos': Value('bool'), 'normalize_activations': Value('string'), 'dtype': Value('string'), 'device': Value('string'), 'dataset_path': Value('string'), 'dataset_trust_remote_code': Value('bool'), 'seqpos_slice': List(Value('null')), 'training_tokens': Value('int64'), 'sae_lens_training_version': Value('null'), 'neuronpedia_id': Value('null')}
              
              The above exception was the direct cause of the following exception:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1450, in compute_config_parquet_and_info_response
                  parquet_operations, partial, estimated_dataset_info = stream_convert_to_parquet(
                                                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 993, in stream_convert_to_parquet
                  builder._prepare_split(
                File "/src/services/worker/.venv/lib/python3.12/site-packages/datasets/builder.py", line 1702, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/src/services/worker/.venv/lib/python3.12/site-packages/datasets/builder.py", line 1858, in _prepare_split_single
                  raise DatasetGenerationError("An error occurred while generating the dataset") from e
              datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
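The cast fails because the struct column's field set differs between the schema inferred from this shard and the target features the viewer tried to cast to. A minimal sketch (plain Python, field names copied from the error above) that diffs the two field sets to show exactly which fields are missing on each side:

```python
# Field names of the source struct from the error above (the shard's
# inferred schema).
source_fields = {
    "architecture", "d_in", "d_sae", "activation_fn_str",
    "apply_b_dec_to_input", "finetuning_scaling_factor", "context_size",
    "model_name", "hook_name", "hook_layer", "hook_head_index",
    "prepend_bos", "dataset_path", "dataset_trust_remote_code",
    "normalize_activations", "dtype", "device",
    "sae_lens_training_version", "activation_fn_kwargs",
    "neuronpedia_id", "model_from_pretrained_kwargs", "seqpos_slice",
}
# Field names of the target features the cast was attempted against.
target_fields = {
    "model_name", "d_in", "d_sae", "hook_layer", "hook_name",
    "is_transcoder", "input_hook_name", "output_hook_name",
    "context_size", "hook_head_index", "architecture",
    "apply_b_dec_to_input", "finetuning_scaling_factor",
    "activation_fn_str", "prepend_bos", "normalize_activations",
    "dtype", "device", "dataset_path", "dataset_trust_remote_code",
    "seqpos_slice", "training_tokens", "sae_lens_training_version",
    "neuronpedia_id",
}

# Fields the target schema expects but the shard lacks:
print(sorted(target_fields - source_fields))
# → ['input_hook_name', 'is_transcoder', 'output_hook_name', 'training_tokens']

# Fields present in the shard but absent from the target schema:
print(sorted(source_fields - target_fields))
# → ['activation_fn_kwargs', 'model_from_pretrained_kwargs']
```

In other words, some files in the repository store `sae_cfg_dict` with one set of keys and others with a different set, so Arrow cannot cast them to a single schema during parquet conversion.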


Columns (name: type):
  eval_type_id: string
  eval_config: dict
  eval_id: string
  datetime_epoch_millis: int64
  eval_result_metrics: dict
  eval_result_details: list
  sae_bench_commit_hash: string
  sae_lens_id: string
  sae_lens_release_id: string
  sae_lens_version: string
  sae_cfg_dict: dict
  eval_result_unstructured: null
Row 1:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 09b2c7a0-d185-469e-bc5d-4bf8802635a1
  datetime_epoch_millis: 1746065695233
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.25572986211793886, "mean_full_absorption_score": 0.2562014792971821, "mean_num_split_features": 1.1538461538461537, "std_dev_absorption_fraction_score": 0.14385749153062752, "std_dev_full_absorption_score": 0.13719408933194543, "std_dev_num_split...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.34189145094829215, "full_absorption_rate": 0.23590333716915995, "num_full_absorption": 615, "num_probe_true_positives": 2607, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.1265611674747012, ...
  sae_bench_commit_hash: 21364d77bbbc5c28c5b84ec554481897bb07ed4c
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_0
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 0, "hook_name": "blocks.0.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.0.ln2.hook_normalized", "output_hook_name": "blocks.0.hook_mlp_out", "context_size": null, "hook_head_index": null, "architect...
  eval_result_unstructured: null

Row 2:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 9f2524e1-229f-42e6-94b9-f161828f5fc7
  datetime_epoch_millis: 1746066158194
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0, "mean_full_absorption_score": 0.007794924029876477, "mean_num_split_features": 1, "std_dev_absorption_fraction_score": 0, "std_dev_full_absorption_score": 0.012674181262470488, "std_dev_num_split_features": 0 } }
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.0063813813813813815, "num_full_absorption": 17, "num_probe_true_positives": 2664, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0.0006406...
  sae_bench_commit_hash: e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_10
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 10, "hook_name": "blocks.10.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.10.ln2.hook_normalized", "output_hook_name": "blocks.10.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null

Row 3:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 65725639-5c71-4cec-ad2c-3bd357716462
  datetime_epoch_millis: 1746067108691
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0, "mean_full_absorption_score": 0.03505371828368144, "mean_num_split_features": 1.1923076923076923, "std_dev_absorption_fraction_score": 0, "std_dev_full_absorption_score": 0.06737300689952094, "std_dev_num_split_features": 0.4914656259988704 } ...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.015337423312883436, "num_full_absorption": 40, "num_probe_true_positives": 2608, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0.00722456...
  sae_bench_commit_hash: e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_11
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 11, "hook_name": "blocks.11.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.11.ln2.hook_normalized", "output_hook_name": "blocks.11.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null

Row 4:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 1ffc0f67-c8d7-439d-b1c2-11f86bf90451
  datetime_epoch_millis: 1746065171793
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0, "mean_full_absorption_score": 0.010457973883693827, "mean_num_split_features": 1.2692307692307692, "std_dev_absorption_fraction_score": 0, "std_dev_full_absorption_score": 0.01488211860166244, "std_dev_num_split_features": 0.533493565673837 } ...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.0011295180722891566, "num_full_absorption": 3, "num_probe_true_positives": 2656, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0.00243902...
  sae_bench_commit_hash: 21364d77bbbc5c28c5b84ec554481897bb07ed4c
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_12
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.12.ln2.hook_normalized", "output_hook_name": "blocks.12.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null

Row 5:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 8e57fe8c-a41b-48b9-88d5-d41bbf8e807c
  datetime_epoch_millis: 1746066236092
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.0000746825989544436, "mean_full_absorption_score": 0.02504067735879885, "mean_num_split_features": 1.1538461538461537, "std_dev_absorption_fraction_score": 0.00038080802939453206, "std_dev_full_absorption_score": 0.058033308159135835, "std_dev_nu...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.006784771956275914, "num_full_absorption": 18, "num_probe_true_positives": 2653, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0.00188205...
  sae_bench_commit_hash: e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_13
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 13, "hook_name": "blocks.13.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.13.ln2.hook_normalized", "output_hook_name": "blocks.13.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null
Row 6:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: ff8813b8-9890-43da-bf3b-bae9b218b90f
  datetime_epoch_millis: 1746067464960
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.009259369507170009, "mean_full_absorption_score": 0.021173226463979303, "mean_num_split_features": 1, "std_dev_absorption_fraction_score": 0.025326944111465348, "std_dev_full_absorption_score": 0.026049484884275033, "std_dev_num_split_features": ...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.04871320699759635, "full_absorption_rate": 0.05100039231071008, "num_full_absorption": 130, "num_probe_true_positives": 2549, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption...
  sae_bench_commit_hash: e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_14
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 14, "hook_name": "blocks.14.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.14.ln2.hook_normalized", "output_hook_name": "blocks.14.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null

Row 7:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 7110633c-12d5-447f-a2db-702a0f0f1218
  datetime_epoch_millis: 1746065512141
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.05137673751203079, "mean_full_absorption_score": 0.05375001966779903, "mean_num_split_features": 1.1923076923076923, "std_dev_absorption_fraction_score": 0.17121124751076017, "std_dev_full_absorption_score": 0.09924068231759409, "std_dev_num_spli...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.006047808181894802, "full_absorption_rate": 0.018304071722076952, "num_full_absorption": 49, "num_probe_true_positives": 2677, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorptio...
  sae_bench_commit_hash: 21364d77bbbc5c28c5b84ec554481897bb07ed4c
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_15
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 15, "hook_name": "blocks.15.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.15.ln2.hook_normalized", "output_hook_name": "blocks.15.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null

Row 8:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: cedb354d-8bdf-42cf-8c35-1eb4aee75d31
  datetime_epoch_millis: 1746066733522
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.020319784078237455, "mean_full_absorption_score": 0.043955573795724255, "mean_num_split_features": 1.1923076923076923, "std_dev_absorption_fraction_score": 0.07013801571020624, "std_dev_full_absorption_score": 0.051440204551611045, "std_dev_num_s...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.04081632653061224, "num_full_absorption": 106, "num_probe_true_positives": 2597, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.0006600660066006601, "full_absorpti...
  sae_bench_commit_hash: e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_16
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 16, "hook_name": "blocks.16.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.16.ln2.hook_normalized", "output_hook_name": "blocks.16.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null

Row 9:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 354fa2e5-32a6-43da-8c0e-a1a618557b02
  datetime_epoch_millis: 1746068008973
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.049789396911220814, "mean_full_absorption_score": 0.07659270199152735, "mean_num_split_features": 1.4230769230769231, "std_dev_absorption_fraction_score": 0.17383489494279483, "std_dev_full_absorption_score": 0.16711855595130418, "std_dev_num_spl...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.12018135181039993, "full_absorption_rate": 0.19240986717267552, "num_full_absorption": 507, "num_probe_true_positives": 2635, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption...
  sae_bench_commit_hash: e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_17
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 17, "hook_name": "blocks.17.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.17.ln2.hook_normalized", "output_hook_name": "blocks.17.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null

Row 10:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 975c57b0-8bec-4d73-9ecb-67d0ef23c9ce
  datetime_epoch_millis: 1746065553668
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.0073505522561306755, "mean_full_absorption_score": 0.06242861828990722, "mean_num_split_features": 1.5, "std_dev_absorption_fraction_score": 0.02876412790969379, "std_dev_full_absorption_score": 0.048030904564926626, "std_dev_num_split_features":...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.09433244579688094, "num_full_absorption": 248, "num_probe_true_positives": 2629, "num_split_features": 4 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0.06349206...
  sae_bench_commit_hash: 21364d77bbbc5c28c5b84ec554481897bb07ed4c
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_18
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 18, "hook_name": "blocks.18.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.18.ln2.hook_normalized", "output_hook_name": "blocks.18.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null
Row 11:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 58afefaa-5b36-4e8d-84fe-bdec750ae3e6
  datetime_epoch_millis: 1746066829266
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.056738743123709016, "mean_full_absorption_score": 0.07578111939996313, "mean_num_split_features": 1.6923076923076923, "std_dev_absorption_fraction_score": 0.1912298805813828, "std_dev_full_absorption_score": 0.16022511008025797, "std_dev_num_spli...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.03693728357060408, "num_full_absorption": 96, "num_probe_true_positives": 2599, "num_split_features": 3 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0.034810126...
  sae_bench_commit_hash: e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_19
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 19, "hook_name": "blocks.19.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.19.ln2.hook_normalized", "output_hook_name": "blocks.19.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null

Row 12:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 3f31700b-7dfb-440b-a321-e1bd6ce9b040
  datetime_epoch_millis: 1746066671600
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.05317635824352028, "mean_full_absorption_score": 0.04422196717544969, "mean_num_split_features": 1.1538461538461537, "std_dev_absorption_fraction_score": 0.10792119534332006, "std_dev_full_absorption_score": 0.06595737054282579, "std_dev_num_spli...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.0026626093571700264, "num_full_absorption": 7, "num_probe_true_positives": 2629, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0.02385436...
  sae_bench_commit_hash: e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_1
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 1, "hook_name": "blocks.1.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.1.ln2.hook_normalized", "output_hook_name": "blocks.1.hook_mlp_out", "context_size": null, "hook_head_index": null, "architect...
  eval_result_unstructured: null

Row 13:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: fd343edf-5330-47b4-8251-4a688f07f40a
  datetime_epoch_millis: 1746068103361
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.013412312681415498, "mean_full_absorption_score": 0.03827235531782512, "mean_num_split_features": 1.8461538461538463, "std_dev_absorption_fraction_score": 0.042859934108605606, "std_dev_full_absorption_score": 0.023218790311294472, "std_dev_num_s...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.048368953880764905, "num_full_absorption": 129, "num_probe_true_positives": 2667, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0.0183776...
  sae_bench_commit_hash: e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_20
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 20, "hook_name": "blocks.20.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.20.ln2.hook_normalized", "output_hook_name": "blocks.20.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null

Row 14:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 1800e5f9-cf3f-4fdd-9dbf-752da24f2355
  datetime_epoch_millis: 1746065606317
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.0057769968750910235, "mean_full_absorption_score": 0.055624122003544794, "mean_num_split_features": 1.9615384615384615, "std_dev_absorption_fraction_score": 0.014614507555562912, "std_dev_full_absorption_score": 0.027833585722238766, "std_dev_num...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.06856060606060606, "num_full_absorption": 181, "num_probe_true_positives": 2640, "num_split_features": 3 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0.03650094...
  sae_bench_commit_hash: 21364d77bbbc5c28c5b84ec554481897bb07ed4c
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_21
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 21, "hook_name": "blocks.21.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.21.ln2.hook_normalized", "output_hook_name": "blocks.21.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null

Row 15:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 649a52a5-c842-4e71-80f9-9b965b320abe
  datetime_epoch_millis: 1746066942339
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.0009887738887391561, "mean_full_absorption_score": 0.10316414812698411, "mean_num_split_features": 1.6153846153846154, "std_dev_absorption_fraction_score": 0.001854029856933167, "std_dev_full_absorption_score": 0.06417304002163213, "std_dev_num_s...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.0073077284353189, "full_absorption_rate": 0.26918757019842754, "num_full_absorption": 719, "num_probe_true_positives": 2671, "num_split_features": 4 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_...
  sae_bench_commit_hash: e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_22
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 22, "hook_name": "blocks.22.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.22.ln2.hook_normalized", "output_hook_name": "blocks.22.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null
Row 16:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 55c379e4-0ab9-476c-bfe8-eddc39915508
  datetime_epoch_millis: 1746068436141
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.008602726617980108, "mean_full_absorption_score": 0.2130965192130354, "mean_num_split_features": 1.6153846153846154, "std_dev_absorption_fraction_score": 0.01900294806070679, "std_dev_full_absorption_score": 0.1380113263045151, "std_dev_num_split...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.3097412480974125, "num_full_absorption": 814, "num_probe_true_positives": 2628, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.03653250773993808, "full_absorption_...
  sae_bench_commit_hash: e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_23
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 23, "hook_name": "blocks.23.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.23.ln2.hook_normalized", "output_hook_name": "blocks.23.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null

Row 17:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 547c1a43-32d4-4f24-975a-25449ef542b4
  datetime_epoch_millis: 1746110852977
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.006876101936212297, "mean_full_absorption_score": 0.3244397048626903, "mean_num_split_features": 1.6923076923076923, "std_dev_absorption_fraction_score": 0.015451636681990502, "std_dev_full_absorption_score": 0.2711199946910568, "std_dev_num_spli...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.15171650055370986, "num_full_absorption": 411, "num_probe_true_positives": 2709, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.0012254901960784314, "full_absorpti...
  sae_bench_commit_hash: 38aad45bfffabfa690b49eb8227bef04c594ab56
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_24
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 24, "hook_name": "blocks.24.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.24.ln2.hook_normalized", "output_hook_name": "blocks.24.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null

Row 18:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: d913901c-ffdf-408d-806d-f53c5a891c26
  datetime_epoch_millis: 1746106057000
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.03231435416780042, "mean_full_absorption_score": 0.5786868767647252, "mean_num_split_features": 1.3846153846153846, "std_dev_absorption_fraction_score": 0.0883412531662851, "std_dev_full_absorption_score": 0.29333325534416066, "std_dev_num_split_...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.1494386372435153, "num_full_absorption": 386, "num_probe_true_positives": 2583, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.0006218905472636816, "full_absorptio...
  sae_bench_commit_hash: a63ca26185de2c05d00bcb739b6c17c82a0dbff7
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_25
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 25, "hook_name": "blocks.25.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.25.ln2.hook_normalized", "output_hook_name": "blocks.25.hook_mlp_out", "context_size": null, "hook_head_index": null, "archi...
  eval_result_unstructured: null

Row 19:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 95548cf4-94f7-47aa-9518-8b406e289ff7
  datetime_epoch_millis: 1746067653684
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.04861870765256725, "mean_full_absorption_score": 0.02131302734589898, "mean_num_split_features": 1.0769230769230769, "std_dev_absorption_fraction_score": 0.10916548580973126, "std_dev_full_absorption_score": 0.03770867258101237, "std_dev_num_spli...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.03205847631759462, "full_absorption_rate": 0.010824934677118328, "num_full_absorption": 29, "num_probe_true_positives": 2679, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption...
  sae_bench_commit_hash: e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_2
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 2, "hook_name": "blocks.2.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.2.ln2.hook_normalized", "output_hook_name": "blocks.2.hook_mlp_out", "context_size": null, "hook_head_index": null, "architect...
  eval_result_unstructured: null

Row 20:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 976be5d8-14a8-4b16-9fcc-4d41a586eb48
  datetime_epoch_millis: 1746065142337
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.0023626426415846104, "mean_full_absorption_score": 0.016670118213491337, "mean_num_split_features": 1.3846153846153846, "std_dev_absorption_fraction_score": 0.005433316216893815, "std_dev_full_absorption_score": 0.017310148487458526, "std_dev_num...
  eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.010332950631458095, "full_absorption_rate": 0.02104860313815538, "num_full_absorption": 55, "num_probe_true_positives": 2613, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption...
  sae_bench_commit_hash: 21364d77bbbc5c28c5b84ec554481897bb07ed4c
  sae_lens_id: custom_sae
  sae_lens_release_id: gemma-2-2b_gemma_scope_transcoder_layer_3
  sae_lens_version: 5.9.1
  sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 3, "hook_name": "blocks.3.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.3.ln2.hook_normalized", "output_hook_name": "blocks.3.hook_mlp_out", "context_size": null, "hook_head_index": null, "architect...
  eval_result_unstructured: null

Row 21:
  eval_type_id: absorption_first_letter
  eval_config: { "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
  eval_id: 9868442a-aaf2-48a6-bea3-4a396d79ba5b
  datetime_epoch_millis: 1746066136314
  eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.018798698628177072, "mean_full_absorption_score": 0.008621089857011124, "mean_num_split_features": 1.3076923076923077, "std_dev_absorption_fraction_score": 0.03082976833711388, "std_dev_full_absorption_score": 0.008434071400675711, "std_dev_num_s...
[ { "first_letter": "a", "mean_absorption_fraction": 0.1153270678123997, "full_absorption_rate": 0.023927079377136347, "num_full_absorption": 63, "num_probe_true_positives": 2633, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_...
e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
custom_sae
gemma-2-2b_gemma_scope_transcoder_layer_4
5.9.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 4, "hook_name": "blocks.4.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.4.ln2.hook_normalized", "output_hook_name": "blocks.4.hook_mlp_out", "context_size": null, "hook_head_index": null, "architect...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
a122ff61-a8af-48e5-b02b-7d01f75812d7
1,746,067,276,542
{ "mean": { "mean_absorption_fraction_score": 0.23897883572973036, "mean_full_absorption_score": 0.08521529318441, "mean_num_split_features": 1.1923076923076923, "std_dev_absorption_fraction_score": 0.27226946733781404, "std_dev_full_absorption_score": 0.0844272485944821, "std_dev_num_split_fe...
[ { "first_letter": "a", "mean_absorption_fraction": 0.0003912363067292645, "full_absorption_rate": 0.0019561815336463224, "num_full_absorption": 5, "num_probe_true_positives": 2556, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.38583092190610563, ...
e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
custom_sae
gemma-2-2b_gemma_scope_transcoder_layer_5
5.9.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 5, "hook_name": "blocks.5.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.5.ln2.hook_normalized", "output_hook_name": "blocks.5.hook_mlp_out", "context_size": null, "hook_head_index": null, "architect...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
5f38aaa8-fb19-449b-a86f-3c0149bec894
1,746,065,290,154
{ "mean": { "mean_absorption_fraction_score": 0.00014105059876284658, "mean_full_absorption_score": 0.015998867691655233, "mean_num_split_features": 1.2307692307692308, "std_dev_absorption_fraction_score": 0.0004700524410912506, "std_dev_full_absorption_score": 0.03431134861636546, "std_dev_nu...
[ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.006925740669488265, "num_full_absorption": 18, "num_probe_true_positives": 2599, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0.00183936...
21364d77bbbc5c28c5b84ec554481897bb07ed4c
custom_sae
gemma-2-2b_gemma_scope_transcoder_layer_6
5.9.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 6, "hook_name": "blocks.6.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.6.ln2.hook_normalized", "output_hook_name": "blocks.6.hook_mlp_out", "context_size": null, "hook_head_index": null, "architect...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
97ef61f1-645c-498a-bda5-b69a94c3adf5
1,746,066,394,997
{ "mean": { "mean_absorption_fraction_score": 0.0005686875190793884, "mean_full_absorption_score": 0.017808889412253063, "mean_num_split_features": 1, "std_dev_absorption_fraction_score": 0.0011112231670793514, "std_dev_full_absorption_score": 0.029792101721446532, "std_dev_num_split_features"...
[ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.007392996108949416, "num_full_absorption": 19, "num_probe_true_positives": 2570, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0, "nu...
e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
custom_sae
gemma-2-2b_gemma_scope_transcoder_layer_7
5.9.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 7, "hook_name": "blocks.7.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.7.ln2.hook_normalized", "output_hook_name": "blocks.7.hook_mlp_out", "context_size": null, "hook_head_index": null, "architect...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
dc4d2962-606d-4e5d-86a4-2382ae8cfa3d
1,746,067,441,486
{ "mean": { "mean_absorption_fraction_score": 0, "mean_full_absorption_score": 0.007736365836954568, "mean_num_split_features": 1.1538461538461537, "std_dev_absorption_fraction_score": 0, "std_dev_full_absorption_score": 0.004909810203060813, "std_dev_num_split_features": 0.36794648440311994 ...
[ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.003423354887790034, "num_full_absorption": 9, "num_probe_true_positives": 2629, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0.003893575...
e3d1c8521c23f1d1c762f4c53ba21a789b72c66b
custom_sae
gemma-2-2b_gemma_scope_transcoder_layer_8
5.9.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 8, "hook_name": "blocks.8.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.8.ln2.hook_normalized", "output_hook_name": "blocks.8.hook_mlp_out", "context_size": null, "hook_head_index": null, "architect...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
f0035da8-6f39-4caa-a7fc-1acbe4c8de1c
1,746,065,183,748
{ "mean": { "mean_absorption_fraction_score": 0, "mean_full_absorption_score": 0.04002150487262049, "mean_num_split_features": 1.3846153846153846, "std_dev_absorption_fraction_score": 0, "std_dev_full_absorption_score": 0.08759441278320237, "std_dev_num_split_features": 0.5710988059467872 } ...
[ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.001877581674802854, "num_full_absorption": 5, "num_probe_true_positives": 2663, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0.004388714...
21364d77bbbc5c28c5b84ec554481897bb07ed4c
custom_sae
gemma-2-2b_gemma_scope_transcoder_layer_9
5.9.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 9, "hook_name": "blocks.9.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.9.ln2.hook_normalized", "output_hook_name": "blocks.9.hook_mlp_out", "context_size": null, "hook_head_index": null, "architect...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
bf5105d2-375d-41e5-95a9-d728bde0dffe
1,745,853,295,554
{ "mean": { "mean_absorption_fraction_score": 0.00038381545558513393, "mean_full_absorption_score": 0.003947541171647063, "mean_num_split_features": 1.4230769230769231, "std_dev_absorption_fraction_score": 0.0015467170605716581, "std_dev_full_absorption_score": 0.0033400271853359334, "std_dev_...
[ { "first_letter": "a", "mean_absorption_fraction": 0.007882075025248524, "full_absorption_rate": 0.009829867674858222, "num_full_absorption": 26, "num_probe_true_positives": 2645, "num_split_features": 3 }, { "first_letter": "b", "mean_absorption_fraction": 0.0006357279084551812,...
21c67da73215cdad5893af8d2c141ba11a579d3a
custom_sae
gemma-2-2b_layer_0_k10_normalized_sparse_mlp_neuron_sae
5.9.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 9216, "hook_layer": 0, "hook_name": "blocks.0.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.0.ln2.hook_normalized", "output_hook_name": "blocks.0.hook_mlp_out", "context_size": null, "hook_head_index": null, "architectu...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
bf5105d2-375d-41e5-95a9-d728bde0dffe
1,745,850,571,415
{ "mean": { "mean_absorption_fraction_score": 0.000398276522719643, "mean_full_absorption_score": 0.003983889763479281, "mean_num_split_features": 1.3846153846153846, "std_dev_absorption_fraction_score": 0.0016196638991941765, "std_dev_full_absorption_score": 0.003752171704905972, "std_dev_num...
[ { "first_letter": "a", "mean_absorption_fraction": 0.00825806277074576, "full_absorption_rate": 0.010964083175803403, "num_full_absorption": 29, "num_probe_true_positives": 2645, "num_split_features": 3 }, { "first_letter": "b", "mean_absorption_fraction": 0.0006357279084551812, ...
21c67da73215cdad5893af8d2c141ba11a579d3a
custom_sae
gemma-2-2b_layer_0_k10_sparse_mlp_neuron_sae
5.9.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 9216, "hook_layer": 0, "hook_name": "blocks.0.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.0.ln2.hook_normalized", "output_hook_name": "blocks.0.hook_mlp_out", "context_size": null, "hook_head_index": null, "architectu...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
bf5105d2-375d-41e5-95a9-d728bde0dffe
1,745,845,210,186
{ "mean": { "mean_absorption_fraction_score": 0.0007383175610108474, "mean_full_absorption_score": 0.0007699153645995467, "mean_num_split_features": 1.3846153846153846, "std_dev_absorption_fraction_score": 0.0018839309845189401, "std_dev_full_absorption_score": 0.0010871191111761676, "std_dev_...
[ { "first_letter": "a", "mean_absorption_fraction": 0.00888176847270442, "full_absorption_rate": 0.004158790170132325, "num_full_absorption": 11, "num_probe_true_positives": 2645, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.0016412309586690775, ...
21c67da73215cdad5893af8d2c141ba11a579d3a
custom_sae
gemma-2-2b_layer_0_mlp_neuron_sae
5.9.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 9216, "hook_layer": 0, "hook_name": "blocks.0.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.0.ln2.hook_normalized", "output_hook_name": "blocks.0.hook_mlp_out", "context_size": null, "hook_head_index": null, "architectu...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
bf5105d2-375d-41e5-95a9-d728bde0dffe
1,745,847,834,534
{ "mean": { "mean_absorption_fraction_score": 0.0005943980991488938, "mean_full_absorption_score": 0.000949496989201878, "mean_num_split_features": 1.3076923076923077, "std_dev_absorption_fraction_score": 0.001703234060808008, "std_dev_full_absorption_score": 0.0014202824704433025, "std_dev_nu...
[ { "first_letter": "a", "mean_absorption_fraction": 0.008517523181203733, "full_absorption_rate": 0.005293005671077505, "num_full_absorption": 14, "num_probe_true_positives": 2645, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.0010542068720967213,...
21c67da73215cdad5893af8d2c141ba11a579d3a
custom_sae
gemma-2-2b_layer_0_normalized_mlp_neuron_sae
5.9.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 9216, "hook_layer": 0, "hook_name": "blocks.0.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.0.ln2.hook_normalized", "output_hook_name": "blocks.0.hook_mlp_out", "context_size": null, "hook_head_index": null, "architectu...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 68, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
849f0c7e-dd14-4570-b405-436635980276
1,746,011,568,774
{ "mean": { "mean_absorption_fraction_score": 0, "mean_full_absorption_score": 0.0007496566858797136, "mean_num_split_features": 1, "std_dev_absorption_fraction_score": 0, "std_dev_full_absorption_score": 0.002734515139415748, "std_dev_num_split_features": 0 } }
[ { "first_letter": "a", "mean_absorption_fraction": 0, "full_absorption_rate": 0.00037537537537537537, "num_full_absorption": 1, "num_probe_true_positives": 2664, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0, "n...
78124820669cbb9a645a6ac1637e70a8991bdf8d
custom_sae
gemma-2-2b_layer_10_k10_normalized_sparse_mlp_neuron_sae
5.9.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 9216, "hook_layer": 10, "hook_name": "blocks.10.ln2.hook_normalized", "is_transcoder": true, "input_hook_name": "blocks.10.ln2.hook_normalized", "output_hook_name": "blocks.10.hook_mlp_out", "context_size": null, "hook_head_index": null, "archit...
null
End of preview.
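Each row above carries its aggregate scores as a JSON object in the summary column (the `{ "mean": { ... } }` values). A minimal sketch, in plain Python, of pulling those aggregates out of one row — the string below is the one complete summary shown above for the layer-0 normalized MLP-neuron SAE; the field names come directly from the rows:

```python
import json

# One complete summary record, copied verbatim from the preview rows above.
row = (
    '{ "mean": { "mean_absorption_fraction_score": 0, '
    '"mean_full_absorption_score": 0.0007496566858797136, '
    '"mean_num_split_features": 1, '
    '"std_dev_absorption_fraction_score": 0, '
    '"std_dev_full_absorption_score": 0.002734515139415748, '
    '"std_dev_num_split_features": 0 } }'
)

# Parse the JSON and extract the aggregate scores.
means = json.loads(row)["mean"]
print(means["mean_full_absorption_score"])
print(means["mean_num_split_features"])
```

Note that the per-letter results column (the `[ { "first_letter": "a", ... } ]` arrays) is a JSON list of per-letter dicts and can be parsed the same way once the full, untruncated rows are downloaded.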