Arthur Conmy (GDM Account)
ArthurConmyGDM

AI & ML interests
Interpretability, AI Safety, AI Alignment
Recent Activity
new activity 23 days ago in google/gemma-scope-2b-pt-res: gemma-2-2b layer 20 SAE width 65k SAE seems very off
updated a model about 2 months ago: google/gemma-scope-2b-pt-res
ArthurConmyGDM's activity
gemma-2-2b layer 20 SAE width 65k SAE seems very off (1)
#8 opened 24 days ago by charlieoneill

Removing SAEs with LR != 7e-5 (5)
#7 opened 3 months ago by Aric

Layer 13 saes raising "zipfile.BadZipFile: File is not a zip file" (5)
#5 opened 6 months ago by MrGonao

suggestion: notate the canonical SAEs (3)
#9 opened 5 months ago by dribnet
add experimental embedding SAEs
#4 opened 6 months ago by Aric

add experimental embedding SAEs
#7 opened 7 months ago by Aric

New table (1)
#8 opened 7 months ago by ArthurConmyGDM
Link model to paper
#5 opened 7 months ago by nielsr

Link dataset to paper
#7 opened 7 months ago by nielsr

Link model to paper
#6 opened 7 months ago by nielsr

The L0 of the SAE does not quite match (1)
#3 opened 7 months ago by ShayanShamsi
Update README.md
#5 opened 7 months ago by NeelNanda2

Update README.md
#4 opened 7 months ago by NeelNanda2

Uploaded demo GIF
#3 opened 7 months ago by NeelNanda2

Update README.md
#4 opened 7 months ago by ArthurConmy

Update README.md
#2 opened 7 months ago by ArthurConmy

Update README.md
#3 opened 8 months ago by ArthurConmyGDM
Delete layer_11/width_16k/average_l0_79 (1)
#2 opened 8 months ago by ArthurConmyGDM