Structured State Space Models with Locality-Sensitive Hashing for High Energy Physics
Accepted at AISTATS 2025
This repository implements state-of-the-art linear-complexity architectures for particle tracking. Models are trained with `python3 tracking_trainer.py -m <model>`, where `<model>` is one of the model names listed below:
| Model Name | Architecture | Reference/Notes |
|---|---|---|
| dgcnn, gravnet | DGCNN/GravNet | Dynamic Graph CNN (arXiv:1801.07829) |
| rwkv, rwkv7 | RWKV/RWKV7 | RWKV (arXiv:2305.13048) |
| hept | HEPT | LSH-Based Efficient Point Transformer (arXiv:2402.12535) |
| flatformer | FlatFormer | Flattened Window Attention (arXiv:2301.08739) |
| fullmamba2 | Mamba | Mamba (arXiv:2312.00752) |
| gatedelta | Gated DeltaNet | Gated DeltaNet (arXiv:2412.06464) |
| hmambav1 | Mamba-b (E2LSH variant) | Uses E2LSH partitioning before Mamba (sketched below the table) |
| hmambav2 | Mamba-b (LSH embedding) | Uses LSH embeddings before Mamba |
| lshgd | Mamba-b (LSH + Linear RNN) | LSH within linear RNN |
| fullhybrid2 | Mamba-a (HEPT/Mamba mix) | Partial hybrid layer balancing LSH-based Mamba variants (arXiv:2501.16237) |
| hydra | Mamba-a (Hybrid Hydra) | Quasi-separable mixer hybrid layer |
| fullfullhybrid2 | Enhanced Hybrid SSM | Full hybrid configuration |
| pemamba2 | Mamba2 + Sliding Window | Mamba2 fused with flattened/sliding-window sorting and grouping |
| gdlocal1 | Local Aggregation SSM | Experimental local aggregation layer with SSM |
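The Mamba-b variants (hmambav1, hmambav2, lshgd) group hits with locality-sensitive hashing before running the sequence model. Below is a minimal, illustrative E2LSH-style bucketing sketch in PyTorch; the function name, hyperparameters, and tensor layout are assumptions for illustration and do not mirror this repository's actual implementation.

```python
import torch

def e2lsh_buckets(coords: torch.Tensor, n_hashes: int = 4,
                  bucket_width: float = 1.0, seed: int = 0) -> torch.Tensor:
    """Assign each hit (row of coords, shape [N, D]) an integer bucket id."""
    g = torch.Generator().manual_seed(seed)
    _, d = coords.shape
    a = torch.randn(d, n_hashes, generator=g)              # random projection directions
    b = torch.rand(n_hashes, generator=g) * bucket_width   # random offsets in [0, w)
    # Classic E2LSH hash: floor((a . x + b) / w) for each hash function.
    h = torch.floor((coords @ a + b) / bucket_width).long()  # [N, n_hashes]
    # Combine the per-function hashes into a single bucket id.
    primes = torch.tensor([73856093, 19349663, 83492791, 49979687][:n_hashes])
    return (h * primes).sum(dim=-1)

# Toy usage: bucket 1000 random (x, y, z) hits, then sort so hits in the
# same bucket are contiguous before they are scanned by the sequence model.
coords = torch.randn(1000, 3)
order = torch.argsort(e2lsh_buckets(coords))
```

Sorting by bucket id makes spatially nearby hits contiguous, so an O(N) sequence mixer can scan them as locally coherent segments.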
Model performance is highly sensitive to:
- Learning rate schedules
- Optimizer configurations
- Hybrid layer balancing ratios
Optimal configurations vary significantly between architectures; an illustrative (hypothetical) per-model override is sketched below.
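As a purely illustrative example (these keys and values are hypothetical and do not reflect the repository's actual configuration schema), per-model overrides for the knobs above might look like:

```python
# Hypothetical hyperparameter overrides, for illustration only.
CONFIGS = {
    "hept": {
        "optimizer": "adamw",
        "lr": 1e-3,
        "lr_schedule": "cosine",   # e.g. warmup followed by cosine decay
        "warmup_steps": 1000,
    },
    "fullhybrid2": {
        "optimizer": "adamw",
        "lr": 5e-4,
        "lr_schedule": "cosine",
        "hybrid_ratio": 0.5,       # fraction of LSH-attention vs. Mamba sublayers
    },
}
```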
Key directions explored in this work:
- Combining the best elements of attention, RNN, and SSM architectures (a toy hybrid block is sketched after this list)
- Novel hybrid approaches balancing computational efficiency and physics performance
- Efficient models for hit-level tasks
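As a rough illustration of the attention/SSM mix (a toy sketch with assumed module names, not this repository's actual hybrid layer), a block could interleave an attention sublayer with a gated linear-recurrence sublayer:

```python
import torch
import torch.nn as nn

class GatedLinearRecurrence(nn.Module):
    """Toy SSM-style mixer: per-channel gated cumulative recurrence, O(N)."""
    def __init__(self, dim: int):
        super().__init__()
        self.in_proj = nn.Linear(dim, 2 * dim)
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, x):                        # x: [batch, length, dim]
        v, g = self.in_proj(x).chunk(2, dim=-1)
        a = torch.sigmoid(g)                     # input-dependent decay in (0, 1)
        h = torch.zeros_like(v[:, 0])
        outs = []
        for t in range(x.size(1)):               # sequential scan (clear, not fast)
            h = a[:, t] * h + (1 - a[:, t]) * v[:, t]
            outs.append(h)
        return self.out_proj(torch.stack(outs, dim=1))

class HybridBlock(nn.Module):
    """Attention sublayer followed by an SSM-style sublayer, each with residuals."""
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.ssm = GatedLinearRecurrence(dim)

    def forward(self, x):
        h = self.norm1(x)
        x = x + self.attn(h, h, h, need_weights=False)[0]
        return x + self.ssm(self.norm2(x))

block = HybridBlock(dim=64)
hits = torch.randn(2, 512, 64)                   # [batch, hits, features]
out = block(hits)                                # same shape as the input
```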
Dataset: Models are trained on the TrackML dataset (6-60k hits per event). Preprocessing code is adapted from HEPT and GNN_Tracking.
Acknowledgments: We thank the TrackML challenge for providing the dataset and acknowledge the following papers that inspired this work:
[1]: arXiv:2312.03823 [2]: arXiv:2402.12535 [3]: arXiv:2407.13925
Stay connected for updates!
