Source code for the paper "A Variance-Reduced and Stabilized Proximal Stochastic Gradient Method with Support Identification Guarantees for Structured Optimization", accepted at AISTATS 2023.
This repo contains implementations of a collection of stochastic first-order methods, including ProxSVRG, SAGA, RDA, PStorm, and S-PStorm. The repo has two main directories, src and test: src/solvers contains the source code for all algorithm implementations, and the test directory contains the scripts needed to reproduce the results reported in the paper.
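All of these methods target composite problems of the form min_x f(x) + λ·r(x), alternating a (variance-reduced) stochastic gradient step with a proximal step on the regularizer. As a point of reference only, and not the repo's own code, a minimal sketch of a single proximal stochastic gradient step with an ℓ1 regularizer (where the prox is soft thresholding) looks like this; the function names are illustrative:

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (soft thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def prox_sgd_step(x, grad_sample, alpha, lam):
    """One proximal stochastic gradient step for min f(x) + lam * ||x||_1:
    take a gradient step along a sampled gradient, then apply the prox
    of the regularizer with step size alpha."""
    return soft_threshold(x - alpha * grad_sample, alpha * lam)
```

The methods in src/solvers differ mainly in how grad_sample is constructed (variance reduction, averaging, momentum), which is what drives their support-identification behavior.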
Navigate to the test/data_prep directory and do the following steps.

- Run `bash download.sh` to download the 10 datasets.
- Run `bash process.sh` to perform data preprocessing.
- Run `python compute_Lip.py` to compute estimates of the Lipschitz constants.
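The repo's script is the authoritative version; purely as a hedged illustration of the idea, for an averaged logistic loss the Lipschitz constant of the gradient can be bounded by sigma_max(A)^2 / (4n), with the leading singular value of the data matrix A estimated by power iteration. The function below is an assumption-laden sketch, not the repo's API:

```python
import numpy as np

def estimate_lipschitz_logistic(A, n_iter=100, seed=0):
    """Illustrative sketch (not the repo's compute_Lip script).
    For f(x) = (1/n) * sum_i log(1 + exp(-b_i * a_i^T x)) the Hessian
    satisfies H <= A^T A / (4n), so L <= sigma_max(A)^2 / (4n);
    sigma_max(A)^2 is estimated by power iteration on A^T A."""
    n = A.shape[0]
    v = np.random.default_rng(seed).standard_normal(A.shape[1])
    v /= np.linalg.norm(v)
    sigma_sq = 0.0
    for _ in range(n_iter):
        w = A.T @ (A @ v)              # one power-iteration step on A^T A
        sigma_sq = np.linalg.norm(w)   # current estimate of sigma_max(A)^2
        v = w / sigma_sq
    return sigma_sq / (4.0 * n)
```

The same call works for a dense NumPy array or a scipy.sparse matrix A, since both support the matrix-vector products used above.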
- Navigate to the directory: `cd test/bash`.
- Generate the bash scripts: `python create_bash.py`.
- Run the command `bash submit` and the experiments will run in the background. The logs of each run can be found at test/bash/log, and the results will be saved at test/experiments.
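For context, a submit script of this kind typically launches each generated experiment script in the background and redirects its output to a per-run log file. A minimal bash sketch under that assumption follows; the `run_*.sh` naming and the log directory layout are illustrative, not necessarily what create_bash.py produces:

```bash
#!/usr/bin/env bash
# Illustrative launcher sketch (not the repo's actual submit script):
# run every generated experiment script in the background,
# writing one log file per run under log/.
mkdir -p log
for script in run_*.sh; do
    nohup bash "$script" > "log/${script%.sh}.log" 2>&1 &
done
echo "Experiments launched; logs are written under log/"
```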