SBLLM

This is the repo of our ICSE'25 paper: "Search-Based LLMs for Code Optimization".

Dependencies

Python == 3.9.12

C++17

GCC 9.4.0

Linux

Run the following command in the root directory of this repo:

pip install -r requirements.txt

Usage

  1. Download the processed dataset and test cases based on the instructions in the processed_data/ folder.

  2. Our code relies on the services of OpenAI (for ChatGPT and GPT-4), Google (for Gemini), and DeepInfra (for CodeLLaMa), so you first need to obtain their API keys and fill them in baselines/baselines.py and sbllm/evol_query.py.
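Rather than hardcoding the keys in those two files, you may prefer to read them from environment variables. The helper below is a hypothetical sketch, not part of the repo; the environment-variable names are illustrative assumptions:

```python
import os

# Hypothetical helper (not in this repo): look up the API key for each
# provider from an environment variable instead of hardcoding it.
# The variable names below are illustrative, not defined by SBLLM.
def load_api_key(provider: str) -> str:
    env_names = {
        "openai": "OPENAI_API_KEY",       # ChatGPT / GPT-4
        "google": "GOOGLE_API_KEY",       # Gemini
        "deepinfra": "DEEPINFRA_API_KEY",  # CodeLLaMa
    }
    key = os.environ.get(env_names[provider], "")
    if not key:
        raise RuntimeError(
            f"Missing API key for {provider}; set {env_names[provider]}"
        )
    return key
```

You would then paste `load_api_key("openai")` (etc.) at the points in baselines/baselines.py and sbllm/evol_query.py where the keys are currently filled in by hand.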

  3. SBLLM acquires its initialization results from the COT prompt, so you first need to obtain the COT prompt's predictions by running:

cd baselines  
bash cot.sh

You can change the model name in cot.sh to experiment with different models (i.e., chatgpt, gpt4, gemini, codellama).

  4. Get the initialization solutions for SBLLM by processing the COT predictions:
cd sbllm   
python initial.py --model_name model_name --lang lang
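For reference, the two flags above can be sketched as a minimal argparse interface. This is an illustration of the arguments the command expects, not the actual definition in initial.py (the real script may accept more options, and the language choices here are an assumption based on the C++/Python dependencies listed above):

```python
import argparse

# Illustrative sketch (assumption, not the repo's code) of the CLI that
# `python initial.py --model_name ... --lang ...` corresponds to.
def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="Build SBLLM initialization solutions from COT predictions"
    )
    parser.add_argument(
        "--model_name", required=True,
        choices=["chatgpt", "gpt4", "gemini", "codellama"],
        help="which model's COT predictions to process",
    )
    parser.add_argument(
        "--lang", required=True,
        choices=["python", "cpp"],  # assumed from the listed dependencies
        help="target language of the code to optimize",
    )
    return parser
```

A concrete invocation would then look like `python initial.py --model_name chatgpt --lang python`.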
  5. Modify the tree-sitter file path TREE_SITTER_DIR in sbllm/merge.py and run SBLLM with:
bash run.sh

SBLLM will then use default settings to optimize the code in the test set.

The default setting is ns=3 and iteration=4, which is consistent with the paper.

Data

Please follow the instructions in the processed_data/ folder to download the dataset for experiments.

Baselines

The source code of other baseline methods is in the baselines/ folder.

For direct instruction:

cd baselines  
bash direct.sh

For in-context learning:

cd baselines  
bash icl.sh

For retrieval-augmented generation:

cd baselines  
bash rag.sh

For chain-of-thought prompt:

cd baselines  
bash cot.sh
