
Self-Constructed Context Decompilation with Fined-grained Alignment Enhancement

Code for *Self-Constructed Context Decompilation with Fined-grained Alignment Enhancement* (Findings of EMNLP 2024).

Deploy

```bash
vllm serve LLM4Binary/llm4decompile-6.7b-v1.5 --port 8000 --tensor-parallel-size 1 --enable-lora --lora-modules model=ylfeng/sccdec-lora
python src/sccdec/eval.py --base_url http://127.0.0.1:8000/v1 --model_name model --one_shot
```
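
To query the served model directly, here is a minimal sketch using the OpenAI-compatible API that vLLM exposes. It assumes the server from the command above is running, that the LoRA adapter was registered under the name `model` via `--lora-modules`, and that the prompt template follows the LLM4Decompile convention (check `src/sccdec/eval.py` for the exact template); `asm_code` is a placeholder for the disassembly of the target function.

```python
# Minimal sketch: query the vLLM server started above through its
# OpenAI-compatible endpoint. Assumes `pip install openai`.
from openai import OpenAI

client = OpenAI(base_url="http://127.0.0.1:8000/v1", api_key="EMPTY")

# Placeholder: objdump-style assembly of the function to decompile.
asm_code = "..."

# Prompt template following the LLM4Decompile convention (an assumption;
# the repo's eval script may format prompts differently).
prompt = (
    "# This is the assembly code:\n"
    f"{asm_code}\n"
    "# What is the source code?\n"
)

# `model` is the adapter name registered with --lora-modules above.
response = client.completions.create(
    model="model",
    prompt=prompt,
    max_tokens=512,
    temperature=0.0,
)
print(response.choices[0].text)
```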

Results

Re-executability Rate (%) and Readability (#) on Decompile-Eval, by optimization level (O0–O3) and averaged over all levels (AVG).

| Model/Metrics | Re-exec. O0 | O1 | O2 | O3 | AVG | Read. O0 | O1 | O2 | O3 | AVG |
|---|---|---|---|---|---|---|---|---|---|---|
| **Rule-Based Decompiler** | | | | | | | | | | |
| Ghidra | 34.76 | 16.46 | 15.24 | 14.02 | 20.12 | 2.98 | 2.41 | 2.52 | 2.38 | 2.57 |
| **Refine-Based Method** | | | | | | | | | | |
| GPT-4o | 46.95 | 34.15 | 28.66 | 31.10 | 35.22 | 2.82 | 2.35 | 2.29 | 2.31 | 2.44 |
| LLM4Decompile-Ref | 74.39 | 46.95 | 47.56 | 42.07 | 52.74 | 4.08 | 3.38 | 3.34 | 3.19 | 3.50 |
| **End-to-End Method** | | | | | | | | | | |
| LLM4Decompile-End | 69.51 | 44.51 | 39.63 | 38.41 | 48.02 | 4.07 | 3.46 | 3.40 | 3.23 | 3.54 |
| FAE Decompile | 67.68 | 48.78 | 45.73 | 42.07 | 51.07 | 3.94 | 3.46 | 3.40 | 3.25 | 3.51 |
| FAE Decompile+SCC | 70.24 | 48.54 | 47.56 | 43.29 | 52.41 | 3.97 | 3.48 | 3.41 | 3.23 | 3.52 |
| ReF Decompile | 85.37 | 56.10 | 51.83 | 52.43 | 61.43 | 4.13 | 3.60 | 3.54 | 3.49 | 3.69 |

Resources

- Base model: `LLM4Binary/llm4decompile-6.7b-v1.5`
- LoRA adapter: `ylfeng/sccdec-lora` (both used in the Deploy commands above)

Reference

If you use SCCDEC in your work, please cite it as follows:

```bibtex
@inproceedings{feng-etal-2024-self,
    title = "Self-Constructed Context Decompilation with Fined-grained Alignment Enhancement",
    author = "Feng, Yunlong  and
      Teng, Dechuan  and
      Xu, Yang  and
      Mu, Honglin  and
      Xu, Xiao  and
      Qin, Libo  and
      Zhu, Qingfu  and
      Che, Wanxiang",
    editor = "Al-Onaizan, Yaser  and
      Bansal, Mohit  and
      Chen, Yun-Nung",
    booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2024",
    month = nov,
    year = "2024",
    address = "Miami, Florida, USA",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2024.findings-emnlp.385",
    pages = "6603--6614",
    abstract = "Decompilation transforms compiled code back into a high-level programming language for analysis when source code is unavailable. Previous work has primarily focused on enhancing decompilation performance by increasing the scale of model parameters or training data for pre-training. Based on the characteristics of the decompilation task, we propose two methods: (1) Without fine-tuning, the Self-Constructed Context Decompilation (sc$^2$dec) method recompiles the LLM{'}s decompilation results to construct pairs for in-context learning, helping the model improve decompilation performance. (2) Fine-grained Alignment Enhancement (FAE), which meticulously aligns assembly code with source code at the statement level by leveraging debugging information, is employed during the fine-tuning phase to achieve further improvements in decompilation. By integrating these two methods, we achieved a Re-Executability performance improvement of approximately 3.90{\%} on the Decompile-Eval benchmark, establishing a new state-of-the-art performance of 52.41{\%}. The code, data, and models are available at https://github.com/AlongWY/sccdec.",
}
```
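
The abstract above describes the sc²dec loop informally; the following is a hypothetical sketch of that loop, not the repository's implementation. `complete` stands in for a call to the served model (see Deploy), `gcc` and `objdump` are assumed to be on PATH, and handling of drafts that fail to recompile is omitted.

```python
# Hypothetical sketch of self-constructed context decompilation:
# decompile once, recompile the draft, disassemble it, then use the
# (assembly, draft) pair as an in-context example for a second pass.
import os
import subprocess
import tempfile

PROMPT = "# This is the assembly code:\n{asm}\n# What is the source code?\n"

def compile_and_disassemble(c_source: str) -> str:
    """Compile a C snippet with gcc and return its objdump disassembly."""
    with tempfile.TemporaryDirectory() as tmp:
        src = os.path.join(tmp, "draft.c")
        obj = os.path.join(tmp, "draft.o")
        with open(src, "w") as fh:
            fh.write(c_source)
        # -O0 for simplicity; in practice the draft should be compiled at
        # the same optimization level as the input assembly.
        subprocess.run(["gcc", "-c", "-O0", src, "-o", obj], check=True)
        result = subprocess.run(
            ["objdump", "-d", obj], capture_output=True, text=True, check=True
        )
        return result.stdout

def sc2dec(asm: str, complete) -> str:
    """Two-pass decompilation; `complete` maps a prompt to model output."""
    draft = complete(PROMPT.format(asm=asm))      # pass 1: plain decompile
    draft_asm = compile_and_disassemble(draft)    # recompile + disassemble
    # Pass 2: the self-constructed (assembly, source) pair becomes an
    # in-context example placed in front of the real query.
    one_shot = PROMPT.format(asm=draft_asm) + draft + "\n"
    return complete(one_shot + PROMPT.format(asm=asm))
```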

License

MIT