VDebugger

Harnessing Execution Feedback for Debugging Visual Programs

University of California Los Angeles
Figure: Overview of VDebugger.

Visual programs are executable code generated by large language models to address visual reasoning problems. They decompose complex questions into multiple reasoning steps and invoke specialized models for each step to solve the problems.

However, these programs are prone to logic errors: our preliminary evaluation shows that 58% of all errors stem from flaws in program logic. Debugging complex visual programs thus remains a major bottleneck for visual reasoning.

To address this, we introduce VDebugger, a novel critic-refiner framework trained to localize and debug visual programs by tracking execution step by step. VDebugger identifies and corrects program errors by leveraging detailed execution feedback, improving both interpretability and accuracy. Its training data is generated through an automated pipeline that injects errors into correct visual programs using a novel mask-best decoding technique.
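To make the critic-refiner loop concrete, here is a minimal sketch of how execution feedback could drive debugging. The `critic` and `refiner` below are hypothetical rule-based stand-ins for the paper's trained models, and the buggy program and environment are invented for illustration; only the overall run-critique-refine structure follows the description above.

```python
import traceback

def run_with_feedback(program: str, env: dict):
    """Execute a generated visual program and capture either its
    result or the full error trace as textual execution feedback."""
    local_vars = dict(env)
    try:
        exec(program, {}, local_vars)
        return local_vars.get("answer"), "execution succeeded"
    except Exception:
        return None, traceback.format_exc()

def critic(program: str, feedback: str) -> bool:
    # The real critic is a trained LM that localizes errors;
    # this stub simply flags any run that raised an exception.
    return "Traceback" in feedback

def refiner(program: str, feedback: str) -> str:
    # The real refiner rewrites the faulty step given the critic's
    # output; this stub patches the one injected typo below.
    return program.replace("countt(", "count(")

def debug_loop(program: str, env: dict, max_rounds: int = 3):
    """Alternate execution, critique, and refinement until the
    critic accepts the program or the round budget is exhausted."""
    for _ in range(max_rounds):
        result, feedback = run_with_feedback(program, env)
        if not critic(program, feedback):
            return program, result
        program = refiner(program, feedback)
    return program, run_with_feedback(program, env)[0]

# An injected error (misspelled call), mimicking the error-injection
# pipeline that corrupts a correct program.
buggy = "answer = countt(objects)"
env = {"count": len, "objects": [1, 2, 3]}
fixed, result = debug_loop(buggy, env)
```

Here the execution feedback is a raw traceback string; the actual system feeds richer step-by-step execution traces to the critic.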

Evaluations on six datasets demonstrate VDebugger's effectiveness, with improvements of up to 3.2% in downstream task accuracy. Further studies show that VDebugger generalizes to unseen tasks, bringing a notable 2.3% improvement on the unseen COVR task.

Figure: Comparison against existing work.


Results

Qualitative Analysis

BibTeX


@inproceedings{wu-etal-2024-vdebugger,
    title = "{VD}ebugger: Harnessing Execution Feedback for Debugging Visual Programs",
    author = "Wu, Xueqing  and
      Lin, Zongyu  and
      Zhao, Songyan  and
      Wu, Te-Lin  and
      Lu, Pan  and
      Peng, Nanyun  and
      Chang, Kai-Wei",
    editor = "Al-Onaizan, Yaser  and
      Bansal, Mohit  and
      Chen, Yun-Nung",
    booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2024",
    month = nov,
    year = "2024",
    address = "Miami, Florida, USA",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2024.findings-emnlp.575",
    doi = "10.18653/v1/2024.findings-emnlp.575",
    pages = "9845--9860"
}