PulseAugur

Language models improve static program slicing with dataflow pretraining

Researchers have developed Sliceformer, a new method for static program slicing that uses language models to identify relevant code segments. This approach addresses limitations in existing learning-based techniques by improving dependency modeling through dataflow-aware pretraining and preventing inaccurate outputs with constrained decoding. Evaluations on Java and Python benchmarks show Sliceformer significantly outperforms current methods, achieving up to a 22% increase in ExactMatch accuracy.
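The summary does not describe Sliceformer's internals, but the task it automates, backward static slicing, can be sketched in a few lines. The def/use table and `backward_slice` helper below are illustrative inventions for a toy program, not the paper's method: a statement joins the slice if it defines a variable the slicing criterion (transitively) depends on.

```python
# Toy program, one statement per line, with hand-written def/use sets.
# A real slicer would derive these from a parse or a dependence graph.
stmts = [
    (1, {"n"}, set()),               # n = 10
    (2, {"total"}, set()),           # total = 0
    (3, {"log"}, set()),             # log = []
    (4, {"i"}, {"n"}),               # for i in range(n):  (loop header)
    (5, {"total"}, {"total", "i"}),  # total += i
    (6, {"log"}, {"log", "i"}),      # log.append(i)
    (7, set(), {"total"}),           # print(total)  <- slicing criterion
]

def backward_slice(stmts, criterion_line, criterion_vars):
    """Naive line-level backward slice: iterate to a fixed point,
    keeping any statement that defines a currently-relevant variable
    and marking its used variables as relevant in turn."""
    relevant = set(criterion_vars)
    kept = {criterion_line}
    changed = True
    while changed:
        changed = False
        for line, defs, uses in reversed(stmts):
            if line in kept:
                continue
            if defs & relevant:
                kept.add(line)
                relevant |= uses
                changed = True
    return sorted(kept)

print(backward_slice(stmts, 7, {"total"}))  # -> [1, 2, 4, 5, 7]
```

Note that line 6 (`log.append(i)`) uses the relevant variable `i` but defines only `log`, so it is correctly excluded; ExactMatch accuracy in the evaluation asks whether a predicted set of lines like this one matches the ground-truth slice exactly.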

Summary written by gemini-2.5-flash-lite from 1 source.

IMPACT Improves automated code analysis accuracy, potentially aiding developers in debugging and understanding complex codebases.

RANK_REASON Academic paper introducing a novel method for static program slicing.

Read on arXiv cs.AI →

COVERAGE [1]

  1. arXiv cs.AI TIER_1 · Pengfei He, Shaowei Wang, Tse-Hsun (Peter) Chen, Muhammad Asaduzzaman ·

    Static Program Slicing Using Language Models With Dataflow-Aware Pretraining and Constrained Decoding

    arXiv:2604.26961v1 Announce Type: cross Abstract: Static program slicing is a fundamental software engineering technique for isolating code relevant to specific variables. While recent learning-based approaches using language models (LMs) show promise in automating slice predicti…
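The abstract credits constrained decoding with preventing inaccurate outputs. The paper's decoding scheme is not detailed in this card, but the general technique can be sketched: if the decoder's vocabulary is the set of line numbers in the input program (plus an end token), masking everything else guarantees the model can never emit a line that does not exist. The `constrained_argmax` helper and the fake logits below are assumptions for illustration, not the paper's implementation.

```python
def constrained_argmax(logits, valid_tokens):
    """Greedy decoding step restricted to valid tokens: scores for
    tokens outside the allowed set are simply never considered."""
    masked = {tok: score for tok, score in logits.items() if tok in valid_tokens}
    return max(masked, key=masked.get)

# The input program has 7 lines, so only 1..7 and "EOS" are legal outputs.
num_lines = 7
valid = set(range(1, num_lines + 1)) | {"EOS"}

# Fake model logits: the unconstrained argmax would be line 12,
# which does not exist in the program (a hallucinated slice line).
logits = {5: 2.3, 12: 3.1, 2: 1.9, "EOS": 0.4}

print(constrained_argmax(logits, valid))  # -> 5
```

Without the mask this step would emit line 12; with it, the highest-scoring legal token (line 5) is chosen instead, which is the sense in which constrained decoding rules out invalid slices by construction.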