What is it about?

In this paper, we devise a pushdown automaton (PDA)-based methodology that makes the first attempt at grammatical Seq2Seq models for general-purpose code generation. We exploit the principle that a programming language (PL) is a subset of the language recognizable by a PDA, so any code the PDA accepts is grammatical. Specifically, we construct a PDA module and design an algorithm that constrains the generation of Seq2Seq models so that every output is guaranteed to be grammatically correct. Guided by this methodology, we further propose CODEP, a code generation framework equipped with a PDA module that integrates PDA deduction into deep learning. The framework leverages the state of the PDA deduction (via state representations, a state prediction task, and joint prediction with the state) to help models learn the deduction process.
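To make the core idea concrete, here is a minimal, hypothetical sketch (not the paper's actual implementation) of PDA-constrained decoding: a toy PDA that accepts balanced parentheses masks the model's next-token choices at each step, so every emitted prefix stays grammatical no matter what the model prefers. The `toy_model` stand-in and token names are illustrative assumptions.

```python
VOCAB = ["(", ")", "<eos>"]

def pda_allowed(stack):
    """Return the set of tokens the toy PDA permits next.

    The PDA pushes on "(" and pops on ")"; "<eos>" is legal only when
    the stack is empty, i.e. the output so far is balanced."""
    allowed = {"("}
    if stack:
        allowed.add(")")   # can only close if something is open
    else:
        allowed.add("<eos>")
    return allowed

def constrained_decode(scores, max_len=10):
    """Greedy decoding where PDA-illegal tokens are masked out.

    `scores` stands in for the Seq2Seq model: a function mapping the
    partial output to a dict of token scores (higher = preferred)."""
    stack, output = [], []
    for _ in range(max_len):
        legal = pda_allowed(stack)
        # Mask: keep only the tokens the PDA deduction allows here.
        candidates = {t: s for t, s in scores(output).items() if t in legal}
        token = max(candidates, key=candidates.get)
        if token == "<eos>":
            break
        if token == "(":
            stack.append("(")
        else:
            stack.pop()
        output.append(token)
    return "".join(output)

# A toy "model" that always prefers ")" early on; unconstrained it would
# immediately emit ungrammatical output, but the PDA mask prevents that.
def toy_model(out):
    if len(out) >= 4:
        return {"(": 0.3, ")": 0.6, "<eos>": 0.9}
    return {"(": 0.3, ")": 0.6, "<eos>": 0.1}

print(constrained_decode(toy_model))  # → ()()
```

The same masking idea scales to a real PL grammar: the PDA tracks the parse state during decoding and zeroes out the probability of any token that would lead outside the grammar.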

This page is a summary of: CODEP: Grammatical Seq2Seq Model for General-Purpose Code Generation, July 2023, ACM (Association for Computing Machinery), DOI: 10.1145/3597926.3598048.