Code for "Does Structure Matter? Leveraging Data-to-Text Generation for Answering Complex Information Needs"
The Transformers implementation is based on Hugging Face.
The TREC CAR dataset is used: benchmarkY1test for testing and the large-scale training data for training. Download from: here
The various dataset adaptations are created with the data_construction/make_data.py script. Data construction relies on the TREC CAR tools, which must be cloned or copied into the data_construction folder.
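A minimal setup sketch, assuming the TREC CAR tools refer to the standard trec-car-tools GitHub repository (an assumption; substitute your own copy if different). The arguments expected by make_data.py are defined in the script itself and are not guessed here.

```shell
# Clone trec-car-tools into the data_construction folder (assumed repository URL).
git clone https://github.com/TREC-CAR/trec-car-tools.git data_construction/trec-car-tools

# Build the dataset adaptations; see the script for its expected arguments.
python data_construction/make_data.py
```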
The cogecsea folder contains implementations of: a fine-tuned T5 (from Hugging Face), a sequential planning model based on T5, and an end-to-end planning model using T5.
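Since all three models build on T5 via Hugging Face transformers, inference follows the standard T5 API. A minimal sketch is below; the tiny randomly initialized config is a stand-in so the snippet runs without downloads — in practice you would load one of the repo's fine-tuned checkpoints with `T5ForConditionalGeneration.from_pretrained(<checkpoint_dir>)` (checkpoint paths are repo-specific and not shown here).

```python
import torch
from transformers import T5Config, T5ForConditionalGeneration

# Tiny random-init T5 as a stand-in for a fine-tuned checkpoint (assumption:
# the real models are loaded with from_pretrained on a saved checkpoint dir).
config = T5Config(
    vocab_size=64, d_model=16, d_ff=32, d_kv=8,
    num_layers=2, num_heads=2, decoder_start_token_id=0,
)
model = T5ForConditionalGeneration(config)
model.eval()

# Toy token ids standing in for a tokenized query; a real run would use the
# matching T5 tokenizer to encode the input text.
input_ids = torch.tensor([[3, 5, 7, 2]])
with torch.no_grad():
    output_ids = model.generate(input_ids, max_length=6)

print(output_ids.shape)  # batch of 1, at most 6 generated token ids
```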