
AdaRewriter: Unleashing the Power of Prompting-based Conversational Query Reformulation via Test-Time Adaptation (EMNLP 2025)

License: MIT · arXiv: 2506.01381

💻 Code

Prerequisites

  • Before running the scripts, make sure to update all corresponding file paths in the provided scripts (i.e., `./scripts`).
  • For open-source LLMs, we use vLLM with an OpenAI-compatible API server for local deployment. You can launch it as follows:

```bash
vllm serve Meta-Llama-3.1-8B-Instruct --dtype auto --api-key ollama --max-model-len 16000 --port 8000 --tensor_parallel_size 1 --enable-prefix-caching
```
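Once the server is running, you can sanity-check the endpoint with the OpenAI Python client. A minimal sketch, assuming the server settings above (the prompt is illustrative):

```python
from openai import OpenAI

# Point the client at the local vLLM server launched above.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="ollama")

response = client.chat.completions.create(
    model="Meta-Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "Say hello in one word."}],
    max_tokens=16,
)
print(response.choices[0].message.content)
```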

Candidate Construction

Run the following scripts to generate candidate query reformulations for training and testing (a conceptual sketch of the sampling step follows the commands):

```bash
bash src/LLM4CS/run_train.sh ## For training candidates
bash src/LLM4CS/run_test.sh  ## For testing candidates
```
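Conceptually, candidate construction samples several reformulations per conversation turn from the LLM. A hedged sketch of the idea using the local endpoint from above; the prompt, candidate count, and temperature here are illustrative, not the repository's exact settings:

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="ollama")

def sample_candidates(context: str, question: str, n: int = 5) -> list[str]:
    """Sample n candidate rewrites of `question` given the dialogue context."""
    prompt = (
        "Rewrite the final question into a self-contained search query.\n"
        f"Context:\n{context}\nQuestion: {question}\nRewrite:"
    )
    response = client.chat.completions.create(
        model="Meta-Llama-3.1-8B-Instruct",
        messages=[{"role": "user", "content": prompt}],
        n=n,              # one completion per candidate
        temperature=0.7,  # sampling diversity across candidates
        max_tokens=64,
    )
    return [choice.message.content.strip() for choice in response.choices]
```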

Then, obtain the sparse and dense retrieval scores using:

```bash
bash scripts/obtain_ranking_bm25.sh ## For sparse scores
bash scripts/obtain_ranking_ance.sh ## For dense scores
```
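For reference, sparse (BM25) scoring of a rewrite against a prebuilt index looks roughly like the following sketch with Pyserini's `LuceneSearcher`; the index path and parameters are assumptions, not values taken from the scripts:

```python
from pyserini.search.lucene import LuceneSearcher

# Hypothetical index path; substitute the index built for your collection.
searcher = LuceneSearcher("indexes/topiocqa-bm25")
searcher.set_bm25(k1=0.9, b=0.4)

hits = searcher.search("what is the capital of France", k=100)
for hit in hits[:3]:
    print(hit.docid, round(hit.score, 2))
```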

After retrieving both sparse and dense scores, run the following script to compute the fusion scores:

```bash
python src/processing_rank.py
```
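The fusion step combines the sparse and dense score lists for each candidate. As a simple illustration, a min-max-normalized linear interpolation could look like this; the normalization and weighting are assumptions for exposition, so see `src/processing_rank.py` for the actual computation:

```python
def minmax(scores: list[float]) -> list[float]:
    """Scale scores into [0, 1]; constant lists map to all zeros."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) if hi > lo else 0.0 for s in scores]

def fuse(sparse: list[float], dense: list[float], alpha: float = 0.5) -> list[float]:
    """Linear interpolation of normalized sparse and dense scores."""
    s, d = minmax(sparse), minmax(dense)
    return [alpha * si + (1 - alpha) * di for si, di in zip(s, d)]

# Example: fuse per-candidate retrieval scores.
print(fuse([12.3, 8.1, 10.5], [0.71, 0.64, 0.69]))
```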

Model Training

After obtaining the reformulation candidates, train the model using:

```bash
bash scripts/train.sh
```

Inference

To generate predictions for the test sets (e.g., TopiOCQA, QReCC, ...), run:

```bash
bash src/LLM4CS/run_test.sh
```

Then, use the reward model to obtain the final predictions:

```bash
bash scripts/test.sh
```
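Conceptually, the reward model scores each candidate reformulation at test time, and the highest-scoring candidate becomes the final prediction. A minimal sketch of that selection step, where `score` is a hypothetical stand-in for the trained reward model:

```python
def select_best(candidates: list[str], score) -> str:
    """Return the candidate that the reward model ranks highest."""
    return max(candidates, key=score)

# Illustrative usage with a dummy scorer; replace with the trained reward model.
candidates = ["capital of France", "what is the capital city of France"]
print(select_best(candidates, score=len))
```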

We provide the following scripts to evaluate the retrieval performance:

```bash
bash scripts/run_dense_search_direct.sh        ## For TopiOCQA, QReCC, CAsT 19, CAsT 20
bash scripts/run_dense_search_direct_cast21.sh ## For CAsT 21
```
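If you prefer to compute retrieval metrics on a run yourself, `pytrec_eval` handles TREC-style qrels and runs. A short sketch with toy data; in practice, load the qrels and run from files:

```python
import pytrec_eval

# Toy qrels and run; in practice these come from TREC-format files.
qrels = {"q1": {"d1": 1, "d2": 0}}
run = {"q1": {"d1": 0.9, "d2": 0.4}}

evaluator = pytrec_eval.RelevanceEvaluator(qrels, {"map", "recip_rank", "ndcg"})
print(evaluator.evaluate(run)["q1"])  # per-query metric values
```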

📖 Citation

```bibtex
@article{adarewriter,
  title={AdaRewriter: Unleashing the Power of Prompting-based Conversational Query Reformulation via Test-Time Adaptation},
  author={Lai, Yilong and Wu, Jialong and Wang, Zhenglin and Zhou, Deyu},
  journal={arXiv preprint arXiv:2506.01381},
  year={2025}
}
```

πŸ™ Acknowledgment

We are very grateful to leverage prior works & source code to build this work, which includes ConvGQR, InfoCQR, cs-shortcut, LLM4CS, BRIO.

If you have any questions about AdaRewriter, feel free to contact me by yilong.lai@seu.edu.cn.
