This repository provides the prompts, LLM outputs, and code implementation for our paper Large Language Model Cascades with Mixture of Thoughts Representations for Cost-efficient Reasoning, which has been accepted to ICLR'24!
Please cite our paper if you find our work/code helpful!
@article{Yue2023LargeLM,
  title={Large Language Model Cascades with Mixture of Thoughts Representations for Cost-efficient Reasoning},
  author={Murong Yue and Jie Zhao and Min Zhang and Liang Du and Ziyu Yao},
  journal={ArXiv},
  year={2023},
  volume={abs/2310.03094},
  url={https://api.semanticscholar.org/CorpusID:263671564}
}
To install the required dependencies, run:

pip install -r requirements.txt
Results obtained with different temperature settings are provided in the corresponding folders. All prompts are in Evaluation/prompting.
To evaluate the results of any setting, run:

python Evaluation/${FILE}
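As a minimal sketch of how the command above expands, where ${FILE} is set to an evaluation script inside Evaluation/ (the file name eval_cascade.py below is a hypothetical placeholder, not an actual file in this repository):

```shell
# Hypothetical example: ${FILE} names an evaluation script under Evaluation/.
# "eval_cascade.py" is purely illustrative; substitute a real script from the folder.
FILE="eval_cascade.py"
echo "python Evaluation/${FILE}"   # prints: python Evaluation/eval_cascade.py
```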