
Commit: update bibtex and paper links

ymcui committed Jul 16, 2020
1 parent d69a96e commit 69b87d8
Showing 2 changed files with 21 additions and 17 deletions.
18 changes: 10 additions & 8 deletions README.md

@@ -22,7 +22,7 @@

 **TextBrewer** is a PyTorch-based model distillation toolkit for natural language processing. It includes various distillation techniques from both the NLP and CV fields and provides an easy-to-use distillation framework, which allows users to quickly experiment with state-of-the-art distillation methods to compress a model with a relatively small sacrifice in performance, increasing inference speed and reducing memory usage.
 
-Paper: [https://arxiv.org/abs/2002.12620](https://arxiv.org/abs/2002.12620)
+Check our paper through [ACL Anthology](https://www.aclweb.org/anthology/2020.acl-demos.2/) or [arXiv pre-print](https://arxiv.org/abs/2002.12620).
 
 [Full Documentation](https://textbrewer.readthedocs.io/)

@@ -402,14 +402,16 @@ We recommend that users use pre-trained student models whenever possible to full

 ## Citation
 
-If you find TextBrewer is helpful, please cite [our paper](https://arxiv.org/abs/2002.12620):
-```
+If you find TextBrewer helpful, please cite [our paper](https://www.aclweb.org/anthology/2020.acl-demos.2/):
+```bibtex
 @InProceedings{textbrewer-acl2020-demo,
-  author = "Yang, Ziqing and Cui, Yiming and Chen, Zhipeng and Che, Wanxiang and Liu, Ting and Wang, Shijin and Hu, Guoping",
-  title = "{T}ext{B}rewer: {A}n {O}pen-{S}ource {K}nowledge {D}istillation {T}oolkit for {N}atural {L}anguage {P}rocessing",
-  booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations",
-  year = "2020",
-  publisher = "Association for Computational Linguistics"
+  title = "{T}ext{B}rewer: {A}n {O}pen-{S}ource {K}nowledge {D}istillation {T}oolkit for {N}atural {L}anguage {P}rocessing",
+  author = "Yang, Ziqing and Cui, Yiming and Chen, Zhipeng and Che, Wanxiang and Liu, Ting and Wang, Shijin and Hu, Guoping",
+  booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations",
+  year = "2020",
+  publisher = "Association for Computational Linguistics",
+  url = "https://www.aclweb.org/anthology/2020.acl-demos.2",
+  pages = "9--16",
 }
 ```
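As an aside, the soft-label distillation objective that toolkits like TextBrewer build on (the toolkit described in the README above) can be sketched in a few lines of plain Python: the student is trained to match the teacher's temperature-softened output distribution, with the loss scaled by T², following Hinton et al. (2015). This is a stdlib-only illustration of the idea, not TextBrewer's actual API; all names below are hypothetical.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                              # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    Scaled by T^2 so gradient magnitudes stay comparable as the
    temperature varies (the convention from Hinton et al., 2015).
    """
    p = softmax(teacher_logits, temperature)     # soft teacher targets
    q = softmax(student_logits, temperature)     # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Identical logits give zero loss; diverging logits give a positive loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # → 0.0
print(distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0]) > 0)
```

In practice this term is combined with the ordinary hard-label cross-entropy on the student, which is what a distillation framework manages for you.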

20 changes: 11 additions & 9 deletions README_ZH.md

@@ -21,7 +21,7 @@
 It incorporates and improves multiple knowledge distillation techniques from NLP and CV, and provides a convenient and fast distillation framework
 for compressing neural network models with relatively low performance loss, speeding up model inference and reducing memory usage.
 
-Paper: [https://arxiv.org/abs/2002.12620](https://arxiv.org/abs/2002.12620)
+Check our paper through [ACL Anthology](https://www.aclweb.org/anthology/2020.acl-demos.2/) or [arXiv pre-print](https://arxiv.org/abs/2002.12620).
 
 [Full Documentation](https://textbrewer.readthedocs.io/)

@@ -391,17 +391,19 @@ Distillers carry out the actual distillation process. The following distillers are currently implemented:

 ## Citation
 
-If the TextBrewer toolkit has helped your research, please cite the [technical report](https://arxiv.org/abs/2002.12620) below:
+If the TextBrewer toolkit has helped your research, please cite our [paper](https://www.aclweb.org/anthology/2020.acl-demos.2/):
 
-```
+```bibtex
 @InProceedings{textbrewer-acl2020-demo,
-  author = "Yang, Ziqing and Cui, Yiming and Chen, Zhipeng and Che, Wanxiang and Liu, Ting and Wang, Shijin and Hu, Guoping",
-  title = "{T}ext{B}rewer: {A}n {O}pen-{S}ource {K}nowledge {D}istillation {T}oolkit for {N}atural {L}anguage {P}rocessing",
-  booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations",
-  year = "2020",
-  publisher = "Association for Computational Linguistics"
+  title = "{T}ext{B}rewer: {A}n {O}pen-{S}ource {K}nowledge {D}istillation {T}oolkit for {N}atural {L}anguage {P}rocessing",
+  author = "Yang, Ziqing and Cui, Yiming and Chen, Zhipeng and Che, Wanxiang and Liu, Ting and Wang, Shijin and Hu, Guoping",
+  booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations",
+  year = "2020",
+  publisher = "Association for Computational Linguistics",
+  url = "https://www.aclweb.org/anthology/2020.acl-demos.2",
+  pages = "9--16",
 }
 ```

 ## Follow Us

