epfl-dlab/GCD
Grammar-Constrained Decoding for Structured NLP Tasks without Finetuning


🌟 New Implementation Compatible with HuggingFace Transformers

We provide an implementation of GCD that is compatible with the popular Transformers library!

This new package, Transformers-CFG, extends the capabilities of our Grammar-Constrained Decoding (GCD) approach by integrating seamlessly with the Transformers library. It offers:

  • Easy Integration: Quickly combine the power of GCD with any model listed in the transformers library with just a few lines of code!
  • Enhanced Performance: Leverage the GCD technique for more efficient and accurate generation.
  • Friendly Interface: Implemented with the EBNF grammar interface, making it accessible for both beginners and experts.

Get started with Transformers-CFG here.
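To give a flavor of the EBNF grammar interface mentioned above, here is an illustrative grammar fragment only (the exact syntax accepted by Transformers-CFG is documented in that repository). It constrains generation to a bracketed, comma-separated list of single digits:

```ebnf
root  ::= "[" digit ("," digit)* "]"
digit ::= "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"
```

With such a grammar loaded, the decoder can only produce strings like `[1,4,9]`, regardless of what the unconstrained model would prefer.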


1. Overview of GCD
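The core idea of GCD can be sketched in a few lines: at each decoding step, tokens that the grammar does not permit next are masked out (set to negative infinity) before the next token is chosen, so the output is guaranteed to stay inside the grammar without any finetuning. The snippet below is a self-contained toy illustration of this masking loop; the vocabulary, the `silly_lm` scoring function, and the hard-coded grammar are all hypothetical stand-ins, not the actual GCD implementation.

```python
import math

# Toy vocabulary. A valid output under our toy grammar is
# digit ("+" digit)* , e.g. "2+1+2".
VOCAB = ["0", "1", "2", "+", "<eos>"]

def allowed_tokens(prefix):
    """Return the set of tokens the toy grammar permits next."""
    if not prefix or prefix[-1] == "+":
        # The string must start (or continue after '+') with a digit.
        return {"0", "1", "2"}
    # After a digit: another digit, a '+', or end of sequence.
    return {"0", "1", "2", "+", "<eos>"}

def constrained_greedy_decode(logits_fn, max_steps=5):
    """Greedy decoding where grammar-disallowed tokens get -inf logits."""
    out = []
    for _ in range(max_steps):
        logits = logits_fn(out)              # one score per VOCAB entry
        mask = allowed_tokens(out)
        masked = [score if tok in mask else -math.inf
                  for tok, score in zip(VOCAB, logits)]
        tok = VOCAB[masked.index(max(masked))]
        if tok == "<eos>":
            break
        out.append(tok)
    return "".join(out)

# A stand-in "language model" that always prefers '+'; the grammar
# mask forces it to emit digits where the grammar requires them.
def silly_lm(prefix):
    return [0.1, 0.2, 0.3, 0.9, 0.5]

print(constrained_greedy_decode(silly_lm))  # → 2+2+2
```

Without the mask, `silly_lm` would greedily emit `+` at every step, an invalid string; with it, every output is grammatical. The real system applies the same principle with an incremental parser over the model's full subword vocabulary.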

2. Environment Setup

With the repository cloned, we recommend creating a new conda virtual environment:

conda create -n GCD python=3.9
conda activate GCD

Install the required packages:

pip install -r requirements.txt

Experiments

Citation

This repository contains the code for the models and experiments in "Grammar-Constrained Decoding for Structured NLP Tasks without Finetuning".

@inproceedings{geng-etal-2023-grammar,
	title        = {Grammar-Constrained Decoding for Structured {NLP} Tasks without Finetuning},
	author       = {Geng, Saibo  and Josifoski, Martin  and Peyrard, Maxime  and West, Robert},
	year         = 2023,
	month        = dec,
	booktitle    = {Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing},
	publisher    = {Association for Computational Linguistics},
	address      = {Singapore},
	url          = {https://aclanthology.org/2023.emnlp-main.674},
	editor       = {Bouamor, Houda  and Pino, Juan  and Bali, Kalika}
}

If you found the provided resources useful, please consider citing our work.

License

This project is licensed under the terms of the MIT license.
