This repository has been archived by the owner on Apr 9, 2022. It is now read-only.

allennlp-semparse

A framework for building semantic parsers (including neural module networks) with AllenNLP, built by the authors of AllenNLP

Installing

allennlp-semparse is available on PyPI. You can install it with pip:

pip install allennlp-semparse
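
As a quick sanity check that the install worked, you can try importing the package. The top-level import of DomainLanguage and predicate below is an assumption about how the package exposes its main names (they are the two names used in the sketch under Tutorials):

    import allennlp_semparse  # raises ImportError if the install did not succeed
    from allennlp_semparse import DomainLanguage, predicate  # assumed top-level exports
    print("allennlp-semparse imported successfully")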

Supported datasets

  • ATIS
  • Text2SQL
  • NLVR
  • WikiTableQuestions

Supported models

  • Grammar-based decoding models, following the parser originally introduced in Neural Semantic Parsing with Type Constraints for Semi-Structured Tables. The models that are currently checked in are all based on this parser, applied to various datasets.
  • Neural module networks. We don't have models checked in for this yet, but DomainLanguage supports defining them, and we will add some models to the repo once papers go through peer review. The code is slow (batching is hard), but it works.

Tutorials

Coming sometime in the future... You can look at this old tutorial, but the part about using NLTK to define a grammar is outdated. Now you can use DomainLanguage to define a python executor, and we analyze the type annotations on the functions in that executor to automatically infer a grammar for you. It is much easier to use than it used to be.

Until we get around to writing a better tutorial, the best way to get started is to look at some examples. The simplest is the Arithmetic language in the DomainLanguage test (there's also a bit of description in the DomainLanguage docstring). After looking at those, you can look at more complex (real) examples in the domain_languages module. Note that the executor you define can have learned parameters, making it a neural module network; the best place to see an example of that is currently this unfinished implementation of N2NMNs on the CLEVR dataset. We'll have more examples of doing this in the not-too-distant future.
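
To give a flavor of what defining a language looks like, here is a minimal sketch loosely modeled on that Arithmetic example. Treat it as a sketch rather than the checked-in code: the import paths, constructor arguments, and the particular constants used below are assumptions, and the test file remains the authoritative reference.

    from numbers import Number

    from allennlp_semparse import DomainLanguage, predicate  # assumed top-level exports


    class Arithmetic(DomainLanguage):
        def __init__(self):
            # start_types lists the types allowed at the root of a program; allowed_constants
            # enumerates the terminal values that may appear in logical forms (the induced
            # grammar has to be finite, so arbitrary constants are not allowed by default).
            super().__init__(
                start_types={Number},
                allowed_constants={"2": 2, "3": 3, "10": 10},
            )

        @predicate
        def add(self, num1: Number, num2: Number) -> Number:
            return num1 + num2

        @predicate
        def multiply(self, num1: Number, num2: Number) -> Number:
            return num1 * num2


    language = Arithmetic()

    # Logical forms are s-expressions over the predicates defined above, and the class
    # doubles as the executor for those logical forms.
    print(language.execute("(add 2 (multiply 3 10))"))  # -> 32

    # The type annotations on the predicates also induce a grammar; this converts a logical
    # form into the sequence of production rules that a grammar-based decoder predicts.
    print(language.logical_form_to_action_sequence("(add 2 3)"))

The real languages in the domain_languages module follow the same pattern, just with many more predicates (and, for neural module networks, predicates whose implementations can carry learned parameters).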
