Text-to-LogicForm

Text-to-LogicForm is a simple codebase for leveraging a syntactic graph for semantic parsing with a novel Graph2Seq model.

Dependencies and Tools

Text-to-LogicForm consists of two parts of code. The first part is a novel Graph2Seq model that performs the task; the code can be found here: https://github.com/IBM/Graph2Seq. The second part is pre-processing code for converting text to a syntactic graph (this part will be released soon).

Graph2Seq

Graph2Seq is a simple codebase for building a graph encoder and sequence decoder for NLP and other AI/ML/DL tasks.

How To Run The Code

To train your graph-to-sequence model, you need:

(1) Prepare your train/dev/test data in the following form:

Each line is a JSON object with the keys "seq", "g_ids", "g_id_features", and "g_adj":
"seq" is the text that the decoder is expected to output
"g_ids" is a mapping from each node ID to its ID in the graph
"g_id_features" is a mapping from each node ID to its text features
"g_adj" is a mapping from each node ID to its adjacent nodes (represented as their IDs)

See data/no_cycle/train.data for examples; a minimal sketch of how one such line could be written follows below.
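For illustration, here is a minimal Python sketch that writes one training line in the format described above. Only the field names come from the description; the target sequence, node texts, and graph structure are made-up toy values.

```python
import json

# Toy example: a three-node syntactic graph whose target output is "answer ( A )".
# The field names follow the data format described above; all values are illustrative.
example = {
    "seq": "answer ( A )",                                        # text the decoder should produce
    "g_ids": {"0": 0, "1": 1, "2": 2},                            # node ID -> its ID in the graph
    "g_id_features": {"0": "what", "1": "state", "2": "border"},  # node ID -> text features
    "g_adj": {"0": ["1"], "1": ["2"], "2": []},                   # node ID -> adjacent node IDs
}

# Each line of train.data (and dev/test data) is one such JSON object.
with open("train.data", "a") as f:
    f.write(json.dumps(example) + "\n")
```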

(2) Modify the hyper-parameters in main/configure.py according to your task.
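As a rough sketch of what such a configuration might contain: the two sampling settings correspond to the command-line flags used below, while the remaining variable names and values are hypothetical placeholders, not the actual contents of main/configure.py.

```python
# Illustrative placeholders only; consult main/configure.py for the real variable names.
train_data_path = "data/no_cycle/train.data"  # hypothetical: path to training data
dev_data_path = "data/no_cycle/dev.data"      # hypothetical: path to dev data
epochs = 100                                  # hypothetical: number of training epochs
learning_rate = 0.001                         # hypothetical: optimizer learning rate
sample_size_per_layer = 10                    # matches the -sample_size_per_layer flag
sample_layer_size = 2                         # matches the -sample_layer_size flag
```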

(3) Train the model by running "python run_model.py train -sample_size_per_layer=xxx -sample_layer_size=yyy". The model that performs best on the dev data will be saved in the directory "saved_model".

(4) Test the model by running "python run_model.py test -sample_size_per_layer=xxx -sample_layer_size=yyy". The prediction results will be saved in saved_model/prediction.txt.
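Putting steps (3) and (4) together, a concrete invocation could look like the following; the values 10 and 2 for the two sampling flags are illustrative placeholders, not recommended settings.

```sh
# Train: the model that performs best on the dev data is saved under saved_model/
python run_model.py train -sample_size_per_layer=10 -sample_layer_size=2

# Test: predictions are written to saved_model/prediction.txt
python run_model.py test -sample_size_per_layer=10 -sample_layer_size=2
```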

How To Cite The Code

Please cite our work if you like it or are using our code in your projects!

Kun Xu, Lingfei Wu, Zhiguo Wang, Yansong Feng, and Vadim Sheinin, "Exploiting Rich Syntactic Information for Semantic Parsing with Graph-to-Sequence Model", In 2018 Conference on Empirical Methods in Natural Language Processing.

@article{xu2018exploiting,
  title={Exploiting rich syntactic information for semantic parsing with graph-to-sequence model},
  author={Xu, Kun and Wu, Lingfei and Wang, Zhiguo and Yu, Mo and Chen, Liwei and Sheinin, Vadim},
  journal={arXiv preprint arXiv:1808.07624},
  year={2018}
}

Kun Xu, Lingfei Wu, Zhiguo Wang, Yansong Feng, Michael Witbrock, and Vadim Sheinin (first and second authors contributed equally), "Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks", arXiv preprint arXiv:1804.00823.

@article{xu2018graph2seq,
  title={Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks},
  author={Xu, Kun and Wu, Lingfei and Wang, Zhiguo and Feng, Yansong and Witbrock, Michael and Sheinin, Vadim},
  journal={arXiv preprint arXiv:1804.00823},
  year={2018}
}


Contributors: Kun Xu, Lingfei Wu
Created date: November 19, 2018
Last update: November 19, 2018
