This repository contains the official implementation of the paper "Generate-on-Graph: Treat LLM as both Agent and KG in Incomplete Knowledge Graph Question Answering".
This document describes how to construct incomplete KGs and how to run GoG.
For Freebase setup, see ./Freebase/README.md.
This step can be skipped, as all processed data are already contained in the data/ directory.
python src/generate_samples_with_crucial_edges.py
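The idea behind this step is to simulate an incomplete KG by removing "crucial" edges, i.e. triples that lie on the reasoning path to the answer. The actual logic lives in src/generate_samples_with_crucial_edges.py; the sketch below is only a minimal illustration of the technique, and the function name, triple format, and drop ratio are assumptions, not the script's real interface.

```python
import random

def make_incomplete_kg(triples, crucial_edges, drop_ratio=0.5, seed=0):
    """Illustrative sketch: drop a fraction of the crucial
    (answer-supporting) edges to simulate an incomplete KG.

    triples: list of (head, relation, tail) tuples
    crucial_edges: subset of triples on the question's reasoning path
    Returns (remaining_triples, removed_triples).
    """
    rng = random.Random(seed)
    n_drop = int(len(crucial_edges) * drop_ratio)
    dropped = set(rng.sample(crucial_edges, n_drop))
    remaining = [t for t in triples if t not in dropped]
    return remaining, sorted(dropped)

# toy KG: two crucial edges, one irrelevant edge
kg = [("A", "born_in", "B"), ("B", "capital_of", "C"), ("A", "friend_of", "D")]
crucial = [("A", "born_in", "B"), ("B", "capital_of", "C")]
remaining, removed = make_incomplete_kg(kg, crucial, drop_ratio=0.5)
```

With drop_ratio=0.5, one of the two crucial edges is removed while the irrelevant edge is always kept, so GoG must generate the missing fact rather than retrieve it.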
Download the pickle file from Google Drive and put it at Freebase/bm25.pkl.
Start the service with the following command; the default port is 18891.
python src/bm25_name2ids.py
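The service above maps surface names to Freebase entity IDs via BM25 retrieval. As a rough illustration of the underlying scoring (not the service's actual code or API), here is a self-contained Okapi BM25 ranker over a toy name-to-MID index; the example names and MIDs are assumptions for demonstration only.

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Score each tokenized document against the tokenized query
    using the standard Okapi BM25 formula."""
    n = len(docs)
    avgdl = sum(len(d) for d in docs) / n
    # document frequency of each term
    df = Counter(t for d in docs for t in set(d))
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for t in query:
            if t not in tf:
                continue
            idf = math.log((n - df[t] + 0.5) / (df[t] + 0.5) + 1)
            s += idf * tf[t] * (k1 + 1) / (
                tf[t] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores

# hypothetical name -> Freebase MID index
names = {"barack obama": "m.02mjmr", "michelle obama": "m.025s5v9"}
docs = [name.split() for name in names]
query = "barack obama".split()
scores = bm25_scores(query, docs)
best_mid = list(names.values())[scores.index(max(scores))]
```

Matching both query tokens outranks matching only one, so the query resolves to the first MID.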
python src/GoG.py --n_process=4 --dataset data/cwq/data_with_ct_1000_-1_1.json
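The --n_process flag parallelizes GoG over the questions in the dataset. A minimal sketch of that pattern is below; it is not the code in src/GoG.py (which runs the full Thinking/Searching/Generating loop per question), and it uses a thread pool where the repo may use processes.

```python
from concurrent.futures import ThreadPoolExecutor

def answer_question(sample):
    # placeholder for one GoG episode; the real implementation
    # alternates Thinking / Searching / Generating until it answers
    return {"id": sample["ID"], "prediction": "stub"}

def run(samples, n_process=4):
    # worker pool mirroring the --n_process flag; threads suit
    # I/O-bound LLM API calls in this sketch
    with ThreadPoolExecutor(max_workers=n_process) as pool:
        return list(pool.map(answer_question, samples))

preds = run([{"ID": "q1"}, {"ID": "q2"}], n_process=2)
```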
If you want to access OpenAI and Hugging Face through a proxy, uncomment the relevant lines and change the proxy address in run_llm in src/llms/interface.py and in set_environment_variable in src/utils.py.