Commit 3de385c (0 parents): 84 changed files with 6,077 additions and 0 deletions.
**`.gitignore`**
```
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

.DS_Store
._.DS_Store
docs/.DS_Store
docs/._.DS_Store

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/

# Translations
*.mo
*.pot

# Django stuff:
*.log
.static_storage/
.media/
local_settings.py

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# pyenv
.python-version

# celery beat schedule file
celerybeat-schedule

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/

# pycharm
.idea/

# chan-hdt
model
.pytest_cache
output.txt
unittest_save_and_restore_models
```
**`README.md`**
# Knowledge Graph Embedding Models

## About

Explainable Link Prediction (`ampligraph`) is a machine learning library for Relational Learning, a branch of machine learning that deals with supervised learning on knowledge graphs.

The library includes Relational Learning models, i.e. supervised learning models designed to predict links in knowledge graphs.

The tool also includes the required evaluation protocol, metrics, knowledge graph preprocessing, and negative statement generation strategies.
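Models of this kind typically learn vector embeddings for entities and relations, then score candidate links by a distance in embedding space. As a toy illustration of the idea (this is not `ampligraph`'s actual API), a minimal TransE-style scoring function can be sketched as:

```python
import math

# Illustrative only: a minimal TransE-style scoring function.
# TransE scores a triple (subject, predicate, object) by how closely
# e_subject + e_predicate lands near e_object in embedding space.

def transe_score(e_s, e_p, e_o):
    """Negative L2 distance; a higher score means a more plausible link."""
    return -math.sqrt(sum((s + p - o) ** 2 for s, p, o in zip(e_s, e_p, e_o)))

# Toy 3-dimensional embeddings (hypothetical values for illustration)
emb = {
    'alice': [0.0, 0.0, 1.0],
    'bob':   [1.0, 1.0, 1.0],
    'knows': [1.0, 1.0, 0.0],
}

# The statement 'alice knows bob' scores higher than the reverse direction
plausible = transe_score(emb['alice'], emb['knows'], emb['bob'])
implausible = transe_score(emb['bob'], emb['knows'], emb['alice'])
print(plausible, implausible)
```

A trained model learns embeddings so that observed triples score high and corrupted (negative) triples score low, which is where the negative statement generation strategies mentioned above come in.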
# Installation

## Provision a Virtual Environment

**Installation using Anaconda is highly recommended.**

Create and activate a virtual environment (conda):

```
conda create --name ampligraph python=3.6
source activate ampligraph
```

## Install TensorFlow

**CPU version**

```
pip install tensorflow
```

or you can install the version packaged with conda:

```
conda install tensorflow
```

**GPU version**

```
pip install tensorflow-gpu
```

or you can install the version packaged with conda:

```
conda install tensorflow-gpu
```
## Install the library

You can install the latest stable release of `ampligraph` with pip, using the latest wheel (0.3.0) published by Dublin Labs.
*Note: this works only from within the Dublin Labs network.*

```
pip install http://dubaldeweb001.techlabs.accenture.com/wheels/ampligraph/ampligraph-0.3.dev0-py3-none-any.whl
```

If instead you want the most recent development version, you can clone the repository and install from source (this will pull the latest commit on the `develop` branch). The code snippet below installs the library in editable mode (`-e`):

```
git clone ssh://git@innersource.accenture.com/dl/ampligraph.git
cd ampligraph
pip install -e .
```
## Download the Datasets

Datasets can be downloaded from [SharePoint](https://ts.accenture.com/sites/TechLabs-Dublin/_layouts/15/guestaccess.aspx?guestaccesstoken=Uz28P2m4hWp2TEgbvFrD%2b4BiURBHVTAw0NbPBRLzWWA%3d&folderid=2_012fd581718e74e4a9305c845a1224ee1&rev=1). Once downloaded, decompress the archives.

**You must also set the following environment variable:**

```
export AMPLIGRAPH_DATA_HOME=/YOUR/PATH/TO/datasets
```
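The dataset loaders presumably resolve files relative to this variable. A minimal sketch of that pattern, using a hypothetical `dataset_path` helper that is not part of the library:

```python
import os

# Hypothetical sketch of how a loader might resolve dataset paths against
# AMPLIGRAPH_DATA_HOME; this is an assumption, not ampligraph's actual code.
def dataset_path(name):
    data_home = os.environ.get('AMPLIGRAPH_DATA_HOME')
    if data_home is None:
        raise EnvironmentError('Set AMPLIGRAPH_DATA_HOME to your datasets folder.')
    return os.path.join(data_home, name)

# With the variable set as in the README, dataset names resolve to full paths
os.environ['AMPLIGRAPH_DATA_HOME'] = '/YOUR/PATH/TO/datasets'
print(dataset_path('fb15k'))
```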
## Sanity Check

```python
>>> import ampligraph
>>> ampligraph.__version__
'0.3-dev'
```
## Installing with HDT Support

[HDT](http://www.rdfhdt.org/) is a compressed serialization format for RDF graph data. By default, the installed ampligraph library does not support loading this data type. To enable it, you must have **`gcc` with C++11 support** installed on your Linux machine.

**Ubuntu**

```
sudo add-apt-repository ppa:jonathonf/gcc-7.3
sudo apt-get update
sudo apt-get install gcc-7
```

**CentOS**

Below are the commands we used to install gcc 7.3.1 on CentOS 7.5:

```
sudo yum install centos-release-scl
sudo yum install devtoolset-7-gcc*
scl enable devtoolset-7 bash
```

Once gcc is installed, you can install the ampligraph library with HDT support:

```
pip install .[hdt]
```
## Documentation

**[Latest documentation available here](http://10.106.43.211/docs/ampligraph/dev/index.html)**

The project documentation can be built with Sphinx:

```
cd docs
make clean autogen html
```

## Tests

```
pytest -s tests
```
**`ampligraph/__init__.py`**
"""Explainable Link Prediction is a library for relational learning on knowledge graphs.""" | ||
|
||
__version__ = '0.3-dev' | ||
|
||
|
||
__all__ = ['datasets', 'latent_features', 'evaluation'] |
**`ampligraph/datasets/__init__.py`**
"""Helper functions to load knowledge graphs from disk.""" | ||
|
||
from .datasets import load_from_csv, load_from_rdf, load_fb15k, load_wn18, load_fb15k_237, load_from_ntriples\ | ||
, AMPLIGRAPH_DATA_HOME, load_from_hdt, load_ICEWS, \ | ||
load_wn11, load_fb13 | ||
|
||
__all__ = ['load_from_csv', 'load_from_rdf', 'load_from_ntriples', 'load_wn11', 'load_wn18', 'load_fb15k', | ||
'load_fb13', 'load_fb15k_237', 'load_from_hdt', | ||
'load_wn11', 'load_fb13', 'AMPLIGRAPH_DATA_HOME'] | ||
|
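The loaders above presumably return knowledge graphs as collections of subject-predicate-object triples. A self-contained sketch of that idea, assuming tab-separated input (an illustration of the pattern, not `load_from_csv`'s actual implementation):

```python
import csv
import io

# Sketch: load a knowledge graph as (subject, predicate, object) triples
# from tab-separated text. ampligraph's load_from_csv likely does something
# similar against a file path, but that is an assumption, not its code.
def load_triples(file_obj, sep='\t'):
    reader = csv.reader(file_obj, delimiter=sep)
    return [tuple(row) for row in reader if row]

# Two toy statements in tab-separated form
sample = io.StringIO("alice\tknows\tbob\nbob\tworksAt\tacme\n")
triples = load_triples(sample)
print(triples)  # [('alice', 'knows', 'bob'), ('bob', 'worksAt', 'acme')]
```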