README

* First of all, be aware that this code is _highly experimental_ and
  requires many steps to launch. Unfortunately I have very little
  time (or rather, no time) to improve the code with proper
  documentation and/or examples. This means that your best reference
  is the code itself.

* All this code is released under the Apache License v2.0. Please
  respect its terms if you use it for your own purposes.

* To build QueryPIE, you need Ant. The easiest way, which packs
  (almost) everything into a single jar, is to run "ant build-jar".
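  For example, from the repository root:

     ant build-jar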

* (Very minimal) guide to start the program:

- 1) You need to compress the triples using the webpie compression
  code. The code is available at http://github.com/jrbn/webpie (class
  jobs.FilesImportTriples; see bit.ly/cFGtRQ for more info).
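  As a rough sketch only (the jar name and the input/output paths
  below are placeholders, not documented webpie arguments), the job
  would be launched through Hadoop along these lines:

     hadoop jar webpie.jar jobs.FilesImportTriples /input/triples /output/compressed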

- 2) Create the six indices with the webpie code. (See link above for
  more info. The class is jobs.CreateIndex).
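  Again only as a sketch, with the jar name and paths as placeholders:

     hadoop jar webpie.jar jobs.CreateIndex /output/compressed /output/indices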

- 3) You must convert the Hadoop files to normal files. The script
  scripts/convertHDFSFiles.sh should do the job.
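  For example (the argument layout is an assumption; check the script
  itself for the exact parameters it expects):

     cd scripts
     ./convertHDFSFiles.sh /output/indices /local/indices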

- 4) Launch an ibis server:
     cd scripts
     ./ibis &

- 5) Launch QueryPIE with the class nl.vu.cs.querypie.QueryPIE.
  Please look at the code to understand which parameters are used
  for what. (Notice that the program requires a running ibis server;
  see bit.ly/111FvV5 for more info. The ibis server has been started
  in step 4.)
  A script "run-querypie" is provided in the scripts directory.
  It takes a single argument: the directory containing the 6 indices.
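  For example, assuming the indices were converted into the local
  directory /local/indices:

     cd scripts
     ./run-querypie /local/indices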

- 6) To compute the closure, use the script "compute-closure" in the
  "scripts" folder to launch the appropriate program.

- 7) If you want to query the knowledge base, then you should use the
  program nl.vu.cs.querypie.Query. This program requires that you pass
  a query with numeric IDs instead of URIs. If you need to retrieve
  these numbers from the original dataset, there is a Hadoop program
  that does this job: jobs.ExtractNumberDictionary (in the webpie
  codebase). Alternatively, you can pass a file containing this
  mapping to QueryPIE (see scripts/run-querypie for how to do that).
  If you do that, you can use URIs in the queries.
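  As an illustrative sketch only (the jar names, arguments, and the
  numeric IDs below are placeholders, not documented behaviour):

     # extract the ID->URI dictionary with the webpie code
     hadoop jar webpie.jar jobs.ExtractNumberDictionary /output/compressed /output/dictionary

     # query with numeric term IDs (10, 4 and 25 are made-up IDs)
     java -cp querypie.jar nl.vu.cs.querypie.Query 10 4 25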