The software used to extract structured data from Wikipedia
A repo that contains outgoing links from DBpedia
DBpedia support for multimedia data sources other than Wikipedia. GSoC 2014 project.
Various tools for the DBpedia project - This does NOT contain the DBpedia extraction framework
Creates a Docker image with Virtuoso preloaded with the latest DBpedia dataset
Core code to generate a single JSON dump for a DBpedia release
The DBpedia DataID Unit is a DBpedia group whose goal is to describe LOD datasets via RDF files, to host and deliver these metadata files together with the datasets in a uniform way, and to create, validate, and deploy such files for DBpedia and its local chapters.
a simple interface that displays the DBpedia community
The DBpedia DataID vocabulary is a metadata system for detailed descriptions of datasets and their physical instances, as well as their relations to agents such as persons or organizations, with regard to their rights and responsibilities.
DBpedia History Extractor
A lightweight tool to fuse complementary facts to DBpedia identifiers
Outputs a list of ranked DBpedia resources for a search string.
Extract Data from Wikipedia Lists
A repository for the "Combining DBpedia and Topic Modeling" GSoC 2016 idea
Repository for the DBpedia GSoC Hybrid Classifier/Rule-based Event Extractor Project
Extract Data from Wikipedia Tables
Tools & scripts to infer new Wikipedia infobox to ontology mappings
The service supporting DataID and its website
This repository provides the necessary mappings between CMDI profiles (CLARIN) and DataID, as well as queries (e.g. XSPARQL) to convert between them directly.
DBpedia Distributed Extraction Framework: Extract structured data from Wikipedia in a parallel, distributed manner
moved to https://github.com/dbpedia/links
Fact Extraction from Wikipedia Text
The DBpedia AI project
DBpedia Association membership metadata
Material about DBpedia
Keeps a mirror of DBpedia Live in sync
A WebProtégé deployment for editing the DBpedia ontology
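As an illustration of the kind of ranked-resource lookup described above (a search string in, a ranked list of DBpedia resources out), here is a minimal sketch. All names and the scoring logic are hypothetical assumptions for illustration, not the actual algorithm or API of any of the repositories listed:

```python
# Hypothetical sketch: rank candidate DBpedia resources against a search
# string by simple token overlap with their labels. This is NOT the
# repository's actual ranking method -- it only illustrates the
# input/output shape (query string in, ranked resource URIs out).

def rank_resources(query: str, candidates: dict[str, str]) -> list[str]:
    """Return candidate DBpedia URIs sorted by descending token overlap
    between the search string and each resource's label."""
    query_tokens = set(query.lower().split())

    def score(label: str) -> int:
        # Count how many query tokens appear in the candidate's label.
        return len(query_tokens & set(label.lower().split()))

    return sorted(candidates, key=lambda uri: score(candidates[uri]), reverse=True)

# Toy candidate set mapping resource URIs to labels (hypothetical data).
candidates = {
    "http://dbpedia.org/resource/Berlin": "Berlin",
    "http://dbpedia.org/resource/Berlin_Wall": "Berlin Wall",
    "http://dbpedia.org/resource/Paris": "Paris",
}
ranked = rank_resources("berlin wall", candidates)
```

In a real lookup service the candidates would come from an index over DBpedia labels rather than a hard-coded dictionary, and the scoring would typically weight exact matches and resource popularity; the overlap score here is only a stand-in.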