# Hadoop WebGraph InputFormat

This input format enables the use of graphs in the WebGraph format (BVGraph) on clusters running Hadoop or Spark. Graphs must be provided as required by WebGraph, in the form of three files stored on HDFS: `basename.graph`, `basename.offsets`, and `basename.properties`. A collection of such graphs is available for download at http://law.di.unimi.it/datasets.php.

The input format loads the nodes of a graph in parallel, distributed according to the number of splits specified (default: 100). Each node is loaded by its ID (key) and an array of successor IDs (value), i.e., its neighbors connected by outgoing edges.

Loading a WebGraph with Spark using this input format works as follows:

```scala
import de.l3s.mapreduce.webgraph.io._

WebGraphInputFormat.setBasename(sc.hadoopConfiguration, "/hdfs/path/to/webgraph/basename")
WebGraphInputFormat.setNumberOfSplits(sc.hadoopConfiguration, 100)

val rdd = sc.newAPIHadoopRDD(sc.hadoopConfiguration, classOf[WebGraphInputFormat], classOf[IntWritable], classOf[IntArrayWritable])
```
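To sanity-check the load, a few nodes can be inspected directly (a minimal sketch; the node IDs printed depend on the graph you loaded):

```scala
// Fetch the first three nodes and print each node ID together with
// its successor IDs (neighbors via outgoing edges).
rdd.take(3).foreach { case (id, out) =>
  println(s"node ${id.get} -> ${out.values.mkString(", ")}")
}
```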

To transform this into an RDD of tuples of the form (ID, successor IDs), run:

```scala
val adjacencyList = rdd.map{case (id, out) => (id.get, out.values)}
```

The following code counts the number of edges in the graph:

```scala
rdd.map{case (id, out) => out.values.size}.reduce(_ + _)
```
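Following the same pattern, the out-degree distribution of the graph can be computed with standard RDD operations (a sketch, not part of the input format itself):

```scala
// Map each node to its out-degree, then count how many nodes
// share each degree value. Returns a Map[Int, Long].
val degreeDistribution = rdd.map { case (_, out) => out.values.size }.countByValue()
```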

## GraphX

Once loaded in Spark, a graph can also be used with GraphX, Spark's graph framework:

```scala
import org.apache.spark.graphx._

val edges = rdd.flatMap{case (id, out) => out.values.map(outId => (id.get.toLong, outId.toLong))}
val graph = Graph.fromEdgeTuples(edges, true)
```

Nodes and edges can now be counted as follows:

```scala
graph.numVertices
graph.numEdges
```
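With the graph in GraphX, its built-in algorithms apply directly. For example, PageRank (a sketch; the convergence tolerance 0.0001 is an arbitrary choice):

```scala
// Run PageRank until scores change by less than the given tolerance,
// then print the ten highest-ranked node IDs with their scores.
val ranks = graph.pageRank(0.0001).vertices
ranks.sortBy(-_._2).take(10).foreach { case (id, rank) =>
  println(s"$id: $rank")
}
```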

## Build

To build a JAR file that can be added to your classpath, simply use Maven:

```sh
git clone https://github.com/helgeho/HadoopWebGraph.git
cd HadoopWebGraph
mvn package
```

## License

This project is published under GPL 3.0, the license used by WebGraph, parts of whose code are reused here.
