
Commit

Code clean up.
danielkorzekwa committed Oct 30, 2015
1 parent d61c59f commit 02097f3
Showing 68 changed files with 265 additions and 207 deletions.
10 changes: 8 additions & 2 deletions build.sbt
@@ -16,10 +16,16 @@ lazy val root = (project in file(".")).
libraryDependencies ++= Seq(
"org.apache.commons" % "commons-math3" % "3.3",
"com.typesafe.scala-logging" %% "scala-logging-slf4j" % "2.1.2",
"org.scalanlp" %% "breeze" % "0.11.2",
"org.scalanlp" %% "breeze-natives" % "0.11.2",
"org.scalanlp" %% "breeze" % "0.12-SNAPSHOT",
// test scoped
"org.slf4j" % "slf4j-log4j12" % "1.7.2" % Test,
"com.novocode" % "junit-interface" % "0.11" % Test
),

resolvers ++= Seq(
// other resolvers here
// if you want to use snapshot builds (currently 0.12-SNAPSHOT), use this.
"Sonatype Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/",
"Sonatype Releases" at "https://oss.sonatype.org/content/repositories/releases/"
)
)
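A note on the dependency change above: a -SNAPSHOT artifact only resolves if a snapshots repository is on the resolver list, which is why the Sonatype resolvers are added together with the breeze 0.12-SNAPSHOT upgrade. A minimal, self-contained build.sbt sketch of the same pairing (the project name and Scala version are assumptions, not taken from this commit):

```scala
// Minimal sketch: pairing a SNAPSHOT dependency with the snapshots resolver.
// "bayes-scala-sketch" and the Scala version are illustrative assumptions.
lazy val root = (project in file(".")).
  settings(
    name := "bayes-scala-sketch",
    scalaVersion := "2.11.7",
    libraryDependencies += "org.scalanlp" %% "breeze" % "0.12-SNAPSHOT",
    resolvers += "Sonatype Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/"
  )
```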
29 changes: 18 additions & 11 deletions doc/lowlevel/README.md
@@ -32,18 +32,25 @@ None
* Continuous Bayesian Networks
* [Linear Gaussian Models - 1D localisation, 4 approaches: Canonical Gaussian, Bayes's theorem for Gaussian Variables, Expectation Propagation and Kalman Filter](https://github.com/danielkorzekwa/bayes-scala/blob/master/doc/localisation_example/localisation_example.md)
* Hybrid Bayesian Networks
* [Gaussian approximation with moment matching, aka proj() operator in Expectation Propagation](https://github.com/danielkorzekwa/bayes-scala/blob/master/doc/moment_matching/moment_matching.md)
* [Expectation Propagation for the Clutter Problem](https://github.com/danielkorzekwa/bayes-scala/blob/master/doc/clutter_problem_ep/clutter_problem_ep.md)
* [TrueSkill - Updating player skills in tennis with Expectation Propagation inference algorithm](https://github.com/danielkorzekwa/bayes-scala/blob/master/doc/trueskill_in_tennis/trueskill_in_tennis.md)
* [TrueSkill on a factor graph in Tennis] (https://github.com/danielkorzekwa/bayes-scala/blob/master/doc/trueskill_in_tennis_factor_graph/trueskill_in_tennis_factor_graph.md)
* [TrueSkill on a factor graph in Tennis (Dynamic Bayesian Network for 3 players with 6 games over 3 time slices)] (https://github.com/danielkorzekwa/bayes-scala/blob/master/doc/trueskill_in_tennis_factor_graph_dbn/trueskill_in_tennis_factor_graph_dbn.md)
* EP
* [Gaussian approximation with moment matching, aka proj() operator in Expectation Propagation](https://github.com/danielkorzekwa/bayes-scala/blob/master/doc/moment_matching/moment_matching.md)
* [Expectation Propagation for the Clutter Problem](https://github.com/danielkorzekwa/bayes-scala/blob/master/doc/clutter_problem_ep/clutter_problem_ep.md)
* TrueSkill
* [TrueSkill - Updating player skills in tennis with Expectation Propagation inference algorithm](https://github.com/danielkorzekwa/bayes-scala/blob/master/doc/trueskill_in_tennis/trueskill_in_tennis.md)
* [TrueSkill on a factor graph in Tennis] (https://github.com/danielkorzekwa/bayes-scala/blob/master/doc/trueskill_in_tennis_factor_graph/trueskill_in_tennis_factor_graph.md)
* [TrueSkill on a factor graph in Tennis (Dynamic Bayesian Network for 3 players with 6 games over 3 time slices)] (https://github.com/danielkorzekwa/bayes-scala/blob/master/doc/trueskill_in_tennis_factor_graph_dbn/trueskill_in_tennis_factor_graph_dbn.md)
* Code examples only
* [Linear Dynamical Systems M-step (prior mean, emission variance, transition variance)](https://github.com/danielkorzekwa/bayes-scala/blob/master/src/test/scala/dk/bayes/learn/lds/GenericLDSMStepTest.scala)
* [Linear Dynamical Systems EM (learning prior mean and emission variance only from multiple data sequences)](https://github.com/danielkorzekwa/bayes-scala/blob/master/src/test/scala/dk/bayes/learn/lds/GenericLDSEMTest.scala)
* [Inference in Gaussian Process regression](https://github.com/danielkorzekwa/bayes-scala/blob/master/src/test/scala/dk/bayes/infer/gp/gpr/GenericGPRegressionTest.scala)
* [Inference in Gaussian Process regression with inducing variables - link to bayes-scala-gp project](https://github.com/danielkorzekwa/bayes-scala-gp/blob/master/src/test/scala/dk/gp/sgpr/sgprPredictTest.scala)
* [Parameters learning in Gaussian Process regression with inducing variables (variational lower bound on marginal likelihood) - link to bayes-scala-gp project] (https://github.com/danielkorzekwa/bayes-scala-gp/blob/master/src/test/scala/dk/gp/sgpr/sgprTrainTest.scala)
* [Parameters learning in Gaussian Process regression (marginal likelihood maximisation)](https://github.com/danielkorzekwa/bayes-scala/blob/master/src/test/scala/dk/bayes/infer/gp/gpr/GpmlRegressionLearnTest.scala)
* Linear Dynamical Systems
* [Linear Dynamical Systems M-step (prior mean, emission variance, transition variance)](https://github.com/danielkorzekwa/bayes-scala/blob/master/src/test/scala/dk/bayes/learn/lds/GenericLDSMStepTest.scala)
* [Linear Dynamical Systems EM (learning prior mean and emission variance only from multiple data sequences)](https://github.com/danielkorzekwa/bayes-scala/blob/master/src/test/scala/dk/bayes/learn/lds/GenericLDSEMTest.scala)
* Gaussian Process Regression
* [Inference in Gaussian Process regression](https://github.com/danielkorzekwa/bayes-scala/blob/master/src/test/scala/dk/bayes/infer/gp/gpr/GenericGPRegressionTest.scala)
* [Parameters learning in Gaussian Process regression (marginal likelihood maximisation)](https://github.com/danielkorzekwa/bayes-scala/blob/master/src/test/scala/dk/bayes/infer/gp/gpr/GpmlRegressionLearnTest.scala)
* Sparse Gaussian Process Regression
* [Inference in Gaussian Process regression with inducing variables - link to bayes-scala-gp project](https://github.com/danielkorzekwa/bayes-scala-gp/blob/master/src/test/scala/dk/gp/sgpr/sgprPredictTest.scala)
* [Parameters learning in Gaussian Process regression with inducing variables (variational lower bound on marginal likelihood) - link to bayes-scala-gp project] (https://github.com/danielkorzekwa/bayes-scala-gp/blob/master/src/test/scala/dk/gp/sgpr/sgprTrainTest.scala)
*

* [Parameters learning in Gaussian Process regression (variational lower bound on marginal likelihood)](https://github.com/danielkorzekwa/bayes-scala/blob/master/src/test/scala/dk/bayes/infer/gp/infercovparamsem/inferCovParamsEmTest.scala)
* Others
2 changes: 1 addition & 1 deletion project/build.properties
@@ -1 +1 @@
sbt.version=0.13.5
sbt.version=0.13.7
6 changes: 3 additions & 3 deletions src/main/scala/dk/bayes/dsl/Variable.scala
@@ -22,11 +22,11 @@ trait Variable {

init()

private def init() {
private def init() = {
getParents().foreach(p => p.addChild(this))
}

def addChild(v: Variable) {
def addChild(v: Variable) = {
children += v
}
def getChildren(): Seq[Variable] = children.toList
@@ -40,7 +40,7 @@ trait Variable {
def getAllVariables(): Seq[Variable] = {
val variables = new HashSet[Variable]()

def addVariable(v: Variable) {
def addVariable(v: Variable):Unit = {

if (!variables.contains(v)) {
variables += v
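Most hunks in this commit follow the pattern visible in Variable.scala above: Scala procedure syntax (`def f() { ... }`, or an abstract `def f()` with no result type) is replaced with an explicit `=` and, where the result is Unit, an explicit `: Unit` annotation. Procedure syntax is discouraged and was eventually dropped from the language, and spelling out the result type also avoids accidentally inferring Unit for a method meant to return a value. A stand-alone before/after sketch (hypothetical class, not repository code):

```scala
// Hypothetical illustration of the clean-up pattern applied throughout this commit.
class Counter {
  private var count = 0

  // Before: procedure syntax, result type Unit is implicit
  // def increment() { count += 1 }

  // After: explicit '=' and explicit result type
  def increment(): Unit = { count += 1 }

  def value: Int = count
}

trait Resettable {
  // Before: abstract procedure, Unit implied
  // def reset()

  // After: result type stated explicitly
  def reset(): Unit
}
```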
@@ -22,7 +22,6 @@ object inferMultivariateGaussianSimplest extends InferEngine[MultivariateGaussia

(child.getParents().size == 1 && child.getParents()(0).eq(x)) &&
!child.hasChildren &&
child.b.size == x.m.size &&
child.yValue.isDefined

}
4 changes: 2 additions & 2 deletions src/main/scala/dk/bayes/infer/ClusterGraphInfer.scala
@@ -18,7 +18,7 @@ trait ClusterGraphInfer {
* @param messageOrder Order of clusters in which messages are sent for a single iteration of Belief Propagation
*
*/
def calibrate(iterNum: (Int) => Unit, messageOrder: MessageOrder)
def calibrate(iterNum: (Int) => Unit, messageOrder: MessageOrder):Unit

/**
* Applies evidence and calibrates cluster graph.
@@ -55,5 +55,5 @@ trait ClusterGraphInfer {
*
* @param evidence Tuple2[variableId, variable value]
*/
def setEvidence(evidence: Tuple2[Int, Int])
def setEvidence(evidence: Tuple2[Int, Int]):Unit
}
6 changes: 3 additions & 3 deletions src/main/scala/dk/bayes/infer/LoopyBP.scala
@@ -21,7 +21,7 @@ import LoopyBP._
*/
case class LoopyBP(clusterGraph: ClusterGraph, threshold: Double = 0.00001) extends ClusterGraphInfer {

def calibrate(iterNum: (Int) => Unit = (iterNum: Int) => {}, messageOrder: MessageOrder = ForwardBackwardMsgOrder()) {
def calibrate(iterNum: (Int) => Unit = (iterNum: Int) => {}, messageOrder: MessageOrder = ForwardBackwardMsgOrder()):Unit = {

@tailrec
def calibrateUntilConverge(currentIter: Int): ClusterGraph = {
@@ -58,7 +58,7 @@ case class LoopyBP(clusterGraph: ClusterGraph, threshold: Double = 0.00001) exte
evidenceLogLikelihood
}

private def calibrateCluster(cluster: Cluster) {
private def calibrateCluster(cluster: Cluster):Unit = {

cluster.getEdges().foreach { edge =>

@@ -137,7 +137,7 @@ case class LoopyBP(clusterGraph: ClusterGraph, threshold: Double = 0.00001) exte
marginalFactor
}

def setEvidence(evidence: Tuple2[Int, Int]) {
def setEvidence(evidence: Tuple2[Int, Int]):Unit = {

for (cluster <- clusterGraph.getClusters()) {

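The calibrate change above keeps its existing shape: a @tailrec inner function repeats belief propagation until the beliefs move by less than the threshold, now with an explicit Unit result type on the outer method. A generic sketch of that fixed-point loop, with illustrative names and a type parameter standing in for the cluster graph state:

```scala
import scala.annotation.tailrec

// Generic sketch of the calibration loop; 'step' sends one round of messages,
// 'distance' measures how much the state moved. Names and types are illustrative.
def calibrateUntilConverged[S](initial: S,
                               step: S => S,
                               distance: (S, S) => Double,
                               threshold: Double = 1e-5,
                               maxIter: Int = 100): S = {
  @tailrec
  def loop(current: S, iter: Int): S = {
    val next = step(current)
    if (distance(current, next) < threshold || iter >= maxIter) next
    else loop(next, iter + 1)
  }
  loop(initial, 1)
}
```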
2 changes: 1 addition & 1 deletion src/main/scala/dk/bayes/infer/ep/EP.scala
@@ -20,7 +20,7 @@ trait EP {
* @param varId Variable id
* @param varValue Variable value
*/
def setEvidence(varId: Int, varValue: AnyVal)
def setEvidence(varId: Int, varValue: AnyVal):Unit

/**
* Returns marginal factor for a given variable(s) in a factor graph.
@@ -66,7 +66,7 @@ case class ForwardBackwardEPCalibrate(factorGraph: FactorGraph, threshold: Doubl
* Executes a single message passing routine on a factor graph.
*
*/
private def calibrateIteration(nodes: Seq[Node], newMsgIndex: () => Long) {
private def calibrateIteration(nodes: Seq[Node], newMsgIndex: () => Long):Unit = {

val nodesNum = nodes.size

@@ -80,7 +80,7 @@
}
}

private def sendFactorMessage(factorNode: FactorNode, newMsgIndex: () => Long) {
private def sendFactorMessage(factorNode: FactorNode, newMsgIndex: () => Long):Unit = {

factorNode match {
case factorNode: SingleFactorNode => sendFactorMessage(factorNode, newMsgIndex)
@@ -91,7 +91,7 @@ case class ForwardBackwardEPCalibrate(factorGraph: FactorGraph, threshold: Doubl

}

private def sendFactorMessage(factorNode: SingleFactorNode, newMsgIndex: () => Long) {
private def sendFactorMessage(factorNode: SingleFactorNode, newMsgIndex: () => Long):Unit = {
val newMessage = factorNode.getFactor().asInstanceOf[SingleFactor]

factorNode.gate.setMessage(newMessage, newMsgIndex())
@@ -100,7 +100,7 @@ case class ForwardBackwardEPCalibrate(factorGraph: FactorGraph, threshold: Doubl

}

private def sendFactorMessage(factorNode: DoubleFactorNode, newMsgIndex: () => Long) {
private def sendFactorMessage(factorNode: DoubleFactorNode, newMsgIndex: () => Long):Unit = {

val gate1MsgIn = factorNode.gate1.getEndGate.getMessage
val gate2MsgIn = factorNode.gate2.getEndGate.getMessage
@@ -116,7 +116,7 @@ case class ForwardBackwardEPCalibrate(factorGraph: FactorGraph, threshold: Doubl

}

private def sendFactorMessage(factorNode: TripleFactorNode, newMsgIndex: () => Long) {
private def sendFactorMessage(factorNode: TripleFactorNode, newMsgIndex: () => Long):Unit = {

val gate1MsgIn = factorNode.gate1.getEndGate.getMessage
val gate2MsgIn = factorNode.gate2.getEndGate.getMessage
@@ -136,7 +136,7 @@ case class ForwardBackwardEPCalibrate(factorGraph: FactorGraph, threshold: Doubl

}

private def sendFactorMessage(factorNode: GenericFactorNode, newMsgIndex: () => Long) {
private def sendFactorMessage(factorNode: GenericFactorNode, newMsgIndex: () => Long):Unit = {
val msgsIn = factorNode.gates.map(g => g.getEndGate.getMessage())
val factor = factorNode.getFactor().asInstanceOf[GenericFactor]

@@ -152,7 +152,7 @@ case class ForwardBackwardEPCalibrate(factorGraph: FactorGraph, threshold: Doubl
}

/**Returns the number of messages sent.*/
private def sendVariableMessage(varNode: VarNode, newMsgIndex: () => Long) {
private def sendVariableMessage(varNode: VarNode, newMsgIndex: () => Long):Unit = {

var marginalFactor = varNode.getGates()(0).getEndGate.getMessage()
var i = 1
@@ -28,10 +28,10 @@ case class EPNaiveBayesFactorGraph[X](prior: SingleFactor[X], likelihoods: Seq[D

def getPosterior(): X = posterior

def calibrate(maxIter: Int = 100, threshold: Double = 1e-6) {
def calibrate(maxIter: Int = 100, threshold: Double = 1e-6):Unit = {

@tailrec
def calibrateIter(currPosterior: X, iterNum: Int) {
def calibrateIter(currPosterior: X, iterNum: Int):Unit = {
if (iterNum >= maxIter) {
logger.warn(s"Factor graph did not converge in less than ${maxIter} iterations. Prior=%s, Posterior=%s".format(prior, posterior))
return
@@ -45,7 +45,7 @@ case class EPNaiveBayesFactorGraph[X](prior: SingleFactor[X], likelihoods: Seq[D
calibrateIter(posterior, 1)
}

private def sendMsgsParallel() {
private def sendMsgsParallel():Unit = {

msgsUp = msgsUp.zip(likelihoods).map {
case (currMsgUp, llh) =>
@@ -61,7 +61,7 @@ case class EPNaiveBayesFactorGraph[X](prior: SingleFactor[X], likelihoods: Seq[D
posterior = multOp(prior.factorMsgDown, multOp(msgsUp: _*))
}

private def sendMsgsSerial() {
private def sendMsgsSerial():Unit = {

msgsUp = msgsUp.zip(likelihoods).map {
case (currMsgUp, llh) =>
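Both sendMsgsParallel and sendMsgsSerial finish by combining the prior's downward message with all upward messages through multOp. For one-dimensional Gaussians that product has a simple closed form (precisions add); a self-contained sketch using a toy Gaussian type rather than the library's SingleFactor/multOp machinery:

```scala
// Illustrative only: a toy Gaussian stands in for the library's factor classes.
final case class Gaussian1D(mean: Double, variance: Double)

// Product of two Gaussian densities (up to normalisation): precisions add.
def multiply(a: Gaussian1D, b: Gaussian1D): Gaussian1D = {
  val precision = 1 / a.variance + 1 / b.variance
  val mean = (a.mean / a.variance + b.mean / b.variance) / precision
  Gaussian1D(mean, 1 / precision)
}

// Posterior = prior * product of all upward messages, mirroring the multOp line above.
def posterior(prior: Gaussian1D, msgsUp: Seq[Gaussian1D]): Gaussian1D =
  msgsUp.foldLeft(prior)(multiply)
```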
2 changes: 1 addition & 1 deletion src/main/scala/dk/bayes/learn/em/EMLearn.scala
@@ -31,7 +31,7 @@ trait EMLearn {
*
* @param progress Progress monitoring. It is called by this method at the end of every iteration
*/
def learn(clusterGraph: ClusterGraph, trainSet: DataSet, maxIterNum: Int, progress: (Progress) => Unit)
def learn(clusterGraph: ClusterGraph, trainSet: DataSet, maxIterNum: Int, progress: (Progress) => Unit):Unit
}

object EMLearn {
2 changes: 1 addition & 1 deletion src/main/scala/dk/bayes/learn/em/GenericEMLearn.scala
@@ -50,7 +50,7 @@ object GenericEMLearn extends EMLearn {

}

private def updateInitialClusterPotentials(clusterGraph: ClusterGraph, clusterPotentialsByTypeId: Map[Int, Factor]) {
private def updateInitialClusterPotentials(clusterGraph: ClusterGraph, clusterPotentialsByTypeId: Map[Int, Factor]):Unit = {
for (cluster <- clusterGraph.getClusters()) {
val clusterTypePotentials = clusterPotentialsByTypeId(cluster.typeId)
val newClusterPotentials = cluster.getFactor().copy(clusterTypePotentials.getValues())
8 changes: 4 additions & 4 deletions src/main/scala/dk/bayes/model/clustergraph/Cluster.scala
@@ -1,6 +1,6 @@
package dk.bayes.model.clustergraph

import factor._
import factor.Factor

/**
* Represents cluster in a cluster graph.
@@ -20,19 +20,19 @@ class Cluster(val id: Int, val typeId: Int, factor: Factor) {
private var _factor: Factor = factor
private var edges: List[Edge] = List()

def addEdge(edge: Edge) { edges = edge :: edges }
def addEdge(edge: Edge):Unit = { edges = edge :: edges }

def getEdges(): Seq[Edge] = edges

def getFactor(): Factor = _factor

def updateFactor(newFactor: Factor) {
def updateFactor(newFactor: Factor):Unit = {
_factor = newFactor

resetMessages()
}

def resetMessages() {
def resetMessages():Unit = {
edges.foreach { edge => edge.resetMessage() }
}
}
6 changes: 3 additions & 3 deletions src/main/scala/dk/bayes/model/clustergraph/ClusterGraph.scala
@@ -22,20 +22,20 @@ trait ClusterGraph {
*
* @param factor Initial cluster potentials
*/
def addCluster(clusterId: Int, factor: Factor, clusterTypeId: Option[Int]=None)
def addCluster(clusterId: Int, factor: Factor, clusterTypeId: Option[Int]=None):Unit

/**
* Adds edge between clusters in this cluster graph.
*/
def addEdge(clusterId1: Int, clusterId2: Int)
def addEdge(clusterId1: Int, clusterId2: Int):Unit

/**
* Adds edges between clusters in this cluster graph.
*
* @param firstEdge Tuple2[clusterId, clusterId2]
* @param nextEdges Tuple2[clusterId, clusterId2]
*/
def addEdges(firstEdge: Tuple2[Int, Int], nextEdges: Tuple2[Int, Int]*)
def addEdges(firstEdge: Tuple2[Int, Int], nextEdges: Tuple2[Int, Int]*):Unit

/**
* Returns all clusters in this cluster graph.
6 changes: 3 additions & 3 deletions src/main/scala/dk/bayes/model/clustergraph/Edge.scala
@@ -17,17 +17,17 @@ class Edge(val destClusterId: Int, val sepsetVariable: Var) {
private var newMessage: SingleFactor = SingleFactor(sepsetVariable, Array.fill(sepsetVariable.dim)(1d))
private var oldMessage: SingleFactor = newMessage

def setIncomingEdge(edge: Edge) {
def setIncomingEdge(edge: Edge):Unit = {
incomingEdge = Some(edge)
}
def getIncomingEdge(): Option[Edge] = incomingEdge

def resetMessage() {
def resetMessage():Unit = {
newMessage = SingleFactor(sepsetVariable, Array.fill(sepsetVariable.dim)(1d))
oldMessage = newMessage
}

def updateMessage(message: SingleFactor) {
def updateMessage(message: SingleFactor):Unit = {
oldMessage = newMessage
newMessage = message
}
@@ -27,7 +27,7 @@ case class GenericClusterGraph() extends ClusterGraph {

def getCluster(clusterId: Int): Cluster = clusters.find(c => c.id == clusterId).get

def addEdge(clusterId1: Int, clusterId2: Int) {
def addEdge(clusterId1: Int, clusterId2: Int):Unit = {
val cluster1 = clusters.find(c => c.id == clusterId1).get
val cluster2 = clusters.find(c => c.id == clusterId2).get
val sepsetVariable = calcSepsetVariable(cluster1, cluster2)
@@ -42,7 +42,7 @@ case class GenericClusterGraph() extends ClusterGraph {
cluster2.addEdge(edge21)
}

def addEdges(firstEdge: Tuple2[Int, Int], nextEdges: Tuple2[Int, Int]*) {
def addEdges(firstEdge: Tuple2[Int, Int], nextEdges: Tuple2[Int, Int]*):Unit = {
addEdge(firstEdge._1, firstEdge._2)
nextEdges.foreach(e => addEdge(e._1, e._2))
}
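The addEdges varargs overload above just delegates to addEdge pair by pair, so a chain of clusters can be wired in one call. A hedged usage sketch (the cluster ids, and the assumption that they were already registered via addCluster, are illustrative):

```scala
// Assumes clusters 1, 2, 3 and 4 were already added with addCluster(...).
val graph: ClusterGraph = GenericClusterGraph()
graph.addEdges((1, 2), (2, 3), (3, 4)) // chain 1 - 2 - 3 - 4
```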
@@ -141,7 +141,7 @@ class MultiFactor(variables: Array[Var], values: Array[Double]) extends Factor {
* @param stepSize Number of steps before reaching next assignment for a given dimension
* @param process (valueIndex, variableIndex) => Unit
*/
private def processValues(dim: Int, stepSize: Int, process: (Int, Int) => Unit) {
private def processValues(dim: Int, stepSize: Int, process: (Int, Int) => Unit):Unit = {

var i = 0
var varIndex = 0
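processValues above walks the flattened value table of a multi-dimensional factor, advancing each dimension once every stepSize entries, i.e. a row-major stride. A stand-alone sketch of that stride arithmetic (MultiFactor's actual internal layout is assumed, not copied):

```scala
// Row-major strides: the last variable changes fastest.
def strides(dims: Array[Int]): Array[Int] = {
  val s = new Array[Int](dims.length)
  var acc = 1
  var i = dims.length - 1
  while (i >= 0) { s(i) = acc; acc *= dims(i); i -= 1 }
  s
}

// Index into the flattened values array for a full assignment of all variables.
def valueIndex(assignment: Array[Int], dims: Array[Int]): Int =
  assignment.zip(strides(dims)).map { case (a, s) => a * s }.sum
```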
@@ -13,7 +13,7 @@ trait FactorGraph {
/**
* Adds factor to the factor graph.
*/
def addFactor(factor: Factor)
def addFactor(factor: Factor):Unit

/**
* Returns all nodes (factors and variables) in a graph.
4 changes: 2 additions & 2 deletions src/main/scala/dk/bayes/model/factorgraph/Gate.scala
@@ -23,10 +23,10 @@ sealed abstract class Gate(initialMsg: SingleFactor) {
/**Allows for comparing the age between different messages and finding the message that was updated least recently.*/
private var msgIndex: Long = -1

def setEndGate(gate: END_GATE) { endGate = Some(gate) }
def setEndGate(gate: END_GATE):Unit = { endGate = Some(gate) }
def getEndGate(): END_GATE = endGate.get

def setMessage(newMessage: SingleFactor, msgIndex: Long) {
def setMessage(newMessage: SingleFactor, msgIndex: Long):Unit = {
oldMessage = message
message = newMessage

2 changes: 1 addition & 1 deletion src/main/scala/dk/bayes/model/factorgraph/Node.scala
@@ -32,7 +32,7 @@ sealed abstract class FactorNode(factor: Factor) extends Node {
private var _factor: Factor = factor

def getFactor(): Factor = _factor
def setFactor(factor: Factor) { _factor = factor }
def setFactor(factor: Factor):Unit = { _factor = factor }

/**
* Returns the product of the factor and all incoming messages.
2 changes: 1 addition & 1 deletion src/test/scala/dk/bayes/dsl/demo/ClutterProblemTest.scala
@@ -13,7 +13,7 @@ import dk.bayes.dsl.infer
*/
class ClutterProblemTest {

@Test def test {
@Test def test:Unit = {

val x = Gaussian(15, 100)
val y1 = ClutteredGaussian(x, w = 0.4, a = 10, value = 3)
