
Commit

Merge pull request intel-analytics#5 from hhbyyh/refactor
refine inference process
megaSpoon committed May 22, 2017
2 parents 7d199e7 + 4c0fdae commit 6d86cbd
Showing 60 changed files with 233 additions and 157 deletions.
43 binary files not shown.
43 changes: 43 additions & 0 deletions pipeline/dataSample/dev-clean/mapping.txt
@@ -0,0 +1,43 @@
1462-170142-0000.flac THE LAST TWO DAYS OF THE VOYAGE BARTLEY FOUND ALMOST INTOLERABLE
1462-170142-0001.flac EMERGING AT EUSTON AT HALF PAST THREE O'CLOCK IN THE AFTERNOON ALEXANDER HAD HIS LUGGAGE SENT TO THE SAVOY AND DROVE AT ONCE TO BEDFORD SQUARE
1462-170142-0002.flac SHE BLUSHED AND SMILED AND FUMBLED HIS CARD IN HER CONFUSION BEFORE SHE RAN UPSTAIRS
1462-170142-0003.flac THE ROOM WAS EMPTY WHEN HE ENTERED
1462-170142-0004.flac A COAL FIRE WAS CRACKLING IN THE GRATE AND THE LAMPS WERE LIT FOR IT WAS ALREADY BEGINNING TO GROW DARK OUTSIDE
1462-170142-0005.flac SHE CALLED HIS NAME ON THE THRESHOLD BUT IN HER SWIFT FLIGHT ACROSS THE ROOM SHE FELT A CHANGE IN HIM AND CAUGHT HERSELF UP SO DEFTLY THAT HE COULD NOT TELL JUST WHEN SHE DID IT
1462-170142-0006.flac SHE MERELY BRUSHED HIS CHEEK WITH HER LIPS AND PUT A HAND LIGHTLY AND JOYOUSLY ON EITHER SHOULDER
1462-170142-0007.flac I NEVER DREAMED IT WOULD BE YOU BARTLEY
1462-170142-0008.flac WHEN DID YOU COME BARTLEY AND HOW DID IT HAPPEN YOU HAVEN'T SPOKEN A WORD
1462-170142-0009.flac SHE LOOKED AT HIS HEAVY SHOULDERS AND BIG DETERMINED HEAD THRUST FORWARD LIKE A CATAPULT IN LEASH
1462-170142-0010.flac I'LL DO ANYTHING YOU WISH ME TO BARTLEY SHE SAID TREMULOUSLY
1462-170142-0011.flac HE PULLED UP A WINDOW AS IF THE AIR WERE HEAVY
1462-170142-0012.flac HILDA WATCHED HIM FROM HER CORNER TREMBLING AND SCARCELY BREATHING DARK SHADOWS GROWING ABOUT HER EYES
1462-170142-0013.flac IT IT HASN'T ALWAYS MADE YOU MISERABLE HAS IT
1462-170142-0014.flac ALWAYS BUT IT'S WORSE NOW
1462-170142-0015.flac IT'S UNBEARABLE IT TORTURES ME EVERY MINUTE
1462-170142-0016.flac I AM NOT A MAN WHO CAN LIVE TWO LIVES HE WENT ON FEVERISHLY EACH LIFE SPOILS THE OTHER
1462-170142-0017.flac I GET NOTHING BUT MISERY OUT OF EITHER
1462-170142-0018.flac THERE IS THIS DECEPTION BETWEEN ME AND EVERYTHING
1462-170142-0019.flac AT THAT WORD DECEPTION SPOKEN WITH SUCH SELF CONTEMPT THE COLOR FLASHED BACK INTO HILDA'S FACE AS SUDDENLY AS IF SHE HAD BEEN STRUCK BY A WHIPLASH
1462-170142-0020.flac SHE BIT HER LIP AND LOOKED DOWN AT HER HANDS WHICH WERE CLASPED TIGHTLY IN FRONT OF HER
1462-170142-0021.flac COULD YOU COULD YOU SIT DOWN AND TALK ABOUT IT QUIETLY BARTLEY AS IF I WERE A FRIEND AND NOT SOME ONE WHO HAD TO BE DEFIED
1462-170142-0022.flac HE DROPPED BACK HEAVILY INTO HIS CHAIR BY THE FIRE
1462-170142-0023.flac I HAVE THOUGHT ABOUT IT UNTIL I AM WORN OUT
1462-170142-0024.flac AFTER THE VERY FIRST
1462-170142-0025.flac HILDA'S FACE QUIVERED BUT SHE WHISPERED YES I THINK IT MUST HAVE BEEN
1462-170142-0026.flac SHE PRESSED HIS HAND GENTLY IN GRATITUDE WEREN'T YOU HAPPY THEN AT ALL
1462-170142-0027.flac SOMETHING OF THEIR TROUBLING SWEETNESS CAME BACK TO ALEXANDER TOO
1462-170142-0028.flac PRESENTLY IT STOLE BACK TO HIS COAT SLEEVE
1462-170142-0029.flac YES HILDA I KNOW THAT HE SAID SIMPLY
1462-170142-0030.flac I UNDERSTAND BARTLEY I WAS WRONG
1462-170142-0031.flac SHE LISTENED INTENTLY BUT SHE HEARD NOTHING BUT THE CREAKING OF HIS CHAIR
1462-170142-0032.flac YOU WANT ME TO SAY IT SHE WHISPERED
1462-170142-0033.flac BARTLEY LEANED HIS HEAD IN HIS HANDS AND SPOKE THROUGH HIS TEETH
1462-170142-0034.flac IT'S GOT TO BE A CLEAN BREAK HILDA
1462-170142-0035.flac OH BARTLEY WHAT AM I TO DO
1462-170142-0036.flac YOU ASK ME TO STAY AWAY FROM YOU BECAUSE YOU WANT ME
1462-170142-0037.flac I WILL ASK THE LEAST IMAGINABLE BUT I MUST HAVE SOMETHING
1462-170142-0038.flac HILDA SAT ON THE ARM OF IT AND PUT HER HANDS LIGHTLY ON HIS SHOULDERS
1462-170142-0039.flac YOU SEE LOVING SOME ONE AS I LOVE YOU MAKES THE WHOLE WORLD DIFFERENT
1462-170142-0040.flac AND THEN YOU CAME BACK NOT CARING VERY MUCH BUT IT MADE NO DIFFERENCE
1462-170142-0041.flac SHE SLID TO THE FLOOR BESIDE HIM AS IF SHE WERE TOO TIRED TO SIT UP ANY LONGER
1462-170142-0042.flac DON'T CRY DON'T CRY HE WHISPERED
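The mapping.txt added above pairs a LibriSpeech-style `.flac` filename with its uppercase transcript, separated by the first space. A minimal sketch of parsing that format (assuming the single-space separator seen in the sample; Python here is purely illustrative, the pipeline itself is Scala):

```python
# Parse LibriSpeech-style mapping lines: "<utterance>.flac TRANSCRIPT ..."
# The first whitespace-separated token names the audio file; the remainder
# of the line is the transcript label.
def parse_mapping(lines):
    mapping = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue
        audio_file, _, transcript = line.partition(" ")
        mapping[audio_file] = transcript
    return mapping

sample = [
    "1462-170142-0003.flac THE ROOM WAS EMPTY WHEN HE ENTERED",
    "1462-170142-0035.flac OH BARTLEY WHAT AM I TO DO",
]
parsed = parse_mapping(sample)
print(parsed["1462-170142-0003.flac"])
```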
5 changes: 5 additions & 0 deletions pipeline/pom.xml
@@ -54,6 +54,11 @@
<enabled>true</enabled>
</snapshots>
</repository>
<repository>
<id>nexus</id>
<name>nexus Repository</name>
<url>http://maven.aliyun.com/nexus/content/groups/public/</url>
</repository>
</repositories>

<properties>
@@ -15,21 +15,19 @@
*/
package com.intel.analytics.bigdl.nn

import com.intel.analytics.bigdl.nn._
import com.intel.analytics.bigdl.nn.abstractnn.TensorModule
import com.intel.analytics.bigdl.tensor.{DoubleType, FloatType, Tensor}
import com.intel.analytics.bigdl.tensor.TensorNumericMath.TensorNumeric

import scala.reflect.ClassTag

import com.intel.analytics.bigdl.tensor.Tensor
import com.intel.analytics.bigdl.tensor.TensorNumericMath.TensorNumeric

@SerialVersionUID( - 467695939363389565L)
class BatchNormalizationDS[@specialized(Float, Double) T: ClassTag](
nOutput: Int,
eps: Double = 1e-3,
momentum: Double = 0.1,
affine: Boolean = false)
(implicit ev: TensorNumeric[T])
extends BatchNormalization[T](nOutput, eps, momentum, affine) {
nOutput: Int,
eps: Double = 1e-3,
momentum: Double = 0.1,
affine: Boolean = false)
(implicit ev: TensorNumeric[T]
) extends BatchNormalization[T](nOutput, eps, momentum, affine) {

val batchDim = 0
val timeDim = 1
@@ -97,8 +95,8 @@ class BatchNormalizationDS[@specialized(Float, Double) T: ClassTag](

object BatchNormalizationDS {
def apply[@specialized(Float, Double) T: ClassTag](
nOutput: Int,
eps: Double = 1e-5) (implicit ev: TensorNumeric[T]): BatchNormalizationDS[T] = {
nOutput: Int,
eps: Double = 1e-5) (implicit ev: TensorNumeric[T]): BatchNormalizationDS[T] = {
new BatchNormalizationDS[T](nOutput, eps)
}
}
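BatchNormalizationDS wraps BigDL's BatchNormalization (note the `batchDim = 0` / `timeDim = 1` fields, suggesting sequence-aware normalization). The underlying per-feature math can be sketched as follows; this is an illustrative reimplementation of the normalization formula only (inference path, affine disabled, mirroring the class's `eps` default of 1e-3), not the BigDL layer, which also tracks running statistics:

```python
import math

# Normalize a batch of values for one feature: y = (x - mean) / sqrt(var + eps).
def batch_norm(values, eps=1e-3):
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    return [(v - mean) / math.sqrt(var + eps) for v in values]

out = batch_norm([1.0, 2.0, 3.0, 4.0])
print(out)
```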
@@ -15,7 +15,6 @@
*/
package com.intel.analytics.bigdl.nn

import com.intel.analytics.bigdl.nn._
import com.intel.analytics.bigdl.nn.abstractnn.{AbstractModule, Activity}
import com.intel.analytics.bigdl.tensor.Tensor
import com.intel.analytics.bigdl.tensor.TensorNumericMath.TensorNumeric
@@ -25,8 +24,9 @@ import scala.reflect.ClassTag

@SerialVersionUID( - 643193217505024792L)
class BiRecurrentDS[T : ClassTag](
merge: AbstractModule[Table, Tensor[T], T] = null, isCloneInput: Boolean = true)
(implicit ev: TensorNumeric[T]) extends Container[Tensor[T], Tensor[T], T] {
merge: AbstractModule[Table, Tensor[T], T] = null,
isCloneInput: Boolean = true) (implicit ev: TensorNumeric[T])
extends Container[Tensor[T], Tensor[T], T] {

val timeDim = 2
val featDim = 3
@@ -104,7 +104,6 @@ class BiRecurrentDS[T : ClassTag](

override def canEqual(other: Any): Boolean = other.isInstanceOf[BiRecurrentDS[T]]


/**
* Clear cached activities to save storage space or network bandwidth. Note that we use
* Tensor.set to keep some information like tensor share
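BiRecurrentDS composes a forward and a backward recurrence over the time dimension and merges their per-step outputs. A toy sketch of that control flow, with a running-sum stand-in for the RNN cell and addition as the merge (the real class takes an `AbstractModule` merge and BigDL recurrent layers):

```python
# Conceptual bidirectional recurrence: run a cell over the sequence forward
# and over the reversed sequence, then merge the two outputs per time step.
def bi_recurrent(seq, cell, merge):
    def run(s):
        h, outs = 0.0, []
        for x in s:
            h = cell(x, h)
            outs.append(h)
        return outs
    fwd = run(seq)
    bwd = list(reversed(run(list(reversed(seq)))))
    return [merge(f, b) for f, b in zip(fwd, bwd)]

# Toy cell (running sum) and merge (addition) are stand-ins, not BigDL modules.
result = bi_recurrent([1.0, 2.0, 3.0], lambda x, h: x + h, lambda a, b: a + b)
print(result)
```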
@@ -15,14 +15,13 @@
*/
package com.intel.analytics.bigdl.nn

import com.intel.analytics.bigdl.nn.SplitTable
import scala.reflect.ClassTag

import com.intel.analytics.bigdl.nn.abstractnn.AbstractModule
import com.intel.analytics.bigdl.tensor.Tensor
import com.intel.analytics.bigdl.tensor.TensorNumericMath.TensorNumeric
import com.intel.analytics.bigdl.utils.Table

import scala.reflect.ClassTag

/**
* Creates a module that takes a Tensor as input and
* outputs two tables, splitting the Tensor along
@@ -65,10 +64,8 @@ class BifurcateSplitTable[T: ClassTag](
gradInput
}


override def canEqual(other: Any): Boolean = other.isInstanceOf[SplitTable[T]]


override def toString: String = s"BifurcateSplitTable($dimension)"

override def equals(other: Any): Boolean = other match {
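BifurcateSplitTable takes a Tensor and outputs a table of two tensors split along the given dimension. A sketch of that behaviour on nested lists, assuming an even two-way split and BigDL's 1-based dimension convention (the exact split semantics are an assumption; the real module operates on BigDL Tensors):

```python
# Split a nested-list "tensor" into two halves along a 1-based dimension.
def bifurcate_split(tensor, dimension):
    if dimension == 1:
        mid = len(tensor) // 2
        return tensor[:mid], tensor[mid:]
    # Recurse into sub-lists to reach the requested dimension.
    left_right = [bifurcate_split(row, dimension - 1) for row in tensor]
    return [l for l, _ in left_right], [r for _, r in left_right]

left, right = bifurcate_split([[1, 2, 3, 4], [5, 6, 7, 8]], 2)
print(left, right)
```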
@@ -15,12 +15,11 @@
*/
package com.intel.analytics.bigdl.nn

import scala.reflect.ClassTag

import com.intel.analytics.bigdl.nn.abstractnn.TensorModule
import com.intel.analytics.bigdl.tensor.TensorNumericMath.TensorNumeric
import com.intel.analytics.bigdl.tensor._
import com.intel.analytics.bigdl.utils.Engine

import scala.reflect.ClassTag

/**
* Applies HardTanh to each element of input, HardTanh is defined:
@@ -34,11 +33,10 @@ import scala.reflect.ClassTag
*/

class HardTanhDS[T: ClassTag](
val minValue: Double = -1,
val maxValue: Double = 1,
val inplace: Boolean = false
)(implicit ev: TensorNumeric[T])
extends TensorModule[T] {
val minValue: Double = -1,
val maxValue: Double = 1,
val inplace: Boolean = false
) (implicit ev: TensorNumeric[T]) extends TensorModule[T] {
require(maxValue > minValue, "maxValue must be larger than minValue")

val min = ev.fromType[Double](minValue)
@@ -47,8 +45,7 @@ class HardTanhDS[T: ClassTag](
override def updateOutput(input: Tensor[T]): Tensor[T] = {
if (inplace) {
output.set(input)
}
else {
} else {
output.resizeAs(input)
}

@@ -111,8 +108,6 @@ class HardTanhDS[T: ClassTag](
output
}



override def updateGradInput(input: Tensor[T], gradOutput: Tensor[T]): Tensor[T] = {
require(input.nElement() == gradOutput.nElement(),
"the number of input element should equal the number of gradOutput element")
@@ -186,9 +181,9 @@

object HardTanhDS {
def apply[@specialized(Float, Double) T: ClassTag](
minValue: Double = -1,
maxValue: Double = 1,
inplace: Boolean = false)(implicit ev: TensorNumeric[T]) : HardTanhDS[T] = {
minValue: Double = -1,
maxValue: Double = 1,
inplace: Boolean = false)(implicit ev: TensorNumeric[T]) : HardTanhDS[T] = {
new HardTanhDS[T](minValue, maxValue, inplace)
}
}
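The HardTanh function that HardTanhDS applies element-wise simply clamps each input into `[minValue, maxValue]` (defaults -1 and 1) and is the identity in between. An illustrative scalar version:

```python
# HardTanh: y = minValue if x < minValue, maxValue if x > maxValue, else x.
def hard_tanh(x, min_value=-1.0, max_value=1.0):
    return max(min_value, min(max_value, x))

print([hard_tanh(v) for v in [-2.0, -0.5, 0.0, 0.5, 2.0]])
```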
@@ -27,11 +27,10 @@ import scala.reflect.ClassTag

@SerialVersionUID( 5237686508074490666L)
class RnnCellDS[T : ClassTag](
inputSize: Int = 4,
hiddenSize: Int = 3,
activation: TensorModule[T],
private var initMethod: InitializationMethod = Default)
(implicit ev: TensorNumeric[T])
inputSize: Int = 4,
hiddenSize: Int = 3,
activation: TensorModule[T],
private var initMethod: InitializationMethod = Default) (implicit ev: TensorNumeric[T])
extends Cell[T](Array(hiddenSize)) {

val parallelTable = ParallelTable[T]()
@@ -145,10 +144,10 @@ class RnnCellDS[T : ClassTag](

object RnnCellDS {
def apply[@specialized(Float, Double) T: ClassTag](
inputSize: Int = 4,
hiddenSize: Int = 3,
activation: TensorModule[T])
(implicit ev: TensorNumeric[T]) : RnnCellDS[T] = {
inputSize: Int = 4,
hiddenSize: Int = 3,
activation: TensorModule[T])
(implicit ev: TensorNumeric[T]) : RnnCellDS[T] = {
new RnnCellDS[T](inputSize, hiddenSize, activation)
}
}
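RnnCellDS wires up the classic simple-RNN recurrence, h' = activation(W_ih·x + W_hh·h + b), from BigDL Linear layers, a ParallelTable, and a pluggable activation. A scalar sketch of one step of that recurrence (the weights here are arbitrary stand-ins, not anything from the commit):

```python
import math

# One step of a simple RNN cell: new hidden state from input x and previous h.
# w_ih, w_hh, and bias are illustrative scalar stand-ins for Linear layers.
def rnn_cell_step(x, h, w_ih=0.5, w_hh=0.3, bias=0.1, activation=math.tanh):
    return activation(w_ih * x + w_hh * h + bias)

h = 0.0
for x in [1.0, 0.5, -1.0]:
    h = rnn_cell_step(x, h)
print(h)
```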
@@ -15,24 +15,21 @@
*/
package com.intel.analytics.bigdl.nn

import scala.reflect.ClassTag

import com.intel.analytics.bigdl.nn._
import com.intel.analytics.bigdl.nn.abstractnn.TensorModule
import com.intel.analytics.bigdl.tensor.Tensor
import com.intel.analytics.bigdl.tensor.TensorNumericMath.TensorNumeric

import scala.reflect.ClassTag

/**
* Reverse the input w.r.t given dimension.
* The input can be a Tensor or Table.
* @param dim
* @param ev
* @tparam T Numeric type. Only support float/double now
*/
class ReverseDS[T: ClassTag](dim: Int = 1)
(implicit ev: TensorNumeric[T])
extends TensorModule[T] {
class ReverseDS[T: ClassTag](dim: Int = 1) (implicit ev: TensorNumeric[T])
extends TensorModule[T] {

val buffer = Tensor[T]()

@@ -90,7 +87,7 @@ class ReverseDS[T: ClassTag](dim: Int = 1)

object ReverseDS {
def apply[@specialized(Float, Double) T: ClassTag](
dimension: Int = 1)(implicit ev: TensorNumeric[T]) : ReverseDS[T] = {
dimension: Int = 1)(implicit ev: TensorNumeric[T]) : ReverseDS[T] = {
new ReverseDS[T](dimension)
}
}
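ReverseDS reverses its input along the given dimension (1-based, per BigDL convention). A sketch of that operation on nested lists:

```python
# Reverse a nested-list "tensor" along a 1-based dimension; dim=1 reverses
# the outermost axis, higher dims recurse into sub-lists.
def reverse_dim(tensor, dim=1):
    if dim == 1:
        return list(reversed(tensor))
    return [reverse_dim(row, dim - 1) for row in tensor]

print(reverse_dim([[1, 2], [3, 4], [5, 6]], 1))
```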