Some minor test corrections #57

Merged (2 commits) on Aug 15, 2018

NumberTest.scala

@@ -37,16 +37,21 @@ import org.scalatest.{Matchers, PropSpec}

 @RunWith(classOf[JUnitRunner])
 class NumberTest extends PropSpec with PropertyChecks with Matchers {
-  val tests = Table(
-    "TestNumbers",
-    0.0,
-    Double.MaxValue,
-    Double.NegativeInfinity,
-    Double.NaN
+  val specials = Table("specials",
+    Double.MinValue, Double.MaxValue,
+    Double.MinPositiveValue, Double.NaN,
+    Double.PositiveInfinity, Double.NegativeInfinity
   )

   property("validate numbers") {
-    forAll(tests) { d =>
+    forAll { d: Double =>
       Number.isValid(d) should not be (d.isInfinity || d.isNaN)
     }
   }
+
+  property("validate special numbers") {
+    forAll(specials) { d =>
+      Number.isValid(d) should not be (d.isInfinity || d.isNaN)
+    }
+  }

Collaborator (author), on the forAll { d: Double => line:
@ajayborra this will automatically use a generator for any arbitrary d: Double.

Collaborator (author), on the forAll(specials) { d => line:
@ajayborra here we explicitly pass in the specials table.
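
The distinction the two comments draw, generator-driven versus table-driven checks, can be sketched as follows. This is a minimal, self-contained example assuming ScalaTest 3.0.x-style property checks (matching the diff's org.scalatest.prop.PropertyChecks) with ScalaCheck on the classpath; MyNumber.isValid is a hypothetical stand-in, since the PR's Number utility is not shown here.

import org.scalatest.prop.PropertyChecks
import org.scalatest.{Matchers, PropSpec}

// Hypothetical stand-in for the project's Number.isValid, for illustration only.
object MyNumber {
  def isValid(d: Double): Boolean = !d.isInfinity && !d.isNaN
}

class GeneratorVsTableSpec extends PropSpec with PropertyChecks with Matchers {

  // Generator-driven: ScalaCheck's Arbitrary[Double] supplies the inputs,
  // so no explicit list of values is needed.
  property("generator-driven check") {
    forAll { d: Double =>
      MyNumber.isValid(d) should not be (d.isInfinity || d.isNaN)
    }
  }

  // Table-driven: the interesting edge cases are enumerated explicitly,
  // which guarantees they are exercised on every run.
  val specials = Table("specials",
    Double.MinValue, Double.MaxValue,
    Double.MinPositiveValue, Double.NaN,
    Double.PositiveInfinity, Double.NegativeInfinity
  )

  property("table-driven check") {
    forAll(specials) { d =>
      MyNumber.isValid(d) should not be (d.isInfinity || d.isNaN)
    }
  }
}

The generator covers a broad random sample of doubles, while the table pins down the specific special values the test cares about, which is exactly the split the PR ends up with.
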
OpSparkListenerTest.scala

@@ -31,15 +31,17 @@
 package com.salesforce.op.utils.spark

 import com.salesforce.op.test.TestSparkContext
+import com.salesforce.op.utils.date.DateTimeUtils
 import org.junit.runner.RunWith
 import org.scalatest.FlatSpec
 import org.scalatest.junit.JUnitRunner

 @RunWith(classOf[JUnitRunner])
 class OpSparkListenerTest extends FlatSpec with TestSparkContext {
+  val start = DateTimeUtils.now().getMillis
   val listener = new OpSparkListener(sc.appName, sc.applicationId, "testRun", Some("tag"), Some("tagValue"), true, true)
   sc.addSparkListener(listener)
-  spark.read.csv(s"$testDataDir/PassengerDataAll.csv")
+  val _ = spark.read.csv(s"$testDataDir/PassengerDataAll.csv").count()

   Spec[OpSparkListener] should "capture app metrics" in {
     val appMetrics: AppMetrics = listener.metrics

Collaborator (author), on the val _ = spark.read.csv(...).count() line:
actually running some job here, i.e. counting rows

Contributor:
Good point, saw the changes below.

@@ -48,14 +50,19 @@ class OpSparkListenerTest extends FlatSpec with TestSparkContext {
     appMetrics.runType shouldBe "testRun"
     appMetrics.customTagName shouldBe Some("tag")
     appMetrics.customTagValue shouldBe Some("tagValue")
+    appMetrics.appStartTime should be >= start
+    appMetrics.appEndTime should be >= appMetrics.appStartTime
+    appMetrics.appDuration shouldBe (appMetrics.appEndTime - appMetrics.appStartTime)
+    appMetrics.appDurationPretty.isEmpty shouldBe false
   }

   it should "capture app stage metrics" in {
     val stageMetrics = listener.metrics.stageMetrics
-    stageMetrics.size shouldBe 1
-    stageMetrics.head.name startsWith "csv at OpSparkListenerTest.scala"
-    stageMetrics.head.stageId shouldBe 0
-    stageMetrics.head.numTasks shouldBe 1
-    stageMetrics.head.status shouldBe "succeeded"
+    stageMetrics.size should be > 0
+    val firstStage = stageMetrics.head
+    firstStage.name should startWith("csv at OpSparkListenerTest.scala")
+    firstStage.stageId shouldBe 0
+    firstStage.numTasks shouldBe 1
+    firstStage.status shouldBe "succeeded"
   }
 }
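
The exchange above is about making the listener observe an actual job: adding .count() forces Spark to run a job over the data, so the listener reliably has stage metrics to capture. Below is a minimal, self-contained sketch of that behaviour with a bare Spark listener; it is not the PR's OpSparkListener, and the local-mode session, the Thread.sleep, and the AtomicInteger counter are illustration-only choices.

import java.util.concurrent.atomic.AtomicInteger

import org.apache.spark.scheduler.{SparkListener, SparkListenerStageCompleted}
import org.apache.spark.sql.SparkSession

object ListenerSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[2]").appName("listener-sketch").getOrCreate()
    val sc = spark.sparkContext

    // Count the completed stages seen by a bare listener.
    val completedStages = new AtomicInteger(0)
    sc.addSparkListener(new SparkListener {
      override def onStageCompleted(event: SparkListenerStageCompleted): Unit = {
        completedStages.incrementAndGet()
      }
    })

    val df = spark.range(100).toDF("n") // building the DataFrame is lazy: no job runs here
    val rows = df.count()               // an action runs a job, which emits stage events

    Thread.sleep(2000) // crude wait: listener events are delivered asynchronously
    println(s"rows = $rows, completed stages seen = ${completedStages.get()}")
    spark.stop()
  }
}

The same reasoning explains the relaxed assertion in the test: the number of stages depends on how many jobs the setup triggers, so checking stageMetrics.size should be > 0 and inspecting the first stage is more robust than pinning the size to exactly 1.
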
RichTupleTest.scala

@@ -43,8 +43,9 @@ class RichTupleTest extends FlatSpec with TestCommon {
     res.get shouldBe 3
   }

-  it should "not map empty tuples" in {
-    assertDoesNotCompile("(None, None).map((x, y) => x + y)")
+  it should "map on empty tuples" in {
+    val none: (Option[String], Option[String]) = None -> None
+    none.map((x, y) => x + y) shouldBe None
   }

   it should "map the function with no effect for left param alone" in {

Collaborator (author), on the it should "map on empty tuples" line:
(None, None).map actually works if the types are set, so I corrected the test.
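
The corrected test leans on the point in the comment: once the element types of the tuple are fixed, mapping a binary function over a pair of Options compiles, and it yields None when there is nothing to combine. The sketch below illustrates that semantics with a hypothetical map2 helper in plain Scala; it does not reproduce the project's RichTuple implicit syntax, whose import path and exact signature are not shown in the diff.

object TupleMapSketch {

  // Apply a binary function across a pair of Options: defined only when both sides are.
  def map2[A, B, C](pair: (Option[A], Option[B]))(f: (A, B) => C): Option[C] =
    for { a <- pair._1; b <- pair._2 } yield f(a, b)

  def main(args: Array[String]): Unit = {
    // Types are fixed, so this compiles; with both sides empty the result is None.
    val none: (Option[String], Option[String]) = None -> None
    println(map2(none)(_ + _)) // None

    // With both sides defined, the function is applied.
    println(map2((Option("a"), Option("b")))(_ + _)) // Some(ab)
  }
}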