[SPARK-22120][SQL] TestHiveSparkSession.reset() should clean out Hive warehouse directory

## What changes were proposed in this pull request?
During TestHiveSparkSession.reset(), which is called after each TestHiveSingleton suite, we now delete and recreate the Hive warehouse directory.

## How was this patch tested?
Ran full suite of tests locally, verified that they pass.

Author: Greg Owen <greg@databricks.com>

Closes apache#19341 from GregOwen/SPARK-22120.
Greg Owen authored and gatorsmile committed Sep 25, 2017
1 parent 038b185 commit ce20478
Showing 1 changed file with 6 additions and 0 deletions.
@@ -18,6 +18,7 @@
 package org.apache.spark.sql.hive.test
 
 import java.io.File
+import java.net.URI
 import java.util.{Set => JavaSet}
 
 import scala.collection.JavaConverters._
@@ -498,6 +499,11 @@ private[hive] class TestHiveSparkSession(
       }
     }
 
+    // Clean out the Hive warehouse between each suite
+    val warehouseDir = new File(new URI(sparkContext.conf.get("spark.sql.warehouse.dir")).getPath)
+    Utils.deleteRecursively(warehouseDir)
+    warehouseDir.mkdir()
+
     sharedState.cacheManager.clearCache()
     loadedTables.clear()
     sessionState.catalog.reset()
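The pattern the patch adds (resolve the configured warehouse URI to a local filesystem path, delete the directory tree, recreate it empty) can be sketched as a standalone snippet. This is a plain-Java sketch, not Spark code: the class name `WarehouseReset` is a hypothetical stand-in, and `deleteRecursively` is a simplified version of the `Utils.deleteRecursively` helper the diff calls.

```java
import java.io.File;
import java.net.URI;

public class WarehouseReset {

    // Simplified stand-in for Spark's Utils.deleteRecursively:
    // walk the tree bottom-up and delete every file and directory.
    static void deleteRecursively(File f) {
        File[] children = f.listFiles();
        if (children != null) {
            for (File child : children) {
                deleteRecursively(child);
            }
        }
        f.delete();
    }

    // Same shape as the patch: resolve the URI (e.g. the value of
    // spark.sql.warehouse.dir) to a path, wipe it, recreate it empty.
    static File resetWarehouse(String warehouseUri) {
        File warehouseDir = new File(URI.create(warehouseUri).getPath());
        deleteRecursively(warehouseDir);
        warehouseDir.mkdir();
        return warehouseDir;
    }

    public static void main(String[] args) throws Exception {
        // Simulate a warehouse with one leftover table directory.
        File tmp = new File(System.getProperty("java.io.tmpdir"), "spark-warehouse-demo");
        new File(tmp, "t1").mkdirs();
        new File(tmp, "t1/part-00000").createNewFile();

        File reset = resetWarehouse(tmp.toURI().toString());
        System.out.println(reset.exists() && reset.listFiles().length == 0); // prints "true"
    }
}
```

Recreating the directory after deletion matters: later suites expect the warehouse root to exist, so leaving it deleted would trade stale-table failures for missing-directory failures.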
