This repository has been archived by the owner on Nov 15, 2024. It is now read-only.

[SPARK-22120][SQL] TestHiveSparkSession.reset() should clean out Hive warehouse directory

## What changes were proposed in this pull request?
During TestHiveSparkSession.reset(), which is called after each TestHiveSingleton suite, we now delete and recreate the Hive warehouse directory.

## How was this patch tested?
Ran full suite of tests locally, verified that they pass.

Author: Greg Owen <greg@databricks.com>

Closes apache#19341 from GregOwen/SPARK-22120.

(cherry picked from commit ce20478)
Signed-off-by: gatorsmile <gatorsmile@gmail.com>
Greg Owen authored and MatthewRBruce committed Jul 31, 2018
1 parent 1ea4034 commit f371564
Showing 1 changed file with 6 additions and 0 deletions.
@@ -18,6 +18,7 @@
 package org.apache.spark.sql.hive.test

 import java.io.File
+import java.net.URI
 import java.util.{Set => JavaSet}

 import scala.collection.JavaConverters._
@@ -486,6 +487,11 @@ private[hive] class TestHiveSparkSession(
      }
    }

+    // Clean out the Hive warehouse between each suite
+    val warehouseDir = new File(new URI(sparkContext.conf.get("spark.sql.warehouse.dir")).getPath)
+    Utils.deleteRecursively(warehouseDir)
+    warehouseDir.mkdir()
+
    sharedState.cacheManager.clearCache()
    loadedTables.clear()
    sessionState.catalog.reset()
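The added block reads `spark.sql.warehouse.dir` (a URI such as `file:/path/to/warehouse`), resolves it to a local path, deletes the directory tree, and recreates it empty. As a standalone illustration of that pattern, here is a minimal sketch using only the JDK; `WarehouseCleanup`, `deleteRecursively`, and `resetDir` are hypothetical names standing in for Spark's internal `Utils.deleteRecursively` and the inline code in `reset()`:

```scala
import java.io.File
import java.net.URI
import java.nio.file.{Files, Paths}

// Hypothetical sketch, not Spark's actual Utils: delete a directory tree
// and recreate it empty, as reset() now does to the Hive warehouse dir.
object WarehouseCleanup {
  def deleteRecursively(f: File): Unit = {
    // listFiles is non-null only for existing directories
    if (f.isDirectory) f.listFiles().foreach(deleteRecursively)
    f.delete()
  }

  def resetDir(warehouseUri: String): File = {
    // The conf value is a URI string, so take its path component first
    val dir = new File(new URI(warehouseUri).getPath)
    deleteRecursively(dir)
    dir.mkdir()
    dir
  }
}
```

After `resetDir` returns, the directory exists again but contains no files, so each `TestHiveSingleton` suite starts from a clean warehouse.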
