SHS-NG M2: Store FsHistoryProvider listing data in LevelDB.
The application listing is still generated from event logs, but is now stored
in LevelDB. No data (except for the internal LevelDB pages) is kept in memory.
The actual app UIs are, as of now, still untouched.

The provider stores things internally using the public REST API types; I believe
this is better going forward, since it will make it easier to get rid of the
internal history server API, which is mostly redundant at this point.
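The storage model described above can be sketched as follows. This is an illustration only, not Spark's actual kvstore API: the class and key names here are hypothetical, and a `TreeMap` stands in for LevelDB (both keep keys sorted). Serialized records modeled on the public REST API types are written under prefixed keys, so the application listing becomes a range scan over a key prefix rather than an in-memory collection:

```java
import java.util.Map;
import java.util.TreeMap;

// Hypothetical sketch: TreeMap stands in for LevelDB, and the stored value
// mimics a serialized public REST API type such as ApplicationInfo.
public class ListingStoreSketch {
    // Key layout: a type prefix plus the app id, so one range scan over the
    // prefix yields the full application listing in sorted order.
    private static final String APP_PREFIX = "app:";
    private final TreeMap<String, String> db = new TreeMap<>();

    public void writeApp(String appId, String serializedInfo) {
        db.put(APP_PREFIX + appId, serializedInfo);
    }

    // The listing is generated by scanning keys in the "app:" range instead
    // of holding all entries in memory. "app;" is the next string after the
    // "app:" prefix range, so the view covers exactly the prefixed keys.
    public Map<String, String> listing() {
        return db.subMap("app:", "app;");
    }

    public int appCount() {
        return listing().size();
    }
}
```

Because the keys are sorted, a count or listing never needs the whole data set resident in memory, matching the commit's goal of keeping only LevelDB's own pages cached.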

I also added a finalizer to LevelDBIterator, to make sure that resources are
eventually released. This helps when code iterates but does not exhaust the
iterator, thus not triggering the auto-close code.
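The leak scenario can be sketched like this (hypothetical names, not the actual `LevelDBIterator` code): an iterator that releases its resources automatically only on exhaustion needs a finalizer as a safety net for callers that stop iterating early:

```java
import java.util.Iterator;
import java.util.List;

// Hypothetical sketch of the pattern in this commit: auto-close fires only
// when the iterator is exhausted; finalize() covers abandoned iterators.
public class SelfClosingIterator<T> implements Iterator<T>, AutoCloseable {
    private final Iterator<T> delegate;
    private boolean closed = false;

    public SelfClosingIterator(List<T> data) {
        this.delegate = data.iterator();
    }

    @Override
    public boolean hasNext() {
        boolean more = delegate.hasNext();
        if (!more) {
            close(); // auto-close: only triggered if the caller exhausts the iterator
        }
        return more;
    }

    @Override
    public T next() {
        return delegate.next();
    }

    @Override
    public void close() {
        // In the real code this would release native (JNI) resources.
        closed = true;
    }

    public boolean isClosed() {
        return closed;
    }

    // Safety net: if the caller stopped early and never called close(),
    // the resources are still released when the object is collected.
    @Override
    protected void finalize() throws Throwable {
        if (!closed) {
            close();
        }
        super.finalize();
    }
}
```

A caller that breaks out of a loop mid-iteration never hits the `hasNext()` auto-close path, which is exactly the case the finalizer covers.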

HistoryServerSuite was modified to not re-start the history server unnecessarily;
this makes the json validation tests run more quickly.
Marcelo Vanzin committed May 5, 2017
1 parent b4559a7 commit 1df5e55
Showing 13 changed files with 465 additions and 323 deletions.
@@ -164,6 +164,18 @@ public synchronized void close() throws IOException {
}
}

/**
* Because it's tricky to expose closeable iterators through many internal APIs, especially
* when Scala wrappers are used, this makes sure that, hopefully, the JNI resources held by
* the iterator will eventually be released.
*/
@Override
protected void finalize() throws Throwable {
if (db.db() != null) {
close();
}
}

private T loadNext() {
try {
while (true) {
5 changes: 5 additions & 0 deletions core/pom.xml
@@ -67,6 +67,11 @@
<artifactId>spark-launcher_${scala.binary.version}</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-kvstore_${scala.binary.version}</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-network-common_${scala.binary.version}</artifactId>
@@ -188,7 +188,7 @@ $(document).ready(function() {
}

$(selector).DataTable(conf);
-$('#hisotry-summary [data-toggle="tooltip"]').tooltip();
+$('#history-summary [data-toggle="tooltip"]').tooltip();
});
});
});
@@ -68,11 +68,19 @@ private[history] abstract class HistoryUpdateProbe {
* @param ui Spark UI
* @param updateProbe probe to call to check on the update state of this application attempt
*/
-private[history] case class LoadedAppUI(
+private[spark] case class LoadedAppUI(
ui: SparkUI,
updateProbe: () => Boolean)

-private[history] abstract class ApplicationHistoryProvider {
+private[spark] abstract class ApplicationHistoryProvider {

/**
* The number of applications available for listing. Separate method in case it's cheaper
* to get a count than to calculate the whole listing.
*
* @return The number of available applications.
*/
def getAppCount(): Int = getListing().size

/**
* Returns the count of application event logs that the provider is currently still processing.