Import, export and cluster search functions for drain models #5
Conversation
Cool, let me check that; let's hope this week. Thank you very much for this contribution.
Hi, thank you so much for implementing this!
I have a few remarks that I would like to be addressed. Aside from the code comments, a few other things come to mind:
- The serialized JSON file represents the internal state of the Drain structure, so I would like to change the semantics from import / export to load state / save state.
- Since this is an internal state, I'm wondering if the file should contain some compatibility information, like a version.
- I want the serialization code separated from the main algorithm.
- Currently there is no command line flag for the state.
I can take over if you are fine with it!
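The versioning remark above could be sketched roughly as follows (a minimal illustration, not the project's code; the class and field names are hypothetical): the saved state carries a format version that the load path checks before deserializing anything else.

```java
// Hypothetical sketch: a version number embedded in the saved state lets
// the load-state code reject files written by an incompatible serializer.
class DrainStateVersion {
    static final int CURRENT_VERSION = 1;

    // Simple compatibility policy for illustration: accept only the
    // exact current version; a real policy might allow a range.
    static boolean isCompatible(int fileVersion) {
        return fileVersion == CURRENT_VERSION;
    }
}
```

The check would run right after reading the file and before mapping the rest of the JSON onto the Drain state.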
implementation("com.fasterxml.jackson.core:jackson-core:2.4.1")
implementation("com.fasterxml.jackson.core:jackson-annotations:2.4.1")
implementation("com.fasterxml.jackson.core:jackson-databind:2.4.1")
annotationProcessor("info.picocli:picocli-codegen:4.6.1")
Could you use the latest Jackson version, 2.12.2? The 2.4.1 releases are pretty old (more than 5 years, probably) and not the latest fix release.
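In the Gradle Kotlin DSL used above, the suggested bump would look like this (2.12.2 is the version named in the review; the actual latest version should be checked at upgrade time):

```kotlin
implementation("com.fasterxml.jackson.core:jackson-core:2.12.2")
implementation("com.fasterxml.jackson.core:jackson-annotations:2.12.2")
implementation("com.fasterxml.jackson.core:jackson-databind:2.12.2")
```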
@@ -31,6 +31,8 @@
class MappedFileLineReaderTest {
    private final Path resourceDirectory = Paths.get("src", "test", "resources");

    private final String testFile = "3-lines.txt";
I'd rather not change this test for now, even if the value is the same.
@@ -45,7 +45,7 @@ void smokeTest() throws IOException {
        .sorted(Comparator.comparing(LogCluster::sightings).reversed())
        .forEach(System.out::println);

    // assertThat(drain.clusters()).hasSize(51);
    assertThat(drain.clusters()).hasSize(51);
👍
/**
 * Drain-object exporting functionality which saves a drain model
 * in a json file at given path.
 */
public void drainExport(String filePath) {
    // Instance an Object Mapper and allow the access to the Drain object attributes
    ObjectMapper objectMapper = new ObjectMapper();
    objectMapper.setVisibility(PropertyAccessor.FIELD, JsonAutoDetect.Visibility.ANY);

    // Write the Drain object attributes into path file
    try {
        System.out.printf("---- Saving drain model in file %s %n", filePath);
        objectMapper.writeValue(new FileOutputStream(filePath), this);
    } catch (Exception e) {
        e.printStackTrace();
    }
}

/**
 * Drain-object importing functionality which creates a Drain instance from the input
 * json filepath containing Drain object attributes.
 */
public static Drain drainImport(String filePath) throws IOException {
    // Create an Object mapper
    ObjectMapper objectMapper = new ObjectMapper();

    // Setup the input .json file
    File inputDrain = new File(filePath);

    System.out.printf("---- Recovering drain model from file %s %n", filePath);
    // Map the attributes from the file and return the Drain instance
    return objectMapper.readValue(inputDrain, Drain.class);
}
I would prefer if this state serialization code remained external to Drain, to keep this class focused on the algorithm itself.
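The separation suggested above could take a shape like the following sketch (names such as DrainStateIO, saveState and loadState are illustrative, not from the PR, and plain string I/O stands in for the Jackson ObjectMapper the PR actually uses): all file handling lives in a dedicated class, so Drain itself stays focused on the parsing algorithm.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Stand-in for Drain's serializable internal state (hypothetical).
class DrainState {
    final int formatVersion;
    final String clustersJson; // e.g. the JSON Jackson would produce

    DrainState(int formatVersion, String clustersJson) {
        this.formatVersion = formatVersion;
        this.clustersJson = clustersJson;
    }
}

// Dedicated I/O helper: Drain never touches files or streams itself.
class DrainStateIO {
    // Save the state to a file; the first line carries the format version.
    static void saveState(DrainState state, Path file) throws IOException {
        Files.writeString(file, state.formatVersion + "\n" + state.clustersJson);
    }

    // Load the state back, splitting the version line off the payload.
    static DrainState loadState(Path file) throws IOException {
        String[] parts = Files.readString(file).split("\n", 2);
        return new DrainState(Integer.parseInt(parts[0]), parts[1]);
    }
}
```

With this layout, swapping the serialization format or adding a compatibility check only touches the I/O class, not the algorithm.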
@@ -104,6 +206,9 @@ public void parseLogMessage(@Nonnull String message) {
        // add the log to an existing cluster
        matchCluster.newSighting(contentTokens);
    }
    //System.out.println("Found node: "+matchCluster);
Cleanup ;)
 * Returns the matching log cluster given a log message (null if no match)
 *
 */
public LogCluster findLogMessage(@Nonnull String message) {
findLogMessage -> findLogCluster
Hi @TodorKrIv Thanks again for bootstrapping the code, I took over in #13. Your test cases were useful! It's a nice contribution!
I included import and export functionality to allow loading and saving Drain models using the Jackson library. To do so, I designed the required constructors (Drain, LogCluster and Node) and a log cluster linker so that the import function works properly. Moreover, I added a findLogMessage function to find the cluster for an input log message, similar to parseLogMessage but without modifying the model.
Also, I added a new test file (ImportExportDrainTest.java) to verify that both import and export functions work correctly. Some tests for cluster finding and drain model updating (given an input log message) are included as well.
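The find-versus-parse distinction described above can be illustrated with a toy model (ToyDrain and its string-matched clusters are invented for illustration; the real Drain matches token sequences against a tree, not exact strings): parse may mutate the model by creating a new cluster, while find is strictly read-only and returns null on a miss.

```java
import java.util.ArrayList;
import java.util.List;

// Toy illustration of the two lookup styles in the PR description.
class ToyDrain {
    final List<String> clusters = new ArrayList<>();

    // Mutating lookup (like parseLogMessage): creates a cluster when none matches.
    String parse(String template) {
        int i = clusters.indexOf(template);
        if (i < 0) {
            clusters.add(template);
            return template;
        }
        return clusters.get(i);
    }

    // Read-only lookup (like findLogMessage): never modifies the model,
    // returns null when no cluster matches.
    String find(String template) {
        int i = clusters.indexOf(template);
        return i < 0 ? null : clusters.get(i);
    }
}
```

This is why a separate find method is useful: it lets callers query a saved model without the query itself changing the model's state.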