Module rearange and add kotlin benchmark #950

Merged
merged 16 commits into from
Mar 18, 2024

Conversation

jinia91
Contributor

@jinia91 jinia91 commented Mar 16, 2024

Summary

I suggest some modifications to the current structure to make benchmark testing of the fixture-monkey-kotlin module with JMH easier.

  1. To ensure consistent results, use the same classes, and use the java-test-fixtures plugin for code reuse.

  2. Make a new module, fixture-monkey-benchmarks, to aggregate the sub-benchmark modules.

  3. Due to some local compile issues when defining Java classes within the Kotlin module, move ExpressionGeneratorJavaTestSpecs.java to a Java package in fixture-monkey-kotlin.

  4. Add default benchmark samples to the fixture-monkey-kotlin module.

     For a proper comparison, the sample test cases are planned as follows:
     
     1. Benchmark for initializing Java classes in Kotlin
     2. Benchmark for initializing Kotlin classes in Kotlin
     3. Benchmark for initializing composite classes with Kotlin-Java internal references (todo)
    
  5. Modify the GitHub Actions workflow for benchmark reporting.
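A Kotlin benchmark along the lines of samples 1 and 2 above might look roughly like the following JMH sketch. The benchmark class and method names echo the results reported later in this thread, but `KotlinOrderSheet` and its properties, the sample size, and the JMH settings are illustrative assumptions, not the actual test specs:

```kotlin
import java.util.concurrent.TimeUnit
import org.openjdk.jmh.annotations.Benchmark
import org.openjdk.jmh.annotations.BenchmarkMode
import org.openjdk.jmh.annotations.Mode
import org.openjdk.jmh.annotations.OutputTimeUnit
import org.openjdk.jmh.annotations.Scope
import org.openjdk.jmh.annotations.Setup
import org.openjdk.jmh.annotations.State
import org.openjdk.jmh.infra.Blackhole
import com.navercorp.fixturemonkey.FixtureMonkey
import com.navercorp.fixturemonkey.kotlin.KotlinPlugin
import com.navercorp.fixturemonkey.kotlin.giveMe

// Hypothetical Kotlin class under benchmark; the real specs would live
// in the shared test-fixtures source set proposed in this PR.
data class KotlinOrderSheet(val id: Long, val items: List<String>)

@State(Scope.Benchmark)
@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.MILLISECONDS)
open class KotlinObjectGenerationBenchMark {
    private lateinit var fixtureMonkey: FixtureMonkey

    @Setup
    fun setUp() {
        // KotlinPlugin enables primary-constructor-based generation for Kotlin classes.
        fixtureMonkey = FixtureMonkey.builder()
            .plugin(KotlinPlugin())
            .build()
    }

    // Sample 2: benchmark for initializing Kotlin classes in Kotlin.
    @Benchmark
    fun generateKotlinOrderSheetWithFixtureMonkey(blackhole: Blackhole) {
        blackhole.consume(fixtureMonkey.giveMe<KotlinOrderSheet>(100))
    }
}
```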

How Has This Been Tested?

Local build and CI pipeline.

Is the Document updated?

No update needed.

@jinia91 jinia91 closed this Mar 16, 2024
@jinia91 jinia91 reopened this Mar 16, 2024
@jinia91 jinia91 changed the title Module reArange for kotlin benchmark Module rearange and add kotlin benchmark Mar 16, 2024
@seongahjo
Contributor

How about adding a new module to benchmark?

@jinia91
Contributor Author

jinia91 commented Mar 16, 2024

How about adding a new module to benchmark?

I'm considering two approaches:

  • First, make a new module for benchmark integration testing.
  • Second, run benchmark tests in each module separately and only collect the reports.

From the perspective of scalability and flexibility, I chose the second approach.

However, I'm not entirely confident, given the downside of increased dependency complexity.

@jinia91
Contributor Author

jinia91 commented Mar 16, 2024

fixture-monkey

| Benchmark | Mode | Threads | Samples | Score | Score Error (99.9%) | Unit |
| --- | --- | --- | --- | --- | --- | --- |
| ManipulationBenchmark.fixed | avgt | 1 | 10 | 576.165762 | 2.457355 | ms/op |
| ManipulationBenchmark.thenApply | avgt | 1 | 10 | 1301.278260 | 5.921343 | ms/op |
| ObjectGenerationBenchmark.beanGenerateOrderSheetWithFixtureMonkey | avgt | 1 | 10 | 596.718850 | 5.575784 | ms/op |
| ObjectGenerationBenchmark.builderGenerateOrderSheetWithFixtureMonkey | avgt | 1 | 10 | 505.365829 | 3.092013 | ms/op |
| ObjectGenerationBenchmark.fieldReflectionGenerateOrderSheetWithFixtureMonkey | avgt | 1 | 10 | 572.670919 | 6.588839 | ms/op |
| ObjectGenerationBenchmark.jacksonGenerateOrderSheetWithFixtureMonkey | avgt | 1 | 10 | 607.734411 | 4.826412 | ms/op |

fixture-monkey-kotlin module

| Benchmark | Mode | Threads | Samples | Score | Score Error (99.9%) | Unit |
| --- | --- | --- | --- | --- | --- | --- |
| KotlinObjectGenerationBenchMark.beanGenerateJavaOrderSheetWithFixtureMonkey | avgt | 1 | 10 | 1029.619206 | 21.428195 | ms/op |
| KotlinObjectGenerationBenchMark.beanGenerateKotlinOrderSheetWithFixtureMonkey | avgt | 1 | 10 | 980.861308 | 14.237564 | ms/op |

@jinia91 jinia91 marked this pull request as ready for review March 16, 2024 07:19
@seongahjo
Contributor

I'm sure you've given it a lot of thought.
Can you tell me about a specific situation where flexibility is critical?
I have no idea so far, and I'm not sure it is worth the added dependency complexity.

@jinia91
Contributor Author

jinia91 commented Mar 16, 2024

Currently, the benchmark check in the pipeline feels too slow, leading to a decrease in productivity.

With the current PR, adding just two Kotlin benchmarks has increased the run time by over 4 minutes.

As fixture-monkey continues to expand and more tests are added, I expect it to take even more time in the future.

So, in the near future, pipeline changes such as:

  • running only some benchmarks based on which modules changed
  • executing each benchmark in parallel

are expected to be considered.

In such cases, one big benchmark module might lead to breaking changes.

Of course, this is just speculation on my part, and the problems of increased dependency complexity and additional points of management cannot be ignored.

@seongahjo
Contributor

seongahjo commented Mar 16, 2024

Oh, I see, thank you for your detailed explanation.
Yes, I agree that a large module hurts performance, which is bad for productivity.
Then how about a new module that contains sub-benchmark modules, like fixture-monkey-tests?

@jinia91
Contributor Author

jinia91 commented Mar 16, 2024

Oh, that sounds good.

There seems to be no need to place the benchmark testing modules under the basic module structure.
I will change the direction and manage each sub-benchmark module under fixture-monkey-benchmark.

Thank you for the great idea.
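The aggregator layout agreed on above could be wired up roughly as follows; the module paths, the choice of the me.champeau.jmh Gradle plugin, and its version are assumptions for illustration, not the final build files:

```kotlin
// settings.gradle.kts (sketch): register the aggregator and its sub-benchmark
// modules, mirroring how fixture-monkey-tests groups its sub-modules.
include("fixture-monkey-benchmarks")
include("fixture-monkey-benchmarks:fixture-monkey-kotlin-benchmarks")

// fixture-monkey-benchmarks/fixture-monkey-kotlin-benchmarks/build.gradle.kts (sketch)
plugins {
    id("me.champeau.jmh") version "0.7.2" // hypothetical plugin version
}

dependencies {
    // The module under benchmark.
    jmh(project(":fixture-monkey-kotlin"))
    // Reuse the shared test specs via the java-test-fixtures plugin,
    // as proposed in this PR, so all benchmarks exercise the same classes.
    jmh(testFixtures(project(":fixture-monkey")))
}
```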

@jinia91 jinia91 marked this pull request as draft March 16, 2024 08:45
@@ -174,4 +174,3 @@ public enum BundleType {
IDENTICAL_PRODUCT
}
}

Contributor

EOL is missing now.

@@ -180,4 +180,3 @@ public enum BundleType {
IDENTICAL_PRODUCT
}
}

Contributor

EOL is missing now.

@@ -10,6 +10,7 @@ dependencies {
implementation("org.jetbrains.kotlin:kotlin-reflect:${Versions.KOTLIN}")

testImplementation(project(":fixture-monkey-jackson"))
testImplementation(project(":fixture-monkey-javax-validation"))
Contributor

Is this dependency necessary?

Contributor Author

@jinia91 jinia91 Mar 18, 2024

In ExpressionGeneratorJavaTestSpecs, javax validation was previously being used.

Even on the main branch, compilation was not successful in my local environment; the exact reason is unclear.

After moving the module, the CI compile fails without this dependency, so it is necessary in the current situation.

@@ -16,7 +16,7 @@
* limitations under the License.
*/

package com.navercorp.fixturemonkey.kotlin;
package com.navercorp.fixturemonkey.kotlin.spec;
Contributor

I think this change has nothing to do with this PR.

Comment on lines 21 to 22
import com.navercorp.fixturemonkey.kotlin.spec.ExpressionGeneratorJavaTestSpecs.DogJava
import com.navercorp.fixturemonkey.kotlin.spec.ExpressionGeneratorJavaTestSpecs.PersonJava
Contributor

I think this change has nothing to do with this PR.

Contributor

@seongahjo seongahjo left a comment

Only minor convention changes are required.
Nice work 👍

@seongahjo seongahjo merged commit f670cf8 into naver:main Mar 18, 2024
8 checks passed
2 participants