Measure Time to First byte in JDBCSampler #211
I use JMeter to measure the performance of a database. When executing queries that return many rows, the time JMeter takes to read the ResultSet, store the result in a StringBuilder, and call toString() can exceed the query's execution time on the server.
In order to easily understand what is taking time for slow queries, I thought it might be interesting to use the "Latency" and "Connect Time" fields of the SampleResult to expose two pieces of information:

- Use connect time (instead of latency) to measure the connection time
- Use latency to measure the time at which the first ResultSet (or whatever the statement returns) is received from the connection

What do you think of that change?
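A minimal sketch of the proposed split in plain Java (this is not JMeter's actual SampleResult API; the three phases are simulated with sleeps, and `measure` is a hypothetical helper for illustration):

```java
// Sketch: split one JDBC sample into connect time, time-to-first-result
// (latency), and total elapsed time. Phase durations are simulated here;
// in the real sampler they would wrap DataSource.getConnection(),
// Statement.executeQuery() returning the first ResultSet, and full
// ResultSet consumption into a StringBuilder, respectively.
public class SampleTiming {

    static void simulateWork(long millis) {
        try {
            Thread.sleep(millis);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    // Returns {connectTime, latency, elapsed}, all relative to sample start.
    public static long[] measure() {
        long start = System.currentTimeMillis();

        simulateWork(10); // stands in for getConnection()
        long connectTime = System.currentTimeMillis() - start;

        simulateWork(20); // stands in for executeQuery() / first ResultSet
        long latency = System.currentTimeMillis() - start;

        simulateWork(30); // stands in for iterating the full ResultSet
        long elapsed = System.currentTimeMillis() - start;

        return new long[] { connectTime, latency, elapsed };
    }

    public static void main(String[] args) {
        long[] t = measure();
        System.out.println("connect=" + t[0] + " latency=" + t[1] + " elapsed=" + t[2]);
    }
}
```

With this split, a slow sample where latency is close to elapsed time points at the server-side query, while a large gap between the two points at client-side ResultSet processing.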