
Conversation

@DenverM80 (Contributor) commented May 16, 2017

   968 ClientNum[1], Log Message:   GlibThread[0] BEGIN xfer File[perf_obj_00000] Chunk[1]
   969 ClientNum[2], Log Message:   GlibThread[0] BEGIN xfer File[perf_obj_00000] Chunk[1]
   970 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[0]
   971 ClientNum[3], Log Message:   GlibThread[0] BEGIN xfer File[perf_obj_00000] Chunk[1]
   972 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[16384]
   973 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[32768]
   974 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[49152]
   975 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[65536]
   976 ClientNum[3], Log Message: ThreadNum[0] [bulk_put_performance_bucket3:perf_obj_00000] total_read[0]
   977 ClientNum[3], Log Message: ThreadNum[0] [bulk_put_performance_bucket3:perf_obj_00000] total_read[16384]
   978 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[81920]
   979 ClientNum[3], Log Message: ThreadNum[0] [bulk_put_performance_bucket3:perf_obj_00000] total_read[32768]
   980 ClientNum[3], Log Message: ThreadNum[0] [bulk_put_performance_bucket3:perf_obj_00000] total_read[49152]
   981 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[98304]
   982 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[114688]
   983 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[131072]
   984 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[147456]
   985 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[163840]
   986 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[180224]
   987 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[196608]
   988 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[212992]
   989 ClientNum[3], Log Message: ThreadNum[0] [bulk_put_performance_bucket3:perf_obj_00000] total_read[65536]
   990 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[229376]
   991 ClientNum[3], Log Message: ThreadNum[0] [bulk_put_performance_bucket3:perf_obj_00000] total_read[81920]
   992 ClientNum[3], Log Message: ThreadNum[0] [bulk_put_performance_bucket3:perf_obj_00000] total_read[98304]
   993 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[245760]
   994 ClientNum[1], Log Message: ThreadNum[0] [bulk_put_performance_bucket1:perf_obj_00000] total_read[0]
   995 ClientNum[3], Log Message: ThreadNum[0] [bulk_put_performance_bucket3:perf_obj_00000] total_read[114688]
   996 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[262144]
   997 ClientNum[1], Log Message: ThreadNum[0] [bulk_put_performance_bucket1:perf_obj_00000] total_read[16384]
   998 ClientNum[3], Log Message: ThreadNum[0] [bulk_put_performance_bucket3:perf_obj_00000] total_read[131072]
   999 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[278528]
  1000 ClientNum[3], Log Message: ThreadNum[0] [bulk_put_performance_bucket3:perf_obj_00000] total_read[147456]
  1001 ClientNum[3], Log Message: ThreadNum[0] [bulk_put_performance_bucket3:perf_obj_00000] total_read[163840]
  1002 ClientNum[2], Log Message: ThreadNum[0] [bulk_put_performance_bucket2:perf_obj_00000] total_read[294912]
...
 ==22803== LEAK SUMMARY:
 ==22803==    definitely lost: 0 bytes in 0 blocks
 ==22803==    indirectly lost: 0 bytes in 0 blocks
 ==22803==      possibly lost: 0 bytes in 0 blocks
 ==22803==    still reachable: 2,346 bytes in 12 blocks
 ==22803==         suppressed: 0 bytes in 0 blocks
 ==22803==
 ==22803== For counts of detected and suppressed errors, rerun with: -v
 ==22803== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 0 from 0)
 [100%] Built target mem

@DenverM80 force-pushed the concurrent_xfers_test branch 2 times, most recently from 401a737 to 46f1339 on May 31, 2017 23:05
…performance test; default to CMake release builds in POSIX
@DenverM80 force-pushed the concurrent_xfers_test branch from d567bc5 to 30da67f on June 8, 2017 01:58
g_mutex_lock(&pool->mutex);                                  /* serialize access to the connection pool */
curl_easy_reset(connection);                                 /* clear per-request state before the handle is reused */
pool->tail = _pool_inc(pool->tail, pool->num_connections);   /* advance the ring-buffer tail to release the slot */
printf(" release connection [%d]\n", pool->tail);            /* temporary debug trace (see discussion below) */
Contributor


Is there a reason this is a printf rather than a ds3_log_message?

Contributor Author


ds3_log_message requires the client, which we don't have here, and this message will be removed before the final commit. I added it only to visually verify that the locking/unlocking is happening asynchronously.
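
For reference, a minimal sketch of what the trace could look like if a log handle were reachable from the pool. This assumes the SDK's ds3_log_message(const ds3_log* log, ds3_log_lvl lvl, const char* fmt, ...) signature and DS3_DEBUG level; the pool->log field is hypothetical and does not exist in the current code, which is exactly why the temporary printf is used instead.

/* Hypothetical variant: only valid if the pool (or its caller) carried a ds3_log*.
 * Assumed signature: ds3_log_message(const ds3_log* log, ds3_log_lvl lvl, const char* fmt, ...) */
ds3_log_message(pool->log, DS3_DEBUG, "release connection [%d]", pool->tail);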

@DenverM80 force-pushed the concurrent_xfers_test branch from 87cc5d7 to e80f8f8 on June 15, 2017 21:49
}

ds3_str* ds3_str_dup(const ds3_str* string) {
    if (string == NULL) {
Contributor


+1 for null check
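
For context, a sketch of the full null-guarded duplication pattern being approved here. It assumes the ds3_str type exposes its contents through a value pointer and that ds3_str_init(const char*) is the existing constructor; both are taken to come from the SDK's string header, which is not shown in this diff excerpt.

ds3_str* ds3_str_dup(const ds3_str* string) {
    if (string == NULL) {
        return NULL;  /* propagate NULL instead of dereferencing a NULL input */
    }
    /* Assumed: ds3_str_init copies the underlying buffer into a freshly allocated ds3_str */
    return ds3_str_init(string->value);
}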

@GraciesPadre merged commit 65b5ed4 into SpectraLogic:master Jun 16, 2017
DenverM80 added a commit to SpectraLogic/ds3_autogen that referenced this pull request Jun 16, 2017
@DenverM80 deleted the concurrent_xfers_test branch July 28, 2017 19:25