BUG 1283: erlang_js uses non-thread-safe driver function #15

Merged
merged 1 commit into master from bz1283-thread-safety on Dec 29, 2011

Conversation

slfritchie
Contributor

Using `driver_output_term()` in an async worker pool thread
is a big no-no.  A solution is to use `driver_output_term()`
in the `ready_async()` function, which executes in the proper
thread context.
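
For reference, a minimal sketch of the pattern the fix follows, assuming the usual `erl_driver.h` async-job layout; the struct and function names below are hypothetical, not the actual erlang_js source. The async worker only computes the result, and the `driver_output_term()` call moves into the driver's `ready_async()` callback:

    /* Minimal sketch (hypothetical names), not the actual erlang_js code. */
    #include "erl_driver.h"

    typedef struct {
        ErlDrvPort port;          /* saved when the driver is started */
    } drv_state;

    typedef struct {
        drv_state *state;
        /* ... fields holding the JS evaluation result ... */
    } drv_job;

    /* Runs in an async worker pool thread (queued via
     * driver_async(port, &key, drv_job_run, job, NULL)).
     * Do the slow work here, but do NOT call driver_output_term():
     * it is not safe outside the emulator thread. */
    static void drv_job_run(void *arg)
    {
        drv_job *job = (drv_job *)arg;
        /* ... evaluate the JavaScript, stash the result in job ... */
        (void)job;
    }

    /* Named in the ErlDrvEntry's ready_async field.  The emulator calls
     * it in the proper (scheduler) thread context once the async job is
     * done, so this is the safe place to send the result to Erlang. */
    static void drv_ready_async(ErlDrvData handle, ErlDrvThreadData async_data)
    {
        drv_state *state = (drv_state *)handle;
        drv_job *job = (drv_job *)async_data;

        ErlDrvTermData reply[] = {
            ERL_DRV_ATOM, driver_mk_atom((char *)"ok")
            /* ... plus the result terms built from job ... */
        };
        driver_output_term(state->port, reply,
                           sizeof(reply) / sizeof(reply[0]));
        driver_free(job);
    }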

Tested via:

* Run with a debug SMP build of the Erlang VM.  There are no
  `beam/global.h:1345: Lock check assertion` errors.
* Use the following to run a large number of Map/Reduce jobs to
  try to provoke a memory leak.

The following can be cut and pasted into a Riak console/shell.
On my MBP, it runs for (very roughly) an hour.  Perhaps it should
run even longer to try to find very small memory leaks?

    {ok, C} = riak:local_client().
    [C:put(riak_object:new(<<"test">>, list_to_binary(integer_to_list(X)), list_to_binary(integer_to_list(X)))) || X <- lists:seq(1,500)].
    [C:put(riak_object:new(<<"test">>, list_to_binary(integer_to_list(X)) ,<<"98123489123498123498123498123893489712348974123789432789234178942318794213897234178912348791234789412378923417894123789412378941237894123789412387943128794312879123478941237894123789412378941239781243789213487914237891423789142378914234">>)) || X <- lists:seq(1,250)].
    io:format("Start time = ~p ~p\n", [time(), now()]),
    [spawn(fun() ->
               {ok, C2} = riak:local_client(),
               [{ok, [_]} = C2:mapred_bucket(<<"test">>,
                                [{map, {jsfun, <<"Riak.mapValuesJson">>}, none, false},
                                 {reduce, {jsfun, <<"Riak.reduceSum">>}, none, true}])
                || _ <- lists:seq(1, 6*15)],
               if YY == 1 ->
                      os:cmd("say I am done now"),
                      io:format("End time = ~p ~p\n", [time(), now()]);
                  true -> ok
               end
           end) || YY <- lists:seq(1, 200)].
@kellymclaughlin
Contributor

These changes look good. Great catch, Scott! +1 to merge.

slfritchie added a commit that referenced this pull request Dec 29, 2011
BUG 1283: erlang_js uses non-thread-safe driver function
@slfritchie merged commit a877e7e into master on Dec 29, 2011
@seancribbs deleted the bz1283-thread-safety branch on April 1, 2015 22:47