
Fix race when calling `HH\facts_parse` in multiple requests

Summary:

Problem:

 - each call to `HH\facts_parse` tries to allocate `pool_size` workers
 - a hackc instance is allocated on thread enter
 - the hackc instance was only deallocated when the worker object itself
 was deallocated
 - worker objects are deallocated only after all threads have finished
 - this means that, to make progress, a request must at some point hold
 the entire pool of hackc workers
 - if two concurrent requests each hold part of the pool, neither can
 ever acquire all of it, and both deadlock

Fix:
 - deallocate each hackc as soon as the thread finishes its work

Thanks to @vladima for the help fixing this.

Fixes #8259

Test plan:

```Hack
<?hh

HH\facts_parse('/', [__FILE__], /* force_hh = */ false, /* multithreaded = */ true);

echo "factsparse test\n";
```

```
$ ~/code/hhvm/hphp/hhvm/hhvm --no-config -m server -p 8080
```

```
$ curl http://localhost:8080/test.php
factsparse test
$ ab -c 100 -n 1000 http://localhost:8080/test.php
```

This previously reliably deadlocked with `-c 2 -n 10`.
fredemmott committed Aug 10, 2018
1 parent bbba234 commit 4089ed8b5042b2b1781b833653472b48acc6f07f
Showing 9 additions and 0 deletions:

  hphp/runtime/ext/factparse/ext_factparse.cpp (+9 −0)
```diff
@@ -172,6 +172,7 @@ struct ParseFactsWorker: public BaseWorker<T> {
     m_state = T::init_state();
   }
   void onThreadExit() override {
+    m_state = T::clear_state();
     hphp_context_exit();
     hphp_session_exit();
   }
@@ -328,6 +329,10 @@ struct HHVMFactsExtractor {
     return nullptr;
   }
+
+  static state_type clear_state() {
+    return nullptr;
+  }
   static void parse_stream(
     std::istream& stream,
     const std::string& path,
@@ -393,6 +398,10 @@ struct HackCFactsExtractor {
     return acquire_facts_parser();
  }
+
+  static state_type clear_state() {
+    return std::unique_ptr<FactsParser>();
+  }
   static void mark_failed(result_type& workerResult) {
     workerResult = folly::none;
  }
```
