Commit
Fix UB in Tokenizer
tgoyne committed May 31, 2024
1 parent a77320b commit 2ed0f28
Showing 2 changed files with 2 additions and 1 deletion.
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -13,6 +13,7 @@
* The encryption code no longer behaves differently depending on the system page size, which should entirely eliminate a recurring source of bugs related to copying encrypted Realm files between platforms with different page sizes. One known outstanding bug was [RNET-1141](https://github.com/realm/realm-dotnet/issues/3592), where opening a file on a system with a larger page size than the system which wrote it would attempt to read sections of the file which had never been written to ([PR #7698](https://github.com/realm/realm-core/pull/7698)).
* There were several complicated multiprocess scenarios which could result in stale reads from encrypted files. These were very difficult to hit and would typically lead to a crash, either due to an assertion failure or a DecryptionFailed exception being thrown ([PR #7698](https://github.com/realm/realm-core/pull/7698), since v13.9.0).
* Encrypted files had some benign data races where we could memcpy a block of memory while another thread was writing to a limited range of it. It is logically impossible to ever read from that range when this happens, but Thread Sanitizer quite reasonably complains about it. We now perform a slower operation when running with TSan which avoids this benign race ([PR #7698](https://github.com/realm/realm-core/pull/7698)).
* Tokenizing strings for full-text search could pass values outside the range [-1, 255] to `isspace()`, which is undefined behavior ([PR #7698](https://github.com/realm/realm-core/pull/7698), since the introduction of FTS in v13.0.0).
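The UB fixed here is a classic `<cctype>` pitfall: `isspace()` takes an `int` whose value must be representable as `unsigned char` or be `EOF`, so passing a plain (possibly signed) `char` holding a byte ≥ 0x80 passes a negative value and is undefined behavior. A minimal sketch of the idiom used in the fix (the helper name `is_space_safe` is illustrative, not part of realm-core):

```cpp
#include <cctype>

// Passing a plain char to std::isspace() is UB when the char is negative,
// e.g. for UTF-8 continuation bytes on platforms where char is signed.
// Casting to unsigned char first maps the byte into [0, 255], which is
// always a valid argument.
bool is_space_safe(char c)
{
    return std::isspace(static_cast<unsigned char>(c)) != 0;
}
```

Without the cast, a byte like 0xC3 (the lead byte of many UTF-8 sequences) becomes -61 on signed-char platforms, which `isspace()` is not required to handle.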

### Breaking changes
* None.
2 changes: 1 addition & 1 deletion src/realm/tokenizer.cpp
@@ -61,7 +61,7 @@ std::pair<std::set<std::string>, std::set<std::string>> Tokenizer::get_search_to
         }
     };
     for (; m_cur_pos != m_end_pos; m_cur_pos++) {
-        if (isspace(*m_cur_pos)) {
+        if (isspace(static_cast<unsigned char>(*m_cur_pos))) {
             add_token();
         }
         else {
