In this work, we introduce Boolformer, the first Transformer architecture trained to perform end-to-end symbolic regression of Boolean functions. First, we show that it can predict compact formulas for complex functions which were not seen during training, when provided a clean truth table. Then, we demonstrate its ability to find approximate expressions when provided incomplete and noisy observations. We evaluate the Boolformer on a broad set of real-world binary classification datasets, demonstrating its potential as an interpretable alternative to classic machine learning methods. Finally, we apply it to the widespread task of modelling the dynamics of gene regulatory networks. Using a recent benchmark, we show that Boolformer is competitive with state-of-the-art genetic algorithms with a speedup of several orders of magnitude. Our code and models are available publicly.
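To make the task concrete: "symbolic regression of Boolean functions" means recovering a compact logic formula that reproduces a given truth table. The sketch below is a toy brute-force baseline, not the paper's Transformer-based method; the function name `boolean_regression` and the search strategy are illustrative assumptions.

```python
from itertools import product

def boolean_regression(target, n):
    """Search for a compact formula over &, |, ~ whose truth table equals
    `target`, a tuple of 2**n bits ordered as in product([0, 1], repeat=n).
    Toy illustration only -- Boolformer itself predicts formulas with a
    Transformer rather than enumerating them."""
    inputs = list(product([0, 1], repeat=n))
    # Map each distinct truth vector to the first (small) formula found for it.
    table = {}
    for i in range(n):
        table[tuple(row[i] for row in inputs)] = f"x{i}"
    for _ in range(3):  # grow expressions a few levels deep
        if target in table:
            break
        known = list(table.items())
        for vec, s in known:
            table.setdefault(tuple(1 - v for v in vec), f"~{s}")
        for v1, s1 in known:
            for v2, s2 in known:
                table.setdefault(tuple(a & b for a, b in zip(v1, v2)),
                                 f"({s1} & {s2})")
                table.setdefault(tuple(a | b for a, b in zip(v1, v2)),
                                 f"({s1} | {s2})")
    return table.get(target)

# Example target: majority-of-three, which is 1 when at least two inputs are 1.
majority = tuple(int(a + b + c >= 2) for a, b, c in product([0, 1], repeat=3))
print(boolean_regression(majority, n=3))
```

Because there are only 2**(2**n) distinct truth vectors for small n, deduplicating by truth vector keeps this search tractable; the exponential blow-up for larger n is exactly why a learned model is attractive here.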
AkihikoWatanabe changed the title from あ to Boolformer: Symbolic Regression of Logic Functions with Transformers, Stéphane d'Ascoli+, N/A, arXiv'23 on Oct 9, 2023
URL
Affiliations
Abstract
Translation (by gpt-3.5-turbo)
Summary (by gpt-3.5-turbo)