Description
I am using PriorAcquisitionFunction. Based on https://arxiv.org/pdf/2204.11051.pdf, the weight of the user-defined prior is decayed exponentially after each iteration/optimization step. I noticed that the variable self._iteration_number in the PriorAcquisitionFunction class is incremented each time the acquisition function is used. However, because the acquisition function is not used at every step (multiple configurations are proposed after each evaluation of the acquisition function), self._iteration_number is lower than the actual number of function evaluations, which keeps the weight of the prior high. I am wondering whether this is the desired behavior, or whether self._iteration_number should equal the number of evaluations rather than the number of times the acquisition function was used.
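For context, the linked πBO paper weights the acquisition function by the prior raised to a decaying exponent, π(x)^(β/n), where n is the iteration number, so the prior's influence fades as n grows. A minimal sketch (the function name and numbers below are hypothetical, not SMAC's actual implementation) of how much the two counting schemes diverge:

```python
def prior_exponent(beta: float, iteration_number: int) -> float:
    """Exponent applied to the user prior in piBO: pi(x) ** (beta / n).

    As n grows the exponent shrinks toward 0, i.e. pi(x)**0 == 1,
    so the prior's influence on the acquisition function fades.
    """
    return beta / iteration_number


beta = 10.0
# Suppose the acquisition function has been called 3 times, and each
# call proposed 10 configurations that were all evaluated:
n_af_calls = 3
n_evaluations = 30

# Counting AF calls (the behavior described above) keeps the exponent
# large, so the prior still dominates:
print(prior_exponent(beta, n_af_calls))     # 10/3, prior still strong

# Counting function evaluations would decay the prior ~10x faster:
print(prior_exponent(beta, n_evaluations))  # 10/30, prior nearly faded
```

The gap compounds over a run: with batched proposals the AF-call counter grows by 1 where the evaluation counter grows by the batch size, so the prior stays influential far longer than the evaluation-based schedule in the paper would suggest.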
Steps/Code to Reproduce
Expected Results (?)
AF evaluations (self._iteration_number=1)
X configurations proposed
AF evaluations (self._iteration_number=1+X)
Y configurations proposed
AF evaluations (self._iteration_number=1+X+Y)
Actual Results
AF evaluations (self._iteration_number=1)
X configurations proposed
AF evaluations (self._iteration_number=2)
Y configurations proposed
AF evaluations (self._iteration_number=3)
Versions
Version 2.0.2