Add hyper-parameter tuner interface #386
Conversation
import com.linkedin.photon.ml.hyperparameter.search.{GaussianProcessSearch, RandomSearch}

/**
 * A hyper-parameter tuner which depends on an internal library.
Let's make it more explicit and say internal LinkedIn library. Since internal might mean within photon-ml.
@@ -0,0 +1,72 @@
/*
A bit unclear on this part. So the AtlasTuner will move out of photon-ml, correct? And photon-ml will only contain the DummyTuner?
Will change it. Thank you!
import com.linkedin.photon.ml.cli.game.training.GameTrainingDriver.{getOrDefault, hyperParameterTuning, hyperParameterTuningIter}
import com.linkedin.photon.ml.estimators.{GameEstimator, GameEstimatorEvaluationFunction}
import com.linkedin.photon.ml.estimators.GameEstimator.GameResult
import com.linkedin.photon.ml.hyperparameter.search.{GaussianProcessSearch, RandomSearch}
Related to the above comment: in that setup, we will move all functions out of hyperparameter, correct?
@@ -0,0 +1,23 @@
/*
 * Copyright 2017 LinkedIn Corp. All rights reserved.
Make this 2018
Will change it. Thanks.
@@ -0,0 +1,63 @@
/*
So this will now move out of photon-ml into atlas-ml and depend on photon-ml for the trait, correct?
Yes
import com.linkedin.photon.ml.evaluation.Evaluator
import com.linkedin.photon.ml.hyperparameter.EvaluationFunction
import com.linkedin.photon.ml.hyperparameter.search.{GaussianProcessSearch, RandomSearch}
Just to confirm, the lines 19-22 will stay in photon-ml as traits and 23 will move to atlas-ml. Correct?
Yes. And the evaluator will be removed later.
LGTM now. @joshvfleming Please take a look and then we can move to Atlas.
def apply[T](tunerName: HyperparameterTunerName): HyperparameterTuner[T] = {
  tunerName match {
    case DUMMY => new DummyTuner[T]
    case ATLAS => new AtlasTuner[T]
There is still a dependency issue here...
Let's change it as per the offline discussion.
Force-pushed from 62549b3 to 64fb466.
object HyperparameterTunerFactory {

  val DUMMY_TUNER = "com.linkedin.photon.ml.hyperparameter.tuner.DummyTuner"
  val ATLAS_TUNER = "com.linkedin.photon.ml.hyperparameter.tuner.AtlasTuner"
Are both of these moving to Atlas? In either case, let's change the string to "com.linkedin.atlas.tuner.AtlasTuner".
Since it has not moved yet, the test with com.linkedin.atlas.tuner.AtlasTuner will report an error message. Please see the test result with an invalid tuner in li-photon-plugin.
Let's put a comment here with a brief explanation of why it's arranged this way, and that the atlas-specific code will be removed.
LGTM
object HyperparameterTunerFactory {

  val DUMMY_TUNER = "com.linkedin.photon.ml.hyperparameter.tuner.DummyTuner"
  val ATLAS_TUNER = "com.linkedin.photon.ml.hyperparameter.tuner.AtlasTuner"
Let's put a comment here with a brief explanation of why it's arranged this way, and that the atlas-specific code will be removed.
// TODO: Move AtlasTuner into atlas-ml for the auto-tuning system migration:
// TODO: val ATLAS_TUNER = "com.linkedin.atlas.ml.hyperparameter.tuner.AtlasTuner".
// TODO: Temporarily stay in photon-ml for test purpose.
val ATLAS_TUNER = "com.linkedin.photon.ml.hyperparameter.tuner.AtlasTuner"
Comments added here @basukinjal @joshvfleming
LGTM. @joshvfleming Lets merge this.
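The fully-qualified class-name strings above suggest the factory instantiates tuners by reflection, which is what lets an atlas-ml implementation be swapped in later without a compile-time dependency. A minimal sketch of that pattern, with assumed names (this is not photon-ml's actual factory code):

```scala
// Hypothetical sketch: instantiating a tuner from a fully-qualified class
// name via reflection, so the caller needs no compile-time dependency on
// the library that provides the implementation. All names are illustrative.
trait Tuner {
  def name: String
}

class DummyTuner extends Tuner {
  def name: String = "dummy"
}

object ReflectiveTunerFactory {
  // Loads and instantiates the class by name; an unknown class name throws
  // ClassNotFoundException, which is the error an invalid-tuner test would see.
  def apply(className: String): Tuner =
    Class.forName(className)
      .getDeclaredConstructor()
      .newInstance()
      .asInstanceOf[Tuner]
}
```

Until the Atlas class actually exists on the classpath, pointing the string at com.linkedin.atlas... would fail at `Class.forName`, which matches the concern raised above about changing the string before the move.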
@cmjiang The test failure here appears to be legitimate. I can reproduce it locally:
I don't like that there is both a tuner name and a tuning mode now; one is really a subtype of the other. The end goal is that we want to do either no tuning, random tuning, or Bayesian tuning. Now this is split over two inputs, many combinations of which are invalid. Why even have a "dummy" tuner? Why not have an
@ashelkovnykov The idea is that rather than an if/else branch in the driver code, the factory would default to a noop tuner in the case of no tuning. That way, the logic of deciding what to do is contained in the factory instead of the driver. But we'll probably need to check whether a parameter has been passed for this in the first place, so perhaps branching on Some/None is okay.
@ashelkovnykov @joshvfleming I think this is being done just so that photon-ml does not need to depend on atlas-ml in the short term. Once atlas-ml is open-sourced, this branching can be removed: photon-ml will depend directly on atlas-ml and the driver will work only with None, Random or Bayesian. We can then maybe remove this tunerFactory.
@basukinjal @ashelkovnykov Yeah. The whole point of this is that it's an intermediate step to create a functioning separation between the two codebases. It might be ugly to have the interface in place, but that's better than having the entire atlas codebase embedded here as it is now.
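The "factory defaults to a noop tuner" idea discussed above can be sketched as follows. This is an illustration under assumed names, not photon-ml's actual API: the factory maps "no tuning requested" to a no-op tuner, so the driver calls `search` unconditionally and never branches.

```scala
// Illustrative sketch (assumed names): the factory, not the driver, decides
// that "no tuning" means a no-op tuner that produces zero tuned models.
trait HyperparameterTuner[T] {
  def search(n: Int, evaluate: Double => (T, Double)): Seq[T]
}

class NoOpTuner[T] extends HyperparameterTuner[T] {
  // Performs no search and produces no tuned models.
  def search(n: Int, evaluate: Double => (T, Double)): Seq[T] = Seq.empty
}

class RandomTuner[T] extends HyperparameterTuner[T] {
  // Stand-in for a real random search: evaluates n random points.
  def search(n: Int, evaluate: Double => (T, Double)): Seq[T] =
    Seq.fill(n)(evaluate(scala.util.Random.nextDouble())._1)
}

object TunerFactory {
  // None means "no tuning requested": return a no-op tuner rather than
  // forcing an if/else branch in the driver.
  def apply[T](mode: Option[String]): HyperparameterTuner[T] = mode match {
    case Some("RANDOM") => new RandomTuner[T]
    case None           => new NoOpTuner[T]
    case Some(other)    => throw new IllegalArgumentException(s"Unknown tuning mode: $other")
  }
}
```

The driver then always runs `TunerFactory(mode).search(...)`; with `None` the result is simply empty, which also explains why such a run produces no tuned models.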
@@ -363,7 +363,8 @@ class GameTrainingDriverIntegTest extends SparkTestUtils with GameTestUtils with
     val newArgs = mixedEffectSeriousRunArgs
       .copy
       .put(GameTrainingDriver.rootOutputDirectory, outputDir)
-      .put(GameTrainingDriver.outputMode, ModelOutputMode.TUNED)
+      .put(GameTrainingDriver.outputMode, ModelOutputMode.ALL)
Here we must change the outputMode to ModelOutputMode.ALL. If we use ModelOutputMode.TUNED, we don't have any tuned model with a dummy tuner; it then returns an empty output model, which causes a failure at line 377 below.
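The failure mode described in this comment can be shown with a toy sketch. Only the ModelOutputMode values come from the diff; the result type and selection function here are assumed for illustration:

```scala
// Toy illustration (assumed names except ModelOutputMode.ALL / TUNED):
// with a dummy (no-op) tuner, no result comes from tuning, so selecting
// TUNED models yields an empty set while ALL keeps the untuned base models.
object ModelOutputMode extends Enumeration {
  val ALL, TUNED = Value
}

case class GameResult(model: String, fromTuning: Boolean)

def modelsToOutput(results: Seq[GameResult], mode: ModelOutputMode.Value): Seq[GameResult] =
  mode match {
    case ModelOutputMode.TUNED => results.filter(_.fromTuning)
    case _                     => results
  }
```

With only a base model present, TUNED selects nothing, which is exactly why the integration test's non-empty-output assertion fails under the dummy tuner.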
Actually, I am wondering if we can remove this testHyperParameterTuning in GameTrainingDriverIntegTest. Since we won't have atlas-ml in photon-ml, all the following lines in this test become meaningless for testing.
Probably best to comment it out, since once Atlas is open-sourced the test will be relevant again.
@cmjiang Great find! Let's get the ball rolling on atlas now. This LGTM. @joshvfleming Let's merge!
Commented the test out and put a comment on this change.
@joshvfleming @basukinjal @yunboouyang
This PR, adding a hyper-parameter tuner, has no conflicts with PR#385. After this PR and PR#385 are both merged, let's merge PR#377 rebased on them to completely decouple the auto-tuning system.