Conversation

@yogeshg yogeshg commented Apr 11, 2018

  • check that un-tunable parameters are defined
  • check that tunable parameters are either defined or specified in the param map (this is checked against all param maps given to fitMultiple)
  • add a test to verify the method works correctly

codecov-io commented Apr 12, 2018

Codecov Report

Merging #116 into master will increase coverage by 0.31%.
The diff coverage is 100%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #116      +/-   ##
==========================================
+ Coverage   83.45%   83.76%   +0.31%     
==========================================
  Files          34       34              
  Lines        1994     1989       -5     
  Branches       44       44              
==========================================
+ Hits         1664     1666       +2     
+ Misses        330      323       -7
Impacted Files Coverage Δ
python/sparkdl/param/shared_params.py 86.4% <ø> (+1.08%) ⬆️
python/sparkdl/param/image_params.py 79.41% <ø> (+1.91%) ⬆️
python/sparkdl/param/__init__.py 100% <ø> (ø) ⬆️
...n/sparkdl/estimators/keras_image_file_estimator.py 88.88% <100%> (+3.17%) ⬆️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 841f905...84c7855. Read the comment docs.

@sueann sueann self-requested a review April 13, 2018 21:46
shutil.rmtree(self.temp_dir, ignore_errors=True)

def test_validate_params(self):
"""Test that `KerasImageFileEstimator._validateParams` method works as expected"""
Collaborator
we probably could be more thorough in making sure all the intersect/diff logic works but I think it's good enough.

@sueann sueann left a comment

lgtm. just a small comment

raise ValueError("Output column must be defined")
if self.inputCol in paramMap:
raise ValueError("Input column can not be fine tuned")
model_params = [self.kerasOptimizer, self.kerasLoss, self.kerasFitParams]
Collaborator
merge model_params and output_params into tunable_params for clarity
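The suggested merge might look like the sketch below. It uses plain strings where the real estimator holds `Param` objects on `self`, and the output-side param names are assumed for illustration.

```python
# Hypothetical param names, taken from the snippet under review.
model_params = ["kerasOptimizer", "kerasLoss", "kerasFitParams"]
output_params = ["outputCol"]  # assumed; the real list may differ

# Review suggestion: combine the two lists under one name for clarity,
# so downstream checks iterate over a single tunable_params list.
tunable_params = model_params + output_params
```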

yogeshg commented Apr 13, 2018

Offline review with @smurching; added an error for untunable params.

Raised #117 in case this becomes a blocker; we are good with merging as is.

@sueann sueann merged commit 2a63d4f into databricks:master Apr 13, 2018