How to get all parameters inside a step. #137

Closed
valayDave opened this issue Feb 21, 2020 · 4 comments

valayDave commented Feb 21, 2020

I am trying to get all the parameters I have declared in my flow from inside a step, but I am not able to do so with the _get_parameters() method of the FlowSpec object.

import sys

from metaflow import FlowSpec, Parameter, conda_base, step


def get_python_version():
    # Helper (as in the Metaflow conda tutorial) that returns the local interpreter version.
    return '%d.%d.%d' % sys.version_info[:3]


# Use the specified version of python for this flow.
@conda_base(python=get_python_version(), libraries={'pytorch': '1.4.0'})
class ParameterDebugFlow(FlowSpec):

    some_data = Parameter('some_data', help='Mock Data', default=128)

    @step
    def start(self):
        # Attempt to list every declared Parameter.
        for var, param in self._get_parameters():
            print(param)

        self.next(self.end)

    @step
    def end(self):
        print("Flow ended")


if __name__ == '__main__':
    ParameterDebugFlow()

Is this intended functionality? If so, how do I retrieve the Parameters I have declared in my class? I am porting a DL project that is fairly well parameterized using argparse for GPU-based instance training. I want to move the project to Metaflow to perform a broader hyperparameter search.

romain-intel self-assigned this Feb 23, 2020

romain-intel (Contributor) commented Feb 23, 2020

The _get_parameters function is meant to be used internally and won't return what you expect because, by the time you call it in your step, we have replaced all parameters with a read-only property (look for _init_parameters in task.py).

What is your specific need for knowing the names of the parameters (you can access the values simply using self.<parameter_name>)?
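
For example (a minimal sketch using the some_data Parameter from the flow above), the value can be read directly as an attribute inside any step:

@step
def start(self):
    # Parameter values are exposed as read-only attributes on the flow instance.
    print("some_data is", self.some_data)
    self.next(self.end)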


valayDave (Author) commented Feb 24, 2020

I was moving a DL project into Metaflow for a broader hyperparameter search. I have 15+ Parameters in the experiment. Dynamically extracting those parameters would be useful when comparing the best model parameters across previous flow runs.

When experimenting with DL, the number of parameters often varies. After testing lots of flows with varying params, it would be a very useful utility to be able to extract them after each run, or even during a run.
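
For comparing parameter values across previous runs, one option is Metaflow's Client API. A rough sketch, assuming the ParameterDebugFlow flow above and a parameter name that is already known (which is exactly the assumption this issue is trying to remove):

from metaflow import Flow

# Print the value of a known parameter for each successful past run of the flow.
for run in Flow('ParameterDebugFlow').runs():
    if run.successful:
        print(run.id, run.data.some_data)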


romain-intel (Contributor) commented Feb 25, 2020

Got it, so you effectively want a way to iterate over the names of the parameters, something like:

for param_name in self.parameters():
    print("Parameter %s has value %s" % (param_name, getattr(self, param_name)))

Is this correct?


valayDave (Author) commented Feb 25, 2020

Yes, exactly like that.

romain-intel pushed a commit that referenced this issue Mar 6, 2020
Romain Cledat
1. Use a smaller standalone Conda installer for AWS Batch
2. Add METAFLOW_S3_ENDPOINT_URL configuration (#130)
3. Use the CLI datastore-root before checking for METAFLOW_DATASTORE_SYSROOT_S3
4. Fix an issue where using the local metadata provider with Batch resulted
   in .metaflow/.metaflow instead of just .metaflow
5. Add a way to get parameter names passed to a flow (using
   current.parameter_names) (#137)
6. Properly indent on show (#92)
7. Suppress superfluous message when running on Batch
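
With item 5 in place, iterating over parameter names from inside a step looks roughly like the following minimal sketch (current.parameter_names lists the names of the declared Parameters; the values are still read as attributes on self):

from metaflow import FlowSpec, Parameter, current, step


class ParameterDebugFlow(FlowSpec):

    some_data = Parameter('some_data', help='Mock Data', default=128)

    @step
    def start(self):
        # Iterate over the names of all declared Parameters and read their values.
        for name in current.parameter_names:
            print("Parameter %s has value %s" % (name, getattr(self, name)))
        self.next(self.end)

    @step
    def end(self):
        print("Flow ended")


if __name__ == '__main__':
    ParameterDebugFlow()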