The most important thing missing in documentation #57
@treysp is correct, but it is true that pushing the data that comes out of the recipe back into a formula is a bit underwhelming. I'm in the process of making a recipe method for …
Cool. It is not clear that … Sorry for nitpicking. Hope the feedback is useful.
Absolutely!
We're low on verbs to use. I would say that `process` is the last step prior to model fitting (hopefully). The "pre-" is really pre-model fitting.
The idea is that preprocessing of the data can be part of the overall modeling process, and some of the steps involve estimating parameters in ways that are similar to fitting a classical model. For example, in the machine learning communities one would need to estimate/train/learn a model such as an autoencoder prior to fitting a neural network; the terminology here is closer to that usage.
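To make the "estimating parameters" point concrete, here is a minimal sketch using the verbs from a later recipes release (`prep()` to estimate, `bake()` to apply); the verb and argument names under discussion in this thread may differ, and the `mtcars` split is just a toy example.

```r
library(recipes)

train <- mtcars[1:25, ]

# Centering and scaling have parameters (per-column means and standard
# deviations) that must be estimated from the training data, much like
# fitting a small model.
rec <- recipe(mpg ~ ., data = train) %>%
  step_center(all_predictors()) %>%
  step_scale(all_predictors())

# prep() performs the estimation step
rec_prepped <- prep(rec, training = train)

# tidy() shows the statistics estimated for a given step,
# e.g. the means used for centering
tidy(rec_prepped, number = 1)
```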
Maybe using verbs that imply an ordering would clarify the process for people outside machine learning communities (like me). Although one could take it too far, what about leveraging the recipe metaphor? Maybe something like:
Not sure how hokey that sounds.
What about …
I kind of like …
So I have the recipe. What should I do next? How do I pass the recipe to `lm`, for example?
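For reference, a minimal end-to-end sketch, again assuming the `prep()`/`bake()` verbs from a later recipes release (older versions used different verb and argument names): prepare the recipe, bake out an ordinary data frame, and hand that to `lm()` with a formula.

```r
library(recipes)

train <- mtcars[1:25, ]
test  <- mtcars[26:32, ]

rec <- recipe(mpg ~ ., data = train) %>%
  step_center(all_predictors()) %>%
  step_scale(all_predictors())

# estimate the preprocessing parameters from the training data
rec_prepped <- prep(rec, training = train)

# bake() returns a plain data frame/tibble that lm() can use with a formula
train_baked <- bake(rec_prepped, new_data = train)
lm_fit <- lm(mpg ~ ., data = train_baked)

# apply the same (already estimated) preprocessing to new data before predicting
test_baked <- bake(rec_prepped, new_data = test)
predict(lm_fit, newdata = test_baked)
```

The key point is that the preprocessing statistics are estimated once on the training data and then re-applied, unchanged, to any new data before prediction.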