
Conversation

@HugoGranstrom (Member) commented Nov 16, 2022

It's about time I write at least some tutorials for numericalnim's functionality. Plus, it would let me clean up my massive README and just link here 😎

  • ODE
    • Draft
    • Finished
  • Optimization
    • Draft
    • Finished
  • Interpolation
    • Draft
    • Finished
  • Curve fitting
    • Draft
    • Finished
  • Add parameter uncertainty and chi2 to numericalnim and use it in curve_fitting tutorial.
  • Finally set up docs CI for numericalnim and link it.

@HugoGranstrom (Member, Author)

I've (finally) finished the drafts of 4 articles now. All feedback is appreciated :D

On the ODE article, I think I will remove the comparisons between all the methods and instead just showcase a few. It feels like they slow down the build time for no added value.

@Clonkk (Member) commented Jan 1, 2023

Nice, I'll see if I can take a look tomorrow or Wednesday

@Vindaar (Member) commented Jan 1, 2023

Yup, I will also give it a read soon!

@pietroppeter (Contributor)

Hi, nice work. Started reading too!

Remarks on curve fitting:

  • I would start by introducing the problem of curve fitting and only later mention what you use to solve it (the nonlinear solver)
  • also, when introducing the example, you could mention that in general one does not know which curve to fit, but that to keep the example simple you use the same curve both to generate the points and to fit them
  • I noted that `randomTensor` is actually not Gaussian noise but uniform from 0 to max (see the sketch after this list)
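Something like this is what I have in mind, as a rough Nim sketch. I'm assuming `levmarq` and its `f(params, x)` model signature from the README, so treat the exact names as assumptions:

```nim
import std/math
import arraymancer, numericalnim

# The model is used both to generate the data and to fit it.
proc fitFunc(params: Tensor[float], x: float): float =
  params[0] * exp(-params[1] * x)

let
  trueParams = [2.0, 0.8].toTensor
  # Qualified call: numericalnim exports a seq-returning linspace too.
  xData = arraymancer.linspace(0.0, 5.0, 50)
  # randomTensor is uniform on [0, max); shift it to be zero-centered.
  noise = 0.1 * (randomTensor(50, 1.0) -. 0.5)

var yData = newTensor[float](50)
for i in 0 ..< 50:
  yData[i] = fitFunc(trueParams, xData[i]) + noise[i]

let guess = [1.0, 1.0].toTensor
echo levmarq(fitFunc, guess, xData, yData)  # should land near (2.0, 0.8)
```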

@pietroppeter (Contributor)

For ODE:

  • you do mention that numerical methods are used when you do not have an exact solution, but later you only solve ODEs with exact solutions. You could mention that you do this in order to be able to check the error (see the sketch after this list)
  • in the last benchy run the headers are missing, not sure why...
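As a concrete picture of what I mean, something like this (assuming the `solveODE` signature with a `NumContext` argument, as in the README):

```nim
import std/math
import numericalnim

# y' = -y with y(0) = 1 has the exact solution y(t) = exp(-t),
# so the numerical error can be checked directly.
proc f(t: float, y: float, ctx: NumContext[float, float]): float = -y

let tspan = linspace(0.0, 2.0, 21)
let (ts, ys) = solveODE(f, 1.0, tspan, integrator = "dopri54")

var maxErr = 0.0
for i in 0 ..< ts.len:
  maxErr = max(maxErr, abs(ys[i] - exp(-ts[i])))
echo "max error: ", maxErr
```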

@pietroppeter (Contributor)

For optimization:

  • you apply 3 methods; are all of them available in numericalnim? A link to their Wikipedia pages could be useful (at least to me, I did not know about L-BFGS)
  • on the analytical gradient you mention it improves time, but you do not check the time when applying it to L-BFGS; you could run benchy for that too (see the sketch after this list)
  • also on the analytical gradient: is L-BFGS the only method that takes it as an option? (I guess the others do too)
  • the option parameter should probably be backticked (`tol`)
  • is there a reference API page for numericalnim that could be linked? I guess this could be useful for the other pages too
  • you reference curve fitting. I imagine the reason is that you might want to minimize a function of which you only know some points; in that case you could use curve fitting. It could be helpful to mention why you link to it
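For the analytic-gradient option, I picture something like this sketch; the `lbfgs`, `lbfgsOptions`, `tol`, and `analyticGradient` names are my guesses at the API, so check them against the optimize module:

```nim
import std/math
import arraymancer, numericalnim

# Rosenbrock function: minimum at (1, 1).
proc f(x: Tensor[float]): float =
  (1.0 - x[0])^2 + 100.0 * (x[1] - x[0]^2)^2

# Analytic gradient, sparing the solver its finite-difference calls.
proc grad(x: Tensor[float]): Tensor[float] =
  [-2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]^2),
   200.0 * (x[1] - x[0]^2)].toTensor

let x0 = [0.0, 0.0].toTensor
# Hypothetical parameter names, for illustration only:
echo lbfgs(f, x0, options = lbfgsOptions[float](tol = 1e-8),
           analyticGradient = grad)
```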

@pietroppeter (Contributor)

On interpolation:

  • to check the difference on 2D functions, maybe you could plot a heatmap of the error? Comparing two heatmaps is hard. It could also be nice to plot the grid points (see the sketch after this list)
  • like in the other places, I guess it could help to have links to the Wikipedia pages of the methods; if there is a reference API page it could be linked too
  • here I guess it would also be appropriate to link curve fitting? And maybe interpolation should be linked from optimization? I guess if you are confident about the parametric form of the function to fit you should use curve fitting, and if you are not you could use interpolation (not sure though if it can be used in optimization, does it also extrapolate?)
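For the heatmap idea, I imagine computing a dense grid of absolute errors and handing it to a heatmap plot, roughly like below; `newBicubicSpline` and `eval` are my guesses at the 2D interpolation names, so adjust to whatever the tutorial actually uses:

```nim
import std/math
import arraymancer, numericalnim

proc f(x, y: float): float = sin(x) * cos(y)

# Coarse grid of known values on [0, 1] x [0, 1]: the interpolation input.
const N = 10
var z = newTensor[float](N, N)
for i in 0 ..< N:
  for j in 0 ..< N:
    z[i, j] = f(i / (N - 1), j / (N - 1))

let spline = newBicubicSpline(z, (0.0, 1.0), (0.0, 1.0))

# Dense grid of absolute errors: exactly what the heatmap would show.
const M = 100
var err = newTensor[float](M, M)
for i in 0 ..< M:
  for j in 0 ..< M:
    let (x, y) = (i / (M - 1), j / (M - 1))
    err[i, j] = abs(spline.eval(x, y) - f(x, y))
```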

@pietroppeter (Contributor)

All in all very nice work, I think the examples are all very good for showing off the libraries. Great job!

@HugoGranstrom (Member, Author)

> Hi, nice work. Started reading too!

Many thanks for the feedback 😄 Most of the points have been addressed now.

> I noted that `randomTensor` is actually not Gaussian noise but uniform from 0 to max

Good catch! I opted for just using uniform noise instead, since I couldn't find any builtin Gaussian noise constructor in arraymancer (see the sketch below).
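For reference, if Gaussian noise is wanted after all, the stdlib can stand in for the missing constructor; a small sketch:

```nim
import std/random
import arraymancer

# Uniform noise from randomTensor, shifted to be centered on zero:
let uniformNoise = randomTensor(50, 1.0) -. 0.5

# Gaussian noise via std/random's gauss, filling a tensor manually:
var gaussNoise = newTensor[float](50)
for i in 0 ..< 50:
  gaussNoise[i] = gauss(mu = 0.0, sigma = 0.1)
```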

> in the last benchy run the headers are missing, not sure why...

I think benchy only puts the header on the first call in the file; otherwise it would insert the header before every benchmark. It is controlled by a non-exported variable here, so I don't think there is much we can do right now, sadly.

> you apply 3 methods; are all of them available in numericalnim? A link to their Wikipedia pages could be useful (at least to me, I did not know about L-BFGS)

I have added links to most of the methods (except in the ODE article, because that list is autogenerated). It's a good point that the reader might want to read up on things further themselves.

> on the analytical gradient you mention it improves time, but you do not check the time when applying it to L-BFGS; you could run benchy for that too

The problem is so small that the difference between numerical and analytical isn't big; the numerical gradient only costs around 4 extra calls to `f` (see the sketch below). I ran a benchmark now and got 3.3 ms vs 3.1 ms. It's when we start to add many more parameters that the difference should start to show. And I'm not tempted to add such an example here just to prove a point 😅
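To spell out where those 4 calls come from, a central-difference gradient looks roughly like this (a generic sketch, not numericalnim's actual implementation):

```nim
import arraymancer

# Central differences: two evaluations of f per parameter,
# i.e. four calls in total for a two-parameter problem.
proc numGrad(f: proc (x: Tensor[float]): float,
             x: Tensor[float], h = 1e-6): Tensor[float] =
  result = zeros[float](x.shape[0])
  for i in 0 ..< x.shape[0]:
    var xp = x.clone()
    var xm = x.clone()
    xp[i] = xp[i] + h
    xm[i] = xm[i] - h
    result[i] = (f(xp) - f(xm)) / (2.0 * h)
```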

> is there a reference API page for numericalnim that could be linked? I guess this could be useful for the other pages too

Not currently, but I should really get `nim doc` documentation added to the CI. I tried it locally now and it's so simple :o

> to check the difference on 2D functions, maybe you could plot a heatmap of the error? Comparing two heatmaps is hard. It could also be nice to plot the grid points

Very good point, and the heatmap turned out rather pretty as well :) You can really see the areas of higher error between the grid points.

@Vindaar (Member) left a review

Great work! Only a bunch of minor typos etc. from my side!

@Vindaar (Member) left a review

Great work!

@HugoGranstrom (Member, Author)

@pietroppeter

I missed this question:

> not sure though if it [interpolation] can be used in optimization, does it also extrapolate?

It does not extrapolate, and I'm not sure it would have helped even if it did. Extrapolations are often either constant or linear, and neither is really great for optimization: the first has zero derivative and the second goes on forever. What could make interpolation useful in optimization, though, is adding a bounding box to the optimization routine so that it stops when it reaches the edge and only walks along it instead of crossing it.

Also, I have finally set up docs for numericalnim and added links to them under Further reading.

I think I have taken almost all feedback into consideration now. So I'll read through it a few times tomorrow and merge this in the evening unless I or someone else finds any important errors.

@HugoGranstrom merged commit 86b6e6b into main on Jan 6, 2023
@HugoGranstrom (Member, Author)

Here we go, fingers crossed for no build errors! 🤞
