
Support interop between DiffSharp, MathNet, and FSharp.Stats for optimization problems #382

Open
AndyAbok opened this issue Sep 16, 2021 · 3 comments

Comments


AndyAbok commented Sep 16, 2021

It would be great if DiffSharp could support interop with MathNet, mostly for the optimization module, so as to take advantage of automatic differentiation capabilities in optimization problems.


#r "nuget: DiffSharp-lite,1.0.0-preview-781810429"
#r "nuget: MathNet.Numerics.FSharp"

open System
open MathNet.Numerics
open MathNet.Numerics.Optimization
open MathNet.Numerics.LinearAlgebra
open DiffSharp

let rosenbrockFunction (xs: Vector<float>) =
    let x, y = xs.[0], xs.[1]
    pown (1.0 - x) 2 + 100.0 * pown (y - pown x 2) 2

let gradient (x: Vector<float>) =
    // redefine this function to consume a Tensor
    let rosenbrockGrad (x: Tensor) =
        let x, y = x.[0], x.[1]
        pown (1.0 - x) 2 + 100.0 * pown (y - pown x 2) 2

    let toTensor = dsharp.tensor(x)
    let diffObjectiveFun = dsharp.grad rosenbrockGrad toTensor
    // the gradient comes back as a Tensor; convert it to the Vector that Math.NET expects
    diffObjectiveFun |> vector


let LBFGS f grad initialVal =
    let objectiveFunction =
        new System.Func<Vector<float>,float>(f)
    
    let gradientFunction =
        new System.Func<Vector<float>,Vector<float>>(grad)
    
    let obj = ObjectiveFunction.Gradient(objectiveFunction,gradientFunction) 
    let solver =  LimitedMemoryBfgsMinimizer(1e-5, 1e-5, 1e-5, 5, 1000)   
    let result = solver.FindMinimum(obj,initialVal)
    result.MinimizingPoint   
    |> Vector.toSeq

let initialVal = [|0.0;0.0|] |> vector

let res = LBFGS rosenbrockFunction gradient initialVal

I think this would apply to FSharp.Stats as well, as it also uses vectors in these kinds of problems.


gbaydin commented Sep 16, 2021

Hi @AndyAbok, thanks for sharing this idea with some example code. It's really helpful for understanding how this type of scenario can work.

I used the code you shared and the following revision of it works for me:

[<AutoOpen>]
module Converter = 
    let vectorToTensor (x:Vector<float>) = x.AsArray() |> dsharp.tensor
    let tensorToVector (x:Tensor) = x.toArray1D() |> vector
    let tensorToScalar (x:Tensor) = x |> float


let rosenbrock (xs: Tensor) =
    let x, y = xs.[0], xs.[1]
    pown (1.0 - x) 2 + 100.0 * pown (y - pown x 2) 2

// Vector<float> -> float
let rosenbrockV = vectorToTensor >> rosenbrock >> tensorToScalar
// Vector<float> -> Vector<float>
let rosenbrockGradV = vectorToTensor >> dsharp.grad rosenbrock >> tensorToVector


let LBFGS f grad initialVal =
    let objectiveFunction =
        new System.Func<Vector<float>,float>(f)
    
    let gradientFunction =
        new System.Func<Vector<float>,Vector<float>>(grad)
    
    let obj = ObjectiveFunction.Gradient(objectiveFunction,gradientFunction) 
    let solver =  LimitedMemoryBfgsMinimizer(1e-5, 1e-5, 1e-5, 5, 1000)   
    let result = solver.FindMinimum(obj,initialVal)
    result.MinimizingPoint   
    |> Vector.toSeq

let initialVal = [|0.0;0.0|] |> vector
printfn "%A" initialVal

let res = LBFGS rosenbrockV rosenbrockGradV initialVal
printfn "%A" res

This seems to work and prints:

seq [0.0; 0.0]
[|0.9999999802; 0.9999999893|]

The rosenbrock and LBFGS functions are exactly the same code you shared. The only modification needed was to make the argument of rosenbrock a DiffSharp Tensor instead of a Math.NET Vector. Then we can get the gradient function automatically using dsharp.grad.

So what seems to be needed is a lightweight conversion API as implemented by the Converter module above. Perhaps one could make a nice interoperability layer between DiffSharp and MathNet!

(Extra note: this was a quick experiment and there are more efficient ways of getting the function's output and gradient. The code above runs the rosenbrock function twice, once in rosenbrockV and once in rosenbrockGradV. There is an operation called dsharp.fgrad that returns both the function's value and its gradient in a single run, which could be more efficient.)
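To illustrate that note, here is a minimal, untested sketch of how dsharp.fgrad might be combined with the Converter helpers above. It assumes the same Converter module and rosenbrock function are in scope; the name rosenbrockFGradV is made up for this sketch, and plugging the combined value-and-gradient function into the Math.NET solver would need a matching ObjectiveFunction overload:

// Sketch only: evaluate value and gradient in a single pass.
// dsharp.fgrad returns a (value, gradient) pair of Tensors.
let rosenbrockFGradV (x: Vector<float>) =
    let v, g = dsharp.fgrad rosenbrock (vectorToTensor x)
    tensorToScalar v, tensorToVector g

This avoids the duplicate evaluation of rosenbrock that the separate rosenbrockV and rosenbrockGradV compositions incur.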

@AndyAbok

@gbaydin Thanks for taking the time to look at it.
The API would indeed be helpful and would make interoperability easier.

I was testing this and it seems not to work on my end. When trying to run the conversion function, I get

The type 'Tensor' does not define the field, constructor or member 'toArray1D'. Maybe you want one of the following: toArray

What could I be missing?
I'll try experimenting with this on different examples and observe how it plays out as well.


gbaydin commented Sep 22, 2021

Hi @AndyAbok, it seems like your example was using a package version that is slightly older than the latest. If you use

#r "nuget: DiffSharp-cpu,1.0.0-preview-1239345497"

the code above should work.
