
1.0.0 Todo list #79

Open · 13 of 26 tasks
denizyuret opened this issue Aug 16, 2018 · 5 comments
@denizyuret (Owner) commented Aug 16, 2018

  • Broadcast of user-defined functions not supported: grad doesn't work on trivial identity #101
  • Solve outstanding bugs and issues.
  • Review and merge pull requests: missing Jacobian and Hessian methods #54, [wip] hessian, jacobian and their vector products #57.
  • Unit testing and more gradients in base.jl.
  • Unit testing for cat.jl.
  • Unit testing for iterate.jl.
  • Unit testing for linearalgebra.jl.
  • Compare all test files with src files to check for completeness.
  • Activate codecov.
  • Add missing derivatives; check out DiffRules.jl: integrate DiffRules #51.
  • Overriding broadcasted vs. broadcast? Measure memory and speed. Figure out Knet functions vs. AutoGrad functions. What methods are defined?
  • Scan the code, finish TODOs, and optimize.
  • Minimize function creation: go from f(Grad{n}) to back(f,n,...), and from recorder(f)(x) to forw(f,x...)?
  • Speed up tests by reducing compilation during gradcheck.
  • Try memoization on the tape.
  • Fix scripts under prof/ and run speed tests.
  • Test higher-order gradients: Innes has a PR?
  • Fix docs, comments, and examples.
  • Optimize sum_outgrads; reduce memory use through memoization and more UngetIndex.
  • ::Rec .^ ::Int does not work! broadcast error for integer power #80
  • Tracked array interface.
  • Fix the documentation so core.jl documentation can be seen by Docutils.
  • Transfer to KnetML.
  • Clear outgrad after the for loop in backward_pass to save memory.
  • Missing / broken linalg gradients.
  • Figure out a better way to specify test ranges for functions.
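Several of the testing items above revolve around gradcheck. For context, the core idea is a finite-difference comparison against the analytic gradient; a minimal illustrative sketch (not the actual test/gradcheck.jl implementation, and the function name fdcheck is made up here) looks like:

```julia
# Illustrative finite-difference gradient check: compare a central
# difference of f against an analytic derivative df at a point x.
function fdcheck(f, df, x; eps=1e-6, rtol=1e-4)
    numeric  = (f(x + eps) - f(x - eps)) / (2eps)  # central difference
    analytic = df(x)
    isapprox(numeric, analytic; rtol=rtol)
end

fdcheck(sin, cos, 0.3)   # true: cos is the derivative of sin
```

The real gradcheck compares against grad(f) instead of a hand-written df, and perturbs each element of array arguments; the "test ranges" item above is about choosing x so that f is smooth and finite at all perturbed points.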
@CarloLucibello (Collaborator) commented

Can we tag a new version of AutoGrad and Knet now, so that people can start using them on Julia 0.7/1.0?

@denizyuret (Owner) commented Aug 16, 2018 via email

@CarloLucibello (Collaborator) commented

Tests are failing on current master:

base: Error During Test at /home/carlo/.julia/dev/AutoGrad/test/base.jl:66
  Test threw exception MethodError(convert, (Rec{Float64}, 1.0), 0x0000000000006cd8)
  Expression: gradcheck(big, x1d[1])
  MethodError: Cannot `convert` an object of type BigFloat to an object of type Rec{Float64}
  Closest candidates are:
    convert(::Type{T}, !Matched::T) where T at essentials.jl:154
    Rec{Float64}(::Any, !Matched::Any, !Matched::Any, !Matched::Any, !Matched::Any, !Matched::Any) where T at /home/carlo/.julia/dev/AutoGrad/src/core.jl:318
  Stacktrace:
   [1] oftype(::Rec{Float64}, ::BigFloat) at ./essentials.jl:323
   [2] big(::Type{Grad{1}}, ::BigFloat, ::Rec{BigFloat}, ::Rec{Float64}) at ./none:0
   [3] backward_pass(::Rec{Float64}, ::Rec{BigFloat}, ::Array{AutoGrad.Node,1}) at /home/carlo/.julia/dev/AutoGrad/src/core.jl:250
   [4] (::getfield(AutoGrad, Symbol("##gradfun#1#2")){typeof(big),Int64})(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Float64) at /home/carlo/.julia/dev/AutoGrad/src/core.jl:41
   [5] (::getfield(AutoGrad, Symbol("#gradfun#3")){getfield(AutoGrad, Symbol("##gradfun#1#2")){typeof(big),Int64}})(::Float64) at /home/carlo/.julia/dev/AutoGrad/src/core.jl:39
   [6] #gradcheck#3(::Array{Any,1}, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Function, ::Float64) at /home/carlo/.julia/dev/AutoGrad/test/gradcheck.jl:41
   [7] gradcheck(::Function, ::Float64) at /home/carlo/.julia/dev/AutoGrad/test/gradcheck.jl:38
   [8] macro expansion at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v0.7/Test/src/Test.jl:1079 [inlined]
   [9] macro expansion at /home/carlo/.julia/dev/AutoGrad/test/base.jl:25 [inlined]
   [10] macro expansion at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v0.7/Test/src/Test.jl:1079 [inlined]
   [11] top-level scope at /home/carlo/.julia/dev/AutoGrad/test/base.jl:6
base: Error During Test at /home/carlo/.julia/dev/AutoGrad/test/base.jl:73
  Test threw exception MethodError(setindex!, ((1.0, 2.0), 0.9999969722727738, 1), 0x0000000000006cd8)
  Expression: gradcheckN(prod, x1d)
  MethodError: no method matching setindex!(::Tuple{Float64,Float64}, ::Float64, ::Int64)
  Stacktrace:
   [1] #gc_array#6(::Int64, ::Int64, ::Array{Any,1}, ::Int64, ::Int64, ::Int64, ::Bool, ::Function, ::Tuple{Float64,Float64}, ::Tuple{Float64,Float64}, ::typeof(applyN), ::Array{Any,1}, ::typeof(prod)) at /home/carlo/.julia/dev/AutoGrad/test/gradcheck.jl:108
   [2] (::getfield(Main, Symbol("#kw##gc_array")))(::NamedTuple{(:kwargs,),Tuple{Array{Any,1}}}, ::typeof(gc_array), ::Tuple{Float64,Float64}, ::Tuple{Float64,Float64}, ::Function, ::Array{Any,1}, ::Function) at ./none:0
   [3] #gc_index#5(::Base.Iterators.Pairs{Symbol,Array{Any,1},Tuple{Symbol},NamedTuple{(:kwargs,),Tuple{Array{Any,1}}}}, ::Function, ::Array{Any,1}, ::Array{Any,1}, ::Int64, ::Function, ::Array{Any,1}, ::Function) at /home/carlo/.julia/dev/AutoGrad/test/gradcheck.jl:76
   [4] (::getfield(Main, Symbol("#kw##gc_index")))(::NamedTuple{(:kwargs,),Tuple{Array{Any,1}}}, ::typeof(gc_index), ::Array{Any,1}, ::Array{Any,1}, ::Int64, ::Function, ::Array{Any,1}, ::Function) at ./none:0
   [5] #gradcheck#3(::Array{Any,1}, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Function, ::Array{Any,1}, ::Function) at /home/carlo/.julia/dev/AutoGrad/test/gradcheck.jl:50
   [6] gradcheck at /home/carlo/.julia/dev/AutoGrad/test/gradcheck.jl:38 [inlined]
   [7] #gradcheckN#12(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Function, ::Tuple{Float64,Float64}) at /home/carlo/.julia/dev/AutoGrad/test/gradcheck.jl:188
   [8] gradcheckN(::Function, ::Tuple{Float64,Float64}) at /home/carlo/.julia/dev/AutoGrad/test/gradcheck.jl:188
   [9] macro expansion at /home/carlo/.julia/dev/AutoGrad/test/base.jl:31 [inlined]
   [10] macro expansion at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v0.7/Test/src/Test.jl:1079 [inlined]
   [11] macro expansion at /home/carlo/.julia/dev/AutoGrad/test/base.jl:25 [inlined]
   [12] macro expansion at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v0.7/Test/src/Test.jl:1079 [inlined]
   [13] top-level scope at /home/carlo/.julia/dev/AutoGrad/test/base.jl:6
Test Summary: | Pass  Fail  Error  Total
base          |   45     1      2     48
  product     |    7                   7
  division    |    6                   6
  plus/minus  |   16                  16
  power       |    3     1             4
ERROR: LoadError: LoadError: Some tests did not pass: 45 passed, 1 failed, 2 errored, 0 broken.
in expression starting at /home/carlo/.julia/dev/AutoGrad/test/base.jl:4
in expression starting at /home/carlo/.julia/dev/AutoGrad/test/runtests.jl:10
ERROR: Package AutoGrad errored during testing
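The second failure looks like a test-harness bug rather than a gradient bug: gc_array in test/gradcheck.jl perturbs inputs in place with setindex!, which cannot work on a Tuple because Tuples are immutable in Julia. A minimal illustration outside the test suite (my reading of the trace, not a confirmed diagnosis):

```julia
# Tuples are immutable; element assignment lowers to setindex!
# and throws the same MethodError seen in the gradcheckN failure.
x = (1.0, 2.0)
err = try
    x[1] = 0.99    # lowers to setindex!(x, 0.99, 1)
    nothing
catch e
    e
end
err isa MethodError   # true
```

If that reading is right, gc_array would need to convert Tuple inputs to a mutable container (e.g. via collect) before perturbing, or rebuild the Tuple per perturbation.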

@denizyuret (Owner) commented

@ekinakyurek I got the memory-saving hack working in the latest master (only tape[1].outgrad must not be deleted). See if there is any improvement in macnet performance...
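I haven't checked the actual commit, but the idea presumably amounts to the "Clear outgrad after the for loop in backward_pass" item above: once a node's gradient has been propagated to its parents, it can be freed, except tape[1].outgrad, which holds the final result. A toy sketch under those assumptions (Node here is a stand-in, not the struct from src/core.jl):

```julia
# Toy model of the memory-saving hack: walk the tape backward and
# drop each node's outgrad after use, keeping only tape[1].outgrad,
# which is the gradient returned to the caller.
mutable struct ToyNode
    outgrad
end

tape = [ToyNode(0.5), ToyNode(1.0), ToyNode(2.0)]
for i in length(tape):-1:2
    # (real code would first propagate tape[i].outgrad to parents
    #  via sum_outgrads before releasing it)
    tape[i].outgrad = nothing   # release; intermediate grads no longer needed
end
tape[1].outgrad                 # 0.5 -- the final gradient survives
```

On long tapes with large intermediate arrays this lets the GC reclaim each gradient as soon as the backward pass moves past it, instead of holding all of them until the pass finishes.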

@ekinakyurek (Collaborator) commented Aug 24, 2018

Thank you. I will check it when mac-net is fully compatible with Julia 1.0. We'll see how valuable this single-line change is :)
