
mir.sparse #79

Merged
merged 25 commits into master on Apr 23, 2016

Conversation

9il
Member

@9il 9il commented Apr 12, 2016

PR for tracking #43 implementation.

@codecov-io

Current coverage is 96.55%

Merging #79 into master will decrease coverage by 2.84% as of 679d422

@@            master     #79   diff @@
======================================
  Files            7      18    +11
  Stmts         2643    3223   +580
  Branches         0       0       
  Methods          0       0       
======================================
+ Hit           2627    3112   +485
  Partial          0       0       
- Missed          16     111    +95

Review entire Coverage Diff as of 679d422


Uncovered Suggestions

  1. +0.50% via .../model/lda/hoffman.d#145...160
  2. +0.40% via .../model/lda/hoffman.d#165...177
  3. +0.25% via .../model/lda/hoffman.d#61...68
  4. See 7 more...

Powered by Codecov. Updated on successful CI builds.

@9il
Member Author

9il commented Apr 13, 2016

@wilzbach will use a common hash map :-) initial implementation pushed to this branch:

import mir.ndslice.slice: sliced;
T[size_t] table;
table[0] = 0;
table.remove(0);
Member

why is that needed?

Member Author

to initialize the table so it is not null

Member

but isn't that just a language feature? Or why do you need not null?

int[int] a;
writeln(a.get(0, 2)); // 2

Member Author

writeln(a.get(0, 2)); // 2

writeln(a) ?

Member Author

assert(a !is null); ?

Member Author

but isn't that just a language feature? Or why do you need not null?

For correct behavior when copying.
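For context, this is standard D associative-array behavior rather than anything mir-specific: a default-initialized AA is null, so a copy taken before the first insertion does not share storage with the original, while a copy of an already-allocated table does. A minimal sketch:

```d
void main()
{
    int[int] a;        // default-initialized AA: a null reference
    auto b = a;        // copies the null reference; nothing is shared
    b[1] = 1;          // first insertion allocates storage for b only
    assert(1 !in a);   // a stayed empty -- the two tables diverged

    int[int] c;
    c[0] = 0;          // dummy insert forces allocation of the table
    c.remove(0);       // c is now empty but non-null
    auto d = c;        // d shares c's allocated storage
    d[1] = 1;
    assert(1 in c);    // c sees insertions made through d
}
```

The insert-then-remove dance in the snippet above is what forces the table into the second, sharing state.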

Member

do you mind adding a unittest for the correct behavior? :)

Member Author

yes

@wilzbach
Member

@wilzbach will use a common hash map :-) initial implementation pushed to this branch

I really like the direction this is going and the seamless integration with Slice. I would add a couple of unittests that test this more, e.g.

mySparse[] = mySlice;
mySlice[] = mySparse;

and some of the iteration and selection algorithms
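The suggested round trip could be captured in a unittest sketch along these lines. Note this is hypothetical: `sparse` and `slice` stand in for whatever constructors the final API exposes, and the index/assignment syntax is assumed from the discussion, not confirmed names.

```d
// Hypothetical sketch -- constructor names are placeholders, not the real API.
unittest
{
    auto mySparse = sparse!double(3, 4); // placeholder sparse constructor
    auto mySlice  = slice!double(3, 4);  // placeholder dense constructor
    mySlice[1, 2] = 5;

    mySparse[] = mySlice;                // dense -> sparse
    assert(mySparse[1, 2] == 5);

    auto back = slice!double(3, 4);
    back[] = mySparse;                   // sparse -> dense
    assert(back == mySlice);             // round trip preserved the data
}
```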

@9il
Member Author

9il commented Apr 13, 2016

no, please, let me finish it first. It is faster to implement this than to review the API for a couple of small functions.

@wilzbach
Member

no, please, let me finish it first

toggle annoying_bot off

@9il
Member Author

9il commented Apr 13, 2016

no, please, let me finish it first
toggle annoying_bot off

Have you started to collect article information for random distributions?

@9il 9il changed the title mir.ndsparse mir.ndslice.sparse Apr 15, 2016
@9il 9il force-pushed the sparse branch 4 times, most recently from a84bf47 to 1b9d2fd on April 18, 2016 15:06
@9il
Member Author

9il commented Apr 18, 2016

It is almost covered and passes CI tests. I have added sparse_blas package and moved sum to the mir root.


private sizediff_t cmpCoo(size_t N)(const auto ref size_t[N] a, const auto ref size_t[N] b)
Member

you only use this once (three lines above) - so why not put it there directly?

Member Author

this may be used by future algorithms
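For readers following along, a lexicographic ordering of COO coordinate tuples consistent with the `cmpCoo` signature above might look like the following. The body is a sketch inferred from the name and signature, not the actual implementation in the PR:

```d
// Sketch: lexicographic comparison of two N-dimensional coordinate tuples.
// Negative result: a sorts before b; positive: after; zero: equal.
sizediff_t cmpCooSketch(size_t N)(const auto ref size_t[N] a, const auto ref size_t[N] b)
{
    foreach (i; 0 .. N)
    {
        // the first differing coordinate decides the order
        auto d = cast(sizediff_t) a[i] - cast(sizediff_t) b[i];
        if (d)
            return d;
    }
    return 0; // all coordinates equal
}

void main()
{
    size_t[2] x = [1, 2];
    size_t[2] y = [1, 3];
    assert(cmpCooSketch(x, y) < 0); // x sorts before y
    assert(cmpCooSketch(y, x) > 0);
    assert(cmpCooSketch(x, x) == 0);
}
```

Such a comparator is exactly what a sort over COO (coordinate-format) entries needs, which is presumably why keeping it as a named function for future algorithms makes sense.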

@wilzbach
Member

It is almost covered and passes CI tests. I have added sparse_blas package and moved sum to the mir root.

Wow! Looks great on my first pass. I will have to play with it to give further feedback (will do so tonight).

Nitpicks from the first pass:

  • a lot of doc is missing
  • your editor is using tabs? Please configure it to use spaces ;-)
  • saw a couple of places with debug code

Do you plan to support @nogc code via Allocators?
This probably requires a custom implementation of hashmaps?

moved sum to the mir root.

you probably have to update the index.d too

@9il
Member Author

9il commented Apr 18, 2016

Do you plan to support the nogc code via Allocators?

Yes, but only once they are in Phobos.

This probably requires a custom implementation of hashmaps?

Yes for Sparse and no for CompressedTensor.

@9il
Member Author

9il commented Apr 18, 2016

I moved sparse and blas to a separate package, so we would not include them in ndslice. I just added a few ptr methods to ndslice.

@9il 9il changed the title mir.ndslice.sparse mir.sparse Apr 18, 2016

/++
This constructor should be used only for integration with other languages or libraries such as Julia and numpy.
Params:
Member

thanks for changing & documenting this :)

@9il 9il merged commit 34b238b into master Apr 23, 2016
@9il 9il deleted the sparse branch April 23, 2016 20:45

alias round = llvm_round;

alias fmuladd = llvm_fmuladd;
Member

nitpick - you mix spaces and tabs in here :/
