
support multiple batch dimensions in Dense layer #1405

Merged (5 commits, Nov 30, 2020)
Conversation

@CarloLucibello (Member) commented Nov 25, 2020

Since most deep learning frameworks support this, we should too.

I can't find a corresponding issue; #282 is slightly related.
After this, we should close #708.

PR Checklist

- [x] Tests are added
- [x] Entry in NEWS.md
- [x] Documentation, if applicable
- [ ] Final review from @dhairyagandhi96 (for API changes).
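
The change can be illustrated with a short usage sketch (hedged: this assumes a Flux version that includes this PR, and uses the `Dense(in, out)` constructor of that era). `Dense` treats the first dimension as features and every remaining dimension as a batch dimension:

```julia
using Flux

d = Dense(10, 5)              # 10 input features, 5 outputs
x = rand(Float32, 10, 6, 7)   # feature dim first, two batch dims (6×7)
y = d(x)
size(y)                       # (5, 6, 7): batch dimensions are preserved
```

Internally the extra dimensions are flattened into a single batch dimension, the weight matrix is applied, and the result is reshaped back.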

@DhairyaLGandhi (Member)

Is this doing the correct thing here?

@CarloLucibello (Member, Author)

What do you mean? It does what it is meant to do; PyTorch does the same thing.
Notice that this is not #282 or #856.

@mcabbott (Member) commented Nov 25, 2020

+1, this seems like the obvious behaviour.

I'd be tempted to make the existing method `x::AbstractVecOrMat` and then have `x::AbstractArray` which reshapes & calls that. But not a huge deal.

@CarloLucibello (Member, Author)

> I'd be tempted to make the existing method `x::AbstractVecOrMat` and then have `x::AbstractArray` which reshapes & calls that. But not a huge deal.

Yeah, I didn't do it because the performance impact is negligible either way, but we could. Anyway, as you said, not a big deal.
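
A minimal, dependency-free sketch of the reshape approach described above (a hypothetical `Dense` struct for illustration, not Flux's actual definition, which also carries an activation function): the vector/matrix method does the real work, and the `AbstractArray` method flattens the trailing batch dimensions, calls it, and restores the shape.

```julia
# Hypothetical minimal Dense layer (illustration only).
struct Dense{M<:AbstractMatrix,V<:AbstractVector}
    W::M
    b::V
end

# Base case: vectors and matrices go straight through W*x .+ b.
(a::Dense)(x::AbstractVecOrMat) = a.W * x .+ a.b

# Higher-dimensional input: flatten all batch dims into one column dim,
# apply the matrix method, then reshape back to the original batch dims.
function (a::Dense)(x::AbstractArray)
    y = a(reshape(x, size(x, 1), :))
    return reshape(y, :, size(x)[2:end]...)
end

d = Dense(randn(3, 4), zeros(3))
x = randn(4, 5, 6)          # 4 features, 5×6 worth of batch dimensions
size(d(x))                  # (3, 5, 6)
```

Since `AbstractVecOrMat` is more specific than `AbstractArray`, dispatch picks the direct method for 1-D and 2-D inputs with no ambiguity.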

src/layers/basic.jl: two review threads (outdated, resolved)
@DhairyaLGandhi (Member)

bors r+

bors bot added a commit that referenced this pull request Nov 30, 2020
1405: support multiple batch dimensions in Dense layer r=DhairyaLGandhi a=CarloLucibello
Co-authored-by: Carlo Lucibello <carlo.lucibello@gmail.com>
Co-authored-by: Dhairya Gandhi <dhairya@juliacomputing.com>
@CarloLucibello (Member, Author)

If we want to strictly adhere to ColPrac (and maybe we don't), you should just approve PRs from collaborators with merge rights, and then they merge by themselves:

Reviewing, Approving and Merging PRs

  • PRs must have 1 approval before they are merged.
  • PR authors should not approve their own PRs.
  • PRs should pass CI tests before being merged.
  • PRs by people without merge rights must have approval from someone who has merge rights (who will usually then merge the PR).
  • PRs by people with merge rights must have approval from someone else, who may or may not have merge rights (and then may merge their own PR).
  • PRs by people with merge rights should not be merged by people other than the author (just approved).

@DhairyaLGandhi (Member)

You should be able to use bors as usual.

@CarloLucibello (Member, Author)

> you should be able to use bors like usual

Sure, but I think you need to hit Approve in GitHub's review (according to ColPrac).

@DhairyaLGandhi (Member)

bors r+

@bors (Contributor) commented Nov 30, 2020

Build succeeded.

Development

Successfully merging this pull request may close these issues.

Plans for supporting higher dimensional data?
3 participants