
Allow dynamic batch sizes in all the layers #195

Closed
wants to merge 13 commits

Conversation

kloudkl
Contributor

@kloudkl kloudkl commented Mar 9, 2014

In practical applications, many users need to feed data of varying batch sizes into the network. The detailed design of this PR was formed in the discussion of #119. #189 may involve similar but more complex blob memory operations, but this PR focuses only on batch size.

Many, but not all, of the layers currently pass the dynamic batch size test cases.

@@ -18,8 +18,10 @@ class Blob {
   explicit Blob(const int num, const int channels, const int height,
       const int width);
   virtual ~Blob() {}
-  void Reshape(const int num, const int height,
-      const int width, const int channels);
+  void Reshape(const int num, const int channels, const int height,
+      const int width);
Member

Thank you for making this consistent with the internal indexing by N x K x H x W.

@shelhamer
Member

This is welcome flexibility! Please update us when tests pass so that we can assign a reviewer. Thanks.

+  void Reshape(const int num, const int channels, const int height,
+      const int width);
+  void ReshapeBigEnough(const int num, const int channels, const int height,
+      const int width);
Member

Add comments for this function?

@kloudkl
Contributor Author

kloudkl commented Mar 25, 2014

This needs to be revised to be consistent with #250.

@kloudkl
Contributor Author

kloudkl commented Jun 10, 2014

Too much has changed since the initial commits; rebasing would amount to rewriting. Closing this to relaunch the effort based on #355 (#250) and #479.
