
resolve merge conflicts between crop-layer and BVLC:master #2

Closed
wants to merge 341 commits

Conversation


@kashefy commented Jul 22, 2015

@shelhamer Thanks for the useful features.

The merging of SPPLayer into BVLC:master causes a merge conflict between your crop-layer branch and BVLC:master. I suspect the branch is not ready to be merged into BVLC:master yet but it's already become relevant for trying out the FCN models.
So this PR just resolves the conflict by disentangling the SPPLayer and CropLayer class definitions in the vision_layer header.

The list of commits that GitHub shows for this PR is a bit overwhelming. The commits that actually resolve the merge conflict can be seen here.

jeffdonahue and others added 30 commits March 9, 2015 12:45
Fixup AccuracyLayer like SoftmaxLossLayer in BVLC#1970
See BVLC model license details on the model zoo page.

[Re-commit of 6b84206 which somehow went missing.]
described in Kaiming He et al, "Delving Deep into Rectifiers: Surpassing
Human-Level Performance on ImageNet Classification", arxiv 2015.

Below are the commit message histories I kept while developing.

PReLULayer takes FillerParameter for init

PReLU testing consistency with ReLU

Fix: PReLU test consistency check

PReLU tests in-place computation, and it failed on GPU

Fix: PReLU in-place backward in GPU

PReLULayer called an incorrect API for copying
data (caffe_gpu_memcpy). The first argument of `caffe_gpu_memcpy` should be
the size of the memory region in bytes. I modified it to use the `caffe_copy` function.

Fix: style errors

Fix: number of axes of input blob must be >= 2

Use 1D blob, zero-D blob.

Rename: hw -> dim
Use ReadNetParamsFromBinaryFileOrDie to read a net param when restoring
from a saved solverstate, which upgrades old nets, rather than
ReadProtoFromBinaryFile.
Cross-channel LRN bounds checking for GPU implementation
There are no cases where Forward is called without Reshape, so we can
simplify the call structure.
Give cuDNN {0, 1} constants for controlling accumulation through the
alpha and beta coefficients.
2.7.0 isn't really necessary - 2.3.0 is sufficient. This is the version
available on Ubuntu 14.04 via apt-get, and seems to be a reasonable lowest
common denominator in general.

http://pillow.readthedocs.org/installation.html#old-versions
shelhamer and others added 28 commits June 30, 2015 15:37
bundle CVPR15 tutorial notebooks
This filler is a convenience for interpolating with DeconvolutionLayer
or smoothing + downsampling with ConvolutionLayer for stride > 1.
[docs] install boost without recommends to avoid conflicts -- close BVLC#2454
No functional changes, just fixing whitespace errors and typos in
comments
Making the net_spec python3 compatible
List protobuf-compiler dependency in the correct place (it is in the …
Also more readable and compatible with format of instructions for MNIST https://github.com/BVLC/caffe/tree/master/examples/mnist
Removes an unused variable warning
examples/imagenet: fix broken link
One command less in CIFAR10 documentation
tiny fix in Layer::Backward documentation
[build] Travis scripts for python3 and pytest for cmake.
@shelhamer
Owner

Thanks for offering the merge but we have posted an updated longjon/caffe:future branch for use instead. Sorry it had fallen behind. An edition of the FCN helper layers for merge will be posted soon.

@shelhamer closed this Aug 26, 2015