Sparsity penalties for unsupervised learning #60
Something like a regularizer that could be attached to a layer, similar to […].

Yangqing (by email, Mon, Jan 27, 2014, in reply to aravindhm)
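To make the idea of a regularizer attached to a layer concrete, here is a minimal C++ sketch of an L1 sparsity penalty on a layer's activations. This is not Caffe's layer API; the function name and signature are invented for illustration, and a real implementation would live in a layer's forward/backward passes.

```cpp
// Illustrative sketch only: an L1 sparsity penalty on a layer's activations.
// Not Caffe's layer API; the name and signature are invented for this example.
#include <cmath>
#include <cstddef>

// Adds lambda * sum_i |a_i| to the objective and accumulates the subgradient
// lambda * sign(a_i) into the activation gradient buffer.
float l1_activation_penalty(const float* activations, float* activation_grad,
                            std::size_t count, float lambda) {
  float penalty = 0.0f;
  for (std::size_t i = 0; i < count; ++i) {
    penalty += std::fabs(activations[i]);
    if (activations[i] > 0.0f) {
      activation_grad[i] += lambda;
    } else if (activations[i] < 0.0f) {
      activation_grad[i] -= lambda;  // subgradient of |a| for a < 0
    }
    // at a == 0 the subgradient is taken as 0
  }
  return lambda * penalty;
}
```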
@aravindhm, I found that you have already implemented an L1 norm layer in your own branch. Would you please take a look at my implementation (#113), which follows the advice of @Yangqing, and tell me whether we are solving the same problem? As far as I can see, your contribution is largely independent of mine and is well worth merging back into the master branch here.

There are a large number of public and private forks of Caffe out there. I had a look at some of the recently updated branches, and their authors are very actively working on a variety of problems. A diverse community will certainly accelerate the evolution of this project, which is very healthy. At the same time, I hope there is as little duplicated effort as possible. I would like the project owners, contributors, and everyone who cares about this project to discuss the issue and work out a solution.
My branch has too many modifications. All the changes were made to the boost-eigen branch (I couldn't buy MKL) and a few were made to master. Some of these are still broken. They include […]

Since the commits for these are interleaved, merging is very tough. Can merging be done on a file-by-file basis?

If a dev branch is made, can I copy at least the tanh layer into it and have that merged without disturbing other branches?
Merging can be done commit-by-commit through cherry-picking, and with interactive rebasing (see the GitHub help topic and the Git book chapter) anything is possible. To start, branch from whatever branch has all your intermingled work, and then sift out the desired changes from there. For instance, you could create a separate branch for each feature and cherry-pick only its commits. Rebasing is how I have been integrating my own changes. Hope these tips help.
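For concreteness, a rough command-line sketch of that workflow; the branch name and commit hashes below are placeholders, not taken from this thread:

```sh
# Start a clean branch from master for one feature (names are placeholders).
git checkout master
git checkout -b tanh-layer

# Pull only the commits that belong to this feature out of the mixed branch.
git cherry-pick <commit-1> <commit-2>

# Alternatively, branch from the mixed work and interactively drop or squash
# the unrelated commits.
git rebase -i master
```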
It seems that @aravindhm has solved the problem in #116. We don't have to write a step-by-step guide ourselves; a how-to-contribute doc with links to the most helpful external guides or tutorials is enough.
I didn't cherry-pick this time. I made a fresh local copy of master and created a branch off it (tanh). I copied the files in manually, which was very little effort in this case, and sent a pull request.
@aravindhm, your branch has a lot more good features, and I hope they will be picked out and merged back too, if you would like that. If they are mixed together in the commits, copying each of them out separately is perhaps the only way to go. Any method that works is fine; we don't have to be bound by the tools.
Sparsity penalties are addressed by #113. |
Is there an easy way to implement L1 regularization on the weight matrix of a fully connected network? Similarly, I want to penalize the L1 norm of the features in each layer. What is the best way to do that using Caffe?
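For reference, a minimal C++ sketch of what the weight side of this question amounts to: an L1 penalty folded into a plain SGD update. This is illustrative only, not Caffe's solver code; the function name and signature are invented. The feature-sparsity side is what the layer in #113 addresses.

```cpp
// Illustrative sketch only: L1 weight regularization folded into a plain SGD
// update. Not Caffe's solver code; the name and signature are invented.
#include <cstddef>

// The penalty lambda * sum_i |w_i| contributes the subgradient
// lambda * sign(w_i), which is added to the data-term gradient.
void sgd_step_with_l1(float* weights, const float* grad,
                      std::size_t count, float learning_rate, float lambda) {
  for (std::size_t i = 0; i < count; ++i) {
    const float sign =
        (weights[i] > 0.0f) ? 1.0f : ((weights[i] < 0.0f) ? -1.0f : 0.0f);
    weights[i] -= learning_rate * (grad[i] + lambda * sign);
  }
}
```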