Seminars (#6)
* Initial commit - files upload

* Create README.md

* Create requirements.txt

* Update README.md

* Create README.md

* Update README.md

* Add example image

* Create README.md

* Update README.md
kmkolasinski committed Aug 13, 2018
1 parent 8372f90 commit c156707
Showing 41 changed files with 447,356 additions and 1 deletion.
14 changes: 13 additions & 1 deletion README.md
@@ -1,2 +1,14 @@
# deep-learning-notes
Experiments with Deep Learning
Experiments with Deep Learning and other resources:

* [keras-capsule-pooling](keras-capsule-pooling) - an after-hours experiment in which I try to implement Capsule pooling for images.
* [max-normed-optimizer](max-normed-optimizer) - an experimental implementation of an interesting gradient descent optimizer which normalizes gradients by their norms. Contains various experiments that show the potential power of this method (a rough sketch of the idea follows this list).
* [selu-regularization](selu-regularization) - a Keras regularizer which forces SELU-like regularization on the model weights (Dense and Conv2D versions are provided). SELU was introduced as an activation function with a special initialization method; these regularizers can be added to force the weights to preserve the self-normalizing property during training.
* [tf-oversampling](tf-oversampling) - an example showing how to implement oversampling with the tf.data.Dataset API (see the second sketch after this list).
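
A rough sketch of the gradient-normalization idea behind max-normed-optimizer (my own illustration in TensorFlow 2 style, not the repository's implementation; the toy model and data are placeholders):

```python
import tensorflow as tf

# Toy model and data, purely for illustration.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)
x = tf.random.normal([32, 4])
y = tf.random.normal([32, 1])

with tf.GradientTape() as tape:
    loss = tf.reduce_mean(tf.square(model(x) - y))

grads = tape.gradient(loss, model.trainable_variables)
# Rescale each gradient by its norm before applying the update, so the step
# size is controlled by the learning rate rather than the raw gradient scale.
normed_grads = [g / (tf.norm(g) + 1e-12) for g in grads]
optimizer.apply_gradients(zip(normed_grads, model.trainable_variables))
```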
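
And a minimal oversampling sketch with the tf.data API (again only an illustration, assuming `tf.data.experimental.sample_from_datasets` is available; this is not the actual tf-oversampling code):

```python
import tensorflow as tf

# Two per-class datasets; repeating them lets sampling continue indefinitely.
majority = tf.data.Dataset.from_tensor_slices(([0.0] * 900, [0] * 900)).repeat()
minority = tf.data.Dataset.from_tensor_slices(([1.0] * 100, [1] * 100)).repeat()

# Drawing from both classes with equal probability oversamples the minority class.
balanced = tf.data.experimental.sample_from_datasets(
    [majority, minority], weights=[0.5, 0.5])
balanced = balanced.shuffle(1024).batch(32)

for features, labels in balanced.take(1):
    print(labels.numpy().mean())  # roughly 0.5, i.e. the classes are balanced
```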

# Seminars on Deep Learning and Machine Learning

[Seminars](seminars) - contains a number of presentations I have given at our company.



Binary file added seminars/2016-11-tSNE/slides.pdf
Binary file not shown.
Binary file added seminars/2016-11-tSNE/slides.pptx
Binary file not shown.
Binary file added seminars/2017-01-Word2Vec/slides.pdf
Binary file not shown.
Binary file added seminars/2017-01-Word2Vec/slides.pptx
Binary file not shown.
4 changes: 4 additions & 0 deletions seminars/2017-03-Introduction-to-DL/README.md
@@ -0,0 +1,4 @@
# Abstract

Preliminary material on the basics of Deep Learning and Machine Learning.
The slides cover topics such as graphs, backpropagation, optimizers, and regularization.
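
Purely as an illustrative aside (not material from the slides), here is a single gradient-descent step with the backpropagation chain rule written out for a one-neuron model:

```python
# One-neuron model y_hat = w * x + b with squared-error loss.
x, y = 2.0, 1.0
w, b, lr = 0.3, 0.0, 0.1

y_hat = w * x + b              # forward pass: 0.6
loss = 0.5 * (y_hat - y) ** 2  # 0.08

# Backpropagation: the chain rule gives the gradients of the loss w.r.t. w and b.
dloss_dyhat = y_hat - y        # -0.4
dw = dloss_dyhat * x           # -0.8
db = dloss_dyhat               # -0.4

# Gradient-descent (optimizer) update.
w -= lr * dw                   # 0.38
b -= lr * db                   # 0.04
```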
Binary file added seminars/2017-03-Introduction-to-DL/slides.pdf
Binary file not shown.
Binary file added seminars/2017-03-Introduction-to-DL/slides.pptx
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.

Large diffs are not rendered by default.

@@ -0,0 +1,20 @@
from keras.regularizers import Regularizer
import keras.backend as K


class SeluRegularizer(Regularizer):
    """Weight regularizer that encourages self-normalizing (SELU-friendly) weight statistics."""

    def __init__(self, mu=0.001, tau=0.001):
        self.mu = K.cast_to_floatx(mu)
        self.tau = K.cast_to_floatx(tau)

    def __call__(self, x):
        # Penalize the squared sum of incoming weights of each unit,
        # pushing the per-unit weight sums towards zero.
        mean_loss = self.mu * K.mean(K.square(K.sum(x, axis=0)))
        # The negative log of the per-unit squared weight norm keeps the
        # norms away from zero (epsilon avoids log(0)).
        tau_loss = -self.tau * K.mean(K.log(K.sum(K.square(x), axis=0) + K.epsilon()))
        return mean_loss + tau_loss

    def get_config(self):
        return {'mu': float(self.mu),
                'tau': float(self.tau)}
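
A minimal usage sketch (my own addition, not part of this commit), attaching the regularizer to a Keras Dense layer; it assumes the `SeluRegularizer` class defined above is in scope:

```python
from keras.layers import Dense, Input
from keras.models import Model

# SeluRegularizer is assumed to be the class defined above (adjust the import
# to wherever it lives in the repository, e.g. the selu-regularization folder).
inputs = Input(shape=(64,))
hidden = Dense(
    128,
    activation='selu',
    kernel_initializer='lecun_normal',  # the initialization typically paired with SELU
    kernel_regularizer=SeluRegularizer(mu=0.001, tau=0.001),
)(inputs)
outputs = Dense(1)(hidden)
model = Model(inputs, outputs)
model.compile(optimizer='sgd', loss='mse')
```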
