20160429
1. Implemented optimizers: SGD, Adagrad, RMSProp, Adam.
2. LSTM implemented.
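For reference on the optimizer entry above, here is a minimal sketch of the standard Adam update rule (Kingma & Ba). This is an illustrative Python analog, not LightNet's Matlab implementation; all names are made up for the example.

```python
import math

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update for a single scalar parameter (illustrative only)."""
    m = b1 * m + (1 - b1) * grad          # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad * grad   # second-moment (uncentered variance) estimate
    m_hat = m / (1 - b1 ** t)             # bias correction for step t (1-indexed)
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v
```

SGD, Adagrad, and RMSProp follow the same pattern with fewer moment estimates.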
20160501
1. Q-network implemented.
20160505
1. Intermediate network parameters are saved by modifying the TrainScript.
20160509
1. Updated LSTM training.
2. Experiments are now separated in different folders. A RunAll.m script is provided to run all experiments at once.
20160511
1. Simplified implementation of loss functions.
20160517
1. A small fix in the tanh function.
2. A small fix in the train_lstm function to make it compatible with older Matlab releases.
20160528
1. Pretrained ImageNet models are supported.
2. Local response normalization added.
3. Batch normalization added.
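For reference on the batch-normalization entry above, a minimal Python/NumPy sketch of the standard batch-norm forward pass (Ioffe & Szegedy) — an analog for illustration, not LightNet's Matlab code:

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch axis, then scale and shift.
    x: (batch, features) array; gamma, beta: scale/shift parameters."""
    mu = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                    # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalized activations
    return gamma * x_hat + beta
```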
20160604
1. A small fix in the RNN training procedure.
20160606
1. A small fix in the RNN training procedure to make it compatible with older Matlab versions.
2. A severe bug in the convolutional layer introduced in the 20160528 version is fixed.
20160609
1. Convolutions using CUDNN are supported.
20160610
1. A small fix in the LSTM loss.
20160624
1. A small fix in the LSTM bp process.
2. A small fix in the bnorm function.
20160707
1. A fix in the tanh_ln implementation.
2. A fix in the LSTM implementation.
20160920
1. Dropout layer added.
20161010
1. RNN network with skip links added in the RNN folder.
2. Gated Recurrent Unit added in the RNN folder.
20170217
1. CUDNN is better supported by using the Neural Network Toolbox from MathWorks (conv_layer, maxpool).
20170219
1. CUDNN is enabled in the linear layer computation (MLP network).
20170224
1. CUDNN is enabled in lrn.
20170328
1. Thanks to help from MathWorks (J.H. Wang), we have used the implicit expansion feature (introduced in Matlab R2016b) to replace bsxfun in LightNet.
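Matlab's implicit expansion automatically expands singleton dimensions in elementwise operations, so explicit bsxfun calls like bsxfun(@minus, X, mean(X,1)) become simply X - mean(X,1). The same idea exists in NumPy as "broadcasting"; a minimal Python analog for illustration (the repo itself is Matlab):

```python
import numpy as np

X = np.arange(6, dtype=float).reshape(2, 3)  # 2x3 matrix
col_means = X.mean(axis=0, keepdims=True)    # 1x3 row of column means
centered = X - col_means                     # the 1x3 row expands against 2x3
```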
20170411
1. 1d convolution layer added.