AlexNet #15

Open
drtha opened this issue May 1, 2023 · 0 comments
drtha commented May 1, 2023

Hi,
I was running example 10.4, but unfortunately I got bad results (see below) and I don't know why. I probably did something wrong.
Does anybody have an idea what could be wrong?
Thanks for the help.
Best regards,
Thomas

import numpy as np

# Import everything from tensorflow.keras: mixing layers from the
# standalone `keras` package and from `tensorflow.keras` in one model
# can cause subtle failures. (The unused `tensorflow.compat.v1` import
# is dropped.)
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Dense, Dropout, Flatten, Conv2D,
                                     MaxPooling2D, BatchNormalization)

!rm oxflower17*
!wget https://bit.ly/36QytdH -O oxflower17.npz

data = np.load('oxflower17.npz')
X = data['X']
Y = data['Y']

#import tflearn.datasets.oxflower17 as oxflower17
#X, Y = oxflower17.load_data(one_hot=True)

model = Sequential()

model.add(Conv2D(96, kernel_size=(11, 11), strides=(4, 4), activation='relu', input_shape=(224, 224, 3)))
model.add(MaxPooling2D(pool_size=(3, 3), strides=(2, 2)))
model.add(BatchNormalization())

model.add(Conv2D(256, kernel_size=(5, 5), activation='relu'))
model.add(MaxPooling2D(pool_size=(3, 3), strides=(2, 2)))
model.add(BatchNormalization())

model.add(Conv2D(256, kernel_size=(3, 3), activation='relu'))
model.add(Conv2D(384, kernel_size=(3, 3), activation='relu'))
model.add(Conv2D(384, kernel_size=(3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(3, 3), strides=(2, 2)))
model.add(BatchNormalization())

model.add(Flatten())
model.add(Dense(4096, activation='tanh'))
model.add(Dropout(0.5))
model.add(Dense(4096, activation='tanh'))
model.add(Dropout(0.5))

model.add(Dense(17, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X, Y, batch_size=64, epochs=100, verbose=1, validation_split=0.1, shuffle=True)
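One thing worth checking (an assumption about the archive contents, not something visible in the output): `np.load('oxflower17.npz')` may return raw `uint8` pixels in 0–255, whereas the original `tflearn` loader returns floats scaled to [0, 1]. Feeding unscaled pixels into this network typically makes training unstable. A minimal sketch of the scaling step, using a random stand-in array in place of `data['X']`:

```python
import numpy as np

# Stand-in for data['X']; the real array would be the loaded flower images.
X = np.random.randint(0, 256, size=(4, 224, 224, 3)).astype('uint8')

# Scale to [0, 1] only if the pixels look unscaled (the .npz may already
# store normalized floats, in which case this is a no-op).
X = X.astype('float32')
if X.max() > 1.0:
    X /= 255.0
```

If the loaded `X` already sits in [0, 1], this changes nothing; otherwise it matches what the tflearn loader would have produced.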

Train on 1224 samples, validate on 136 samples
2023-05-01 11:29:28.100050: W tensorflow/c/c_api.cc:300] Operation '{name:'training_4/Adam/conv2d_12/bias/v/Assign' id:3526 op device:{requested: '', assigned: ''} def:{{{node training_4/Adam/conv2d_12/bias/v/Assign}} = AssignVariableOp[_has_manual_control_dependencies=true, dtype=DT_FLOAT, validate_shape=false](training_4/Adam/conv2d_12/bias/v, training_4/Adam/conv2d_12/bias/v/Initializer/zeros)}}' was changed by setting attribute after it was run by a session. This mutation will have no effect, and will trigger an error in the future. Either don't modify nodes after running them or create a new session.
Epoch 1/100
1224/1224 [==============================] - ETA: 0s - loss: 5.0144 - acc: 0.1797
2023-05-01 11:29:56.126373: W tensorflow/c/c_api.cc:300] Operation '{name:'loss_2/mul' id:3001 op device:{requested: '', assigned: ''} def:{{{node loss_2/mul}} = Mul[T=DT_FLOAT, _has_manual_control_dependencies=true](loss_2/mul/x, loss_2/dense_8_loss/value)}}' was changed by setting attribute after it was run by a session. This mutation will have no effect, and will trigger an error in the future. Either don't modify nodes after running them or create a new session.
1224/1224 [==============================] - 29s 23ms/sample - loss: 5.0144 - acc: 0.1797 - val_loss: 10.0863 - val_acc: 0.0588
Epoch 2/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 3.2775 - acc: 0.2614 - val_loss: 5.5473 - val_acc: 0.0662
Epoch 3/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 2.6932 - acc: 0.3219 - val_loss: 6.2977 - val_acc: 0.1397
Epoch 4/100
1224/1224 [==============================] - 32s 26ms/sample - loss: 2.4534 - acc: 0.3562 - val_loss: 2.6887 - val_acc: 0.3088
Epoch 5/100
1224/1224 [==============================] - 30s 24ms/sample - loss: 2.3171 - acc: 0.3913 - val_loss: 3.6668 - val_acc: 0.2206
Epoch 6/100
1224/1224 [==============================] - 30s 24ms/sample - loss: 2.4160 - acc: 0.3856 - val_loss: 3.5026 - val_acc: 0.2794
Epoch 7/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 2.3011 - acc: 0.4093 - val_loss: 4.1973 - val_acc: 0.1838
Epoch 8/100
1224/1224 [==============================] - 30s 24ms/sample - loss: 2.0330 - acc: 0.4592 - val_loss: 3.0618 - val_acc: 0.3088
Epoch 9/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 2.1806 - acc: 0.4338 - val_loss: 2.5885 - val_acc: 0.3162
Epoch 10/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 2.2620 - acc: 0.4232 - val_loss: 5.1870 - val_acc: 0.1985
Epoch 11/100
1224/1224 [==============================] - 30s 25ms/sample - loss: 1.9249 - acc: 0.4861 - val_loss: 2.3710 - val_acc: 0.3676
Epoch 12/100
1224/1224 [==============================] - 30s 24ms/sample - loss: 1.7639 - acc: 0.5163 - val_loss: 2.6669 - val_acc: 0.4338
Epoch 13/100
1224/1224 [==============================] - 31s 25ms/sample - loss: 1.6767 - acc: 0.5498 - val_loss: 3.8777 - val_acc: 0.3088
Epoch 14/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 1.9236 - acc: 0.5106 - val_loss: 3.1937 - val_acc: 0.3529
Epoch 15/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 1.6463 - acc: 0.5564 - val_loss: 2.9450 - val_acc: 0.3971
Epoch 16/100
1224/1224 [==============================] - 31s 25ms/sample - loss: 1.7758 - acc: 0.5507 - val_loss: 3.0172 - val_acc: 0.3676
Epoch 17/100
1224/1224 [==============================] - 30s 25ms/sample - loss: 1.6831 - acc: 0.5253 - val_loss: 3.9578 - val_acc: 0.3235
Epoch 18/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 1.4820 - acc: 0.5874 - val_loss: 3.1532 - val_acc: 0.4338
Epoch 19/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 1.4646 - acc: 0.6013 - val_loss: 2.7689 - val_acc: 0.4338
Epoch 20/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 1.2729 - acc: 0.6275 - val_loss: 3.3961 - val_acc: 0.3824
Epoch 21/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 1.1467 - acc: 0.6830 - val_loss: 2.8041 - val_acc: 0.4926
Epoch 22/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 1.3285 - acc: 0.6405 - val_loss: 2.1994 - val_acc: 0.4485
Epoch 23/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 1.1406 - acc: 0.6593 - val_loss: 6.7718 - val_acc: 0.1765
Epoch 24/100
1224/1224 [==============================] - 30s 24ms/sample - loss: 1.0974 - acc: 0.6944 - val_loss: 2.9021 - val_acc: 0.4338
Epoch 25/100
1224/1224 [==============================] - 32s 26ms/sample - loss: 1.2066 - acc: 0.6667 - val_loss: 3.7668 - val_acc: 0.3750
Epoch 26/100
1224/1224 [==============================] - 30s 24ms/sample - loss: 1.0289 - acc: 0.7092 - val_loss: 2.9394 - val_acc: 0.5221
Epoch 27/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 1.1849 - acc: 0.6904 - val_loss: 2.5308 - val_acc: 0.5147
Epoch 28/100
1224/1224 [==============================] - 30s 25ms/sample - loss: 1.0187 - acc: 0.6969 - val_loss: 3.2923 - val_acc: 0.4265
Epoch 29/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 1.3484 - acc: 0.6569 - val_loss: 5.8530 - val_acc: 0.3750
Epoch 30/100
1224/1224 [==============================] - 30s 24ms/sample - loss: 1.2807 - acc: 0.6528 - val_loss: 3.0352 - val_acc: 0.4926
Epoch 31/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 1.2681 - acc: 0.6904 - val_loss: 3.7156 - val_acc: 0.4338
Epoch 32/100
1224/1224 [==============================] - 31s 25ms/sample - loss: 1.1128 - acc: 0.7042 - val_loss: 2.4365 - val_acc: 0.4706
Epoch 33/100
1224/1224 [==============================] - 32s 26ms/sample - loss: 1.1916 - acc: 0.7034 - val_loss: 3.5872 - val_acc: 0.4412
Epoch 34/100
1224/1224 [==============================] - 31s 25ms/sample - loss: 0.8485 - acc: 0.7475 - val_loss: 2.3067 - val_acc: 0.5662
Epoch 35/100
1224/1224 [==============================] - 34s 28ms/sample - loss: 0.7040 - acc: 0.7998 - val_loss: 2.5718 - val_acc: 0.6029
Epoch 36/100
1224/1224 [==============================] - 31s 25ms/sample - loss: 0.7632 - acc: 0.7908 - val_loss: 2.0461 - val_acc: 0.5662
Epoch 37/100
1224/1224 [==============================] - 31s 26ms/sample - loss: 0.8698 - acc: 0.7647 - val_loss: 2.5574 - val_acc: 0.5662
Epoch 38/100
1224/1224 [==============================] - 35s 29ms/sample - loss: 0.7660 - acc: 0.7917 - val_loss: 2.2289 - val_acc: 0.6324
Epoch 39/100
1224/1224 [==============================] - 31s 26ms/sample - loss: 0.4683 - acc: 0.8554 - val_loss: 2.8544 - val_acc: 0.5662
Epoch 40/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.5265 - acc: 0.8472 - val_loss: 2.8920 - val_acc: 0.5882
Epoch 41/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.5529 - acc: 0.8423 - val_loss: 2.8437 - val_acc: 0.5515
Epoch 42/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.7684 - acc: 0.8015 - val_loss: 3.2597 - val_acc: 0.4926
Epoch 43/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.4530 - acc: 0.8611 - val_loss: 3.5480 - val_acc: 0.5147
Epoch 44/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.5590 - acc: 0.8391 - val_loss: 3.4603 - val_acc: 0.5368
Epoch 45/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 1.1829 - acc: 0.7533 - val_loss: 4.1868 - val_acc: 0.4779
Epoch 46/100
1224/1224 [==============================] - 30s 25ms/sample - loss: 0.8601 - acc: 0.7802 - val_loss: 3.2483 - val_acc: 0.5294
Epoch 47/100
1224/1224 [==============================] - 33s 27ms/sample - loss: 1.0455 - acc: 0.7753 - val_loss: 4.0359 - val_acc: 0.4485
Epoch 48/100
1224/1224 [==============================] - 34s 28ms/sample - loss: 0.5997 - acc: 0.8284 - val_loss: 3.3438 - val_acc: 0.4632
Epoch 49/100
1224/1224 [==============================] - 36s 29ms/sample - loss: 0.5014 - acc: 0.8636 - val_loss: 3.3794 - val_acc: 0.5588
Epoch 50/100
1224/1224 [==============================] - 33s 27ms/sample - loss: 0.5341 - acc: 0.8415 - val_loss: 3.1851 - val_acc: 0.5441
Epoch 51/100
1224/1224 [==============================] - 34s 28ms/sample - loss: 0.5075 - acc: 0.8570 - val_loss: 3.5522 - val_acc: 0.5662
Epoch 52/100
1224/1224 [==============================] - 32s 26ms/sample - loss: 0.3198 - acc: 0.9011 - val_loss: 2.8503 - val_acc: 0.6250
Epoch 53/100
1224/1224 [==============================] - 34s 27ms/sample - loss: 0.4718 - acc: 0.8775 - val_loss: 3.4969 - val_acc: 0.5368
Epoch 54/100
1224/1224 [==============================] - 37s 30ms/sample - loss: 0.2778 - acc: 0.9142 - val_loss: 2.2732 - val_acc: 0.6618
Epoch 55/100
1224/1224 [==============================] - 32s 26ms/sample - loss: 0.3239 - acc: 0.9109 - val_loss: 2.8575 - val_acc: 0.6029
Epoch 56/100
1224/1224 [==============================] - 31s 25ms/sample - loss: 0.2875 - acc: 0.9142 - val_loss: 3.0224 - val_acc: 0.6324
Epoch 57/100
1224/1224 [==============================] - 31s 25ms/sample - loss: 0.2683 - acc: 0.9191 - val_loss: 3.2735 - val_acc: 0.6176
Epoch 58/100
1224/1224 [==============================] - 30s 25ms/sample - loss: 0.3201 - acc: 0.9167 - val_loss: 3.2597 - val_acc: 0.6176
Epoch 59/100
1224/1224 [==============================] - 31s 26ms/sample - loss: 0.2960 - acc: 0.9150 - val_loss: 2.7775 - val_acc: 0.6250
Epoch 60/100
1224/1224 [==============================] - 34s 28ms/sample - loss: 0.4481 - acc: 0.9069 - val_loss: 4.2904 - val_acc: 0.5662
Epoch 61/100
1224/1224 [==============================] - 31s 25ms/sample - loss: 0.7816 - acc: 0.8342 - val_loss: 3.3295 - val_acc: 0.5368
Epoch 62/100
1224/1224 [==============================] - 30s 25ms/sample - loss: 0.5890 - acc: 0.8538 - val_loss: 4.1752 - val_acc: 0.5074
Epoch 63/100
1224/1224 [==============================] - 31s 25ms/sample - loss: 0.4552 - acc: 0.8824 - val_loss: 2.8554 - val_acc: 0.6103
Epoch 64/100
1224/1224 [==============================] - 31s 25ms/sample - loss: 0.2289 - acc: 0.9273 - val_loss: 2.8480 - val_acc: 0.6912
Epoch 65/100
1224/1224 [==============================] - 32s 26ms/sample - loss: 0.1800 - acc: 0.9428 - val_loss: 3.6724 - val_acc: 0.5882
Epoch 66/100
1224/1224 [==============================] - 33s 27ms/sample - loss: 0.7502 - acc: 0.8342 - val_loss: 3.2567 - val_acc: 0.6029
Epoch 67/100
1224/1224 [==============================] - 33s 27ms/sample - loss: 0.4260 - acc: 0.8766 - val_loss: 2.8419 - val_acc: 0.6324
Epoch 68/100
1224/1224 [==============================] - 33s 27ms/sample - loss: 0.2063 - acc: 0.9355 - val_loss: 3.3383 - val_acc: 0.5882
Epoch 69/100
1224/1224 [==============================] - 36s 29ms/sample - loss: 0.1641 - acc: 0.9485 - val_loss: 2.8249 - val_acc: 0.6691
Epoch 70/100
1224/1224 [==============================] - 35s 28ms/sample - loss: 0.1430 - acc: 0.9567 - val_loss: 3.3131 - val_acc: 0.6397
Epoch 71/100
1224/1224 [==============================] - 33s 27ms/sample - loss: 0.4220 - acc: 0.9101 - val_loss: 3.6871 - val_acc: 0.5882
Epoch 72/100
1224/1224 [==============================] - 33s 27ms/sample - loss: 0.1828 - acc: 0.9493 - val_loss: 3.9111 - val_acc: 0.5515
Epoch 73/100
1224/1224 [==============================] - 31s 25ms/sample - loss: 0.2159 - acc: 0.9387 - val_loss: 3.7375 - val_acc: 0.5735
Epoch 74/100
1224/1224 [==============================] - 31s 25ms/sample - loss: 0.1960 - acc: 0.9436 - val_loss: 4.8723 - val_acc: 0.5368
Epoch 75/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.2561 - acc: 0.9314 - val_loss: 3.0616 - val_acc: 0.6471
Epoch 76/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.2306 - acc: 0.9428 - val_loss: 3.3835 - val_acc: 0.6691
Epoch 77/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.2370 - acc: 0.9412 - val_loss: 3.3218 - val_acc: 0.6985
Epoch 78/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.2234 - acc: 0.9428 - val_loss: 3.1847 - val_acc: 0.6691
Epoch 79/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.2291 - acc: 0.9420 - val_loss: 4.4569 - val_acc: 0.6250
Epoch 80/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.5044 - acc: 0.9036 - val_loss: 4.6910 - val_acc: 0.5588
Epoch 81/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.2572 - acc: 0.9338 - val_loss: 3.9741 - val_acc: 0.6103
Epoch 82/100
1224/1224 [==============================] - 32s 26ms/sample - loss: 0.2419 - acc: 0.9412 - val_loss: 3.5022 - val_acc: 0.6250
Epoch 83/100
1224/1224 [==============================] - 31s 25ms/sample - loss: 0.1991 - acc: 0.9469 - val_loss: 3.2057 - val_acc: 0.6618
Epoch 84/100
1224/1224 [==============================] - 30s 24ms/sample - loss: 0.1001 - acc: 0.9706 - val_loss: 3.3824 - val_acc: 0.6912
Epoch 85/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.1351 - acc: 0.9681 - val_loss: 3.4742 - val_acc: 0.6765
Epoch 86/100
1224/1224 [==============================] - 31s 25ms/sample - loss: 0.3279 - acc: 0.9322 - val_loss: 3.4699 - val_acc: 0.6397
Epoch 87/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.4962 - acc: 0.8954 - val_loss: 4.7173 - val_acc: 0.5368
Epoch 88/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.2327 - acc: 0.9371 - val_loss: 2.9077 - val_acc: 0.6250
Epoch 89/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.2620 - acc: 0.9355 - val_loss: 3.6844 - val_acc: 0.6618
Epoch 90/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.1095 - acc: 0.9624 - val_loss: 3.5161 - val_acc: 0.6618
Epoch 91/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.1265 - acc: 0.9665 - val_loss: 3.7706 - val_acc: 0.6250
Epoch 92/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.0984 - acc: 0.9673 - val_loss: 3.7893 - val_acc: 0.6544
Epoch 93/100
1224/1224 [==============================] - 30s 24ms/sample - loss: 0.1135 - acc: 0.9657 - val_loss: 3.6777 - val_acc: 0.6691
Epoch 94/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.0455 - acc: 0.9894 - val_loss: 4.2340 - val_acc: 0.6103
Epoch 95/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.4316 - acc: 0.9265 - val_loss: 3.9547 - val_acc: 0.5515
Epoch 96/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.1366 - acc: 0.9608 - val_loss: 3.8538 - val_acc: 0.6324
Epoch 97/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.3379 - acc: 0.9297 - val_loss: 4.1262 - val_acc: 0.6250
Epoch 98/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.3417 - acc: 0.9289 - val_loss: 6.9736 - val_acc: 0.3676
Epoch 99/100
1224/1224 [==============================] - 29s 24ms/sample - loss: 0.2646 - acc: 0.9338 - val_loss: 6.9945 - val_acc: 0.4853
Epoch 100/100
1224/1224 [==============================] - 30s 24ms/sample - loss: 0.5726 - acc: 0.8881 - val_loss: 5.9798 - val_acc: 0.5147
<keras.callbacks.History at 0x7f1b1ca7b160>
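The log above shows training accuracy climbing toward 0.99 while validation accuracy stalls around 0.6 and validation loss rises, i.e. heavy overfitting on only 1,224 training images. One possible aggravating factor (an assumption about the archive layout, not verified): if `oxflower17.npz` stores the images grouped by class, Keras's `validation_split=0.1` takes the *last* 136 samples before any shuffling (`shuffle=True` in `fit()` only shuffles the training portion afterwards), so the validation set would contain only a species or two. A hedged sketch of shuffling `X` and `Y` together before calling `fit()`, with small placeholder arrays shaped like the dataset:

```python
import numpy as np

# Stand-ins shaped like the dataset: 17 classes x 80 images, class-ordered.
Y = np.repeat(np.arange(17), 80)
X = Y.astype('float32')[:, None]  # placeholder "images" carrying the class id

# Shuffle features and labels with the same permutation so pairs stay aligned.
rng = np.random.default_rng(42)
idx = rng.permutation(len(X))
X, Y = X[idx], Y[idx]

# The tail that validation_split=0.1 would take is now a mixture of classes.
val_classes = set(Y[-136:].tolist())
```

With the real arrays, the same `idx` permutation would be applied to the image tensor and the one-hot labels before `model.fit(...)`.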
