I am trying to build a fairly simple autoencoder using Keras on the OpenImages dataset. Here is the architecture of the autoencoder, as printed by model.summary():
Layer (type) Output Shape Param #
=================================================================
conv3d_1 (SeparableConv2D) (None, 64, 64, 64) 283
_________________________________________________________________
max_pool_1 (MaxPooling2D) (None, 32, 32, 64) 0
_________________________________________________________________
batch_norm_1 (BatchNormaliza (None, 32, 32, 64) 256
_________________________________________________________________
sep_conv2d_2 (SeparableConv2 (None, 32, 32, 32) 2656
_________________________________________________________________
max_pool_2 (MaxPooling2D) (None, 16, 16, 32) 0
_________________________________________________________________
batch_norm_2 (BatchNormaliza (None, 16, 16, 32) 128
_________________________________________________________________
sep_conv2d_3 (SeparableConv2 (None, 16, 16, 32) 1344
_________________________________________________________________
max_pool_3 (MaxPooling2D) (None, 8, 8, 32) 0
_________________________________________________________________
batch_norm_3 (BatchNormaliza (None, 8, 8, 32) 128
_________________________________________________________________
flatten (Flatten) (None, 2048) 0
_________________________________________________________________
bottleneck (Dense) (None, 64) 131136
_________________________________________________________________
reshape (Reshape) (None, 8, 8, 1) 0
_________________________________________________________________
conv_2d_transpose_1 (Conv2DT (None, 16, 16, 32) 320
_________________________________________________________________
batch_norm_4 (BatchNormaliza (None, 16, 16, 32) 128
_________________________________________________________________
conv_2d_transpose_2 (Conv2DT (None, 32, 32, 32) 9248
_________________________________________________________________
batch_norm_5 (BatchNormaliza (None, 32, 32, 32) 128
_________________________________________________________________
conv_2d_transpose_3 (Conv2DT (None, 64, 64, 64) 18496
_________________________________________________________________
batch_norm_6 (BatchNormaliza (None, 64, 64, 64) 256
_________________________________________________________________
sep_conv2d_4 (SeparableConv2 (None, 64, 64, 3) 771
=================================================================
Total params: 165,278
Trainable params: 164,766
Non-trainable params: 512
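As a sanity check on the summary above, the parameter counts all match the standard formulas for these layer types (the 3x3 kernel size is my assumption, inferred from the counts themselves since the summary doesn't show it):

```python
# Parameter-count formulas for the layer types in the summary above.
# Kernel size k=3 is an assumption inferred from the counts themselves.

def separable_conv2d_params(k, c_in, c_out):
    # Depthwise kernel (k*k weights per input channel) plus a pointwise
    # 1x1 convolution with one bias per output channel.
    return k * k * c_in + c_in * c_out + c_out

def conv2d_transpose_params(k, c_in, c_out):
    # Full k x k kernel over all input channels, plus biases.
    return k * k * c_in * c_out + c_out

def dense_params(n_in, n_out):
    return n_in * n_out + n_out

# Spot-check against the table:
print(separable_conv2d_params(3, 3, 64))    # conv3d_1: 283
print(separable_conv2d_params(3, 64, 32))   # sep_conv2d_2: 2656
print(dense_params(2048, 64))               # bottleneck: 131136
print(conv2d_transpose_params(3, 1, 32))    # conv_2d_transpose_1: 320
print(conv2d_transpose_params(3, 32, 64))   # conv_2d_transpose_3: 18496
print(separable_conv2d_params(3, 64, 3))    # sep_conv2d_4: 771
```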
I am then defining generators that flow from a directory where I have downloaded the images:
train_data_dir = 'open_images/train/'
validation_data_dir = 'open_images/validation/'
batch_size = 128

train_datagen = ImageDataGenerator(rescale=1./255)
test_datagen = ImageDataGenerator(rescale=1./255)

train_generator = train_datagen.flow_from_directory(
    train_data_dir,
    target_size=(64, 64),
    batch_size=batch_size,
    class_mode=None)

validation_generator = test_datagen.flow_from_directory(
    validation_data_dir,
    target_size=(64, 64),
    batch_size=batch_size,
    class_mode=None)
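For context, my rough understanding of how flow_from_directory counts images is that it only looks inside immediate subdirectories of the path it's given, treating each subdirectory as a class, even when class_mode=None. A small sketch approximating that scan (count_like_keras is my own hypothetical helper, not a Keras function, and the extension list is abbreviated):

```python
import os
import tempfile

# Rough approximation of Keras's flow_from_directory scan: each immediate
# subdirectory is treated as a class, and only files inside those
# subdirectories are counted. (count_like_keras is a hypothetical helper.)
IMG_EXTS = ('.png', '.jpg', '.jpeg', '.bmp')

def count_like_keras(directory):
    classes = sorted(
        d for d in os.listdir(directory)
        if os.path.isdir(os.path.join(directory, d)))
    n_images = sum(
        1
        for c in classes
        for f in os.listdir(os.path.join(directory, c))
        if f.lower().endswith(IMG_EXTS))
    return n_images, len(classes)

# Images placed directly in the directory are NOT counted...
with tempfile.TemporaryDirectory() as root:
    open(os.path.join(root, 'img1.jpg'), 'w').close()
    print(count_like_keras(root))  # (0, 0)

# ...but images inside a subdirectory are.
with tempfile.TemporaryDirectory() as root:
    os.mkdir(os.path.join(root, 'all'))
    open(os.path.join(root, 'all', 'img1.jpg'), 'w').close()
    print(count_like_keras(root))  # (1, 1)
```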
And here is the training step:

def fixed_generator(generator):
    for batch in generator:
        yield (batch, batch)

num_epochs = 10
steps_per_epoch = 120

autoencoder.fit_generator(
    fixed_generator(train_generator),
    steps_per_epoch=steps_per_epoch,
    epochs=num_epochs,
    validation_data=fixed_generator(validation_generator),
    validation_steps=100)
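The fixed_generator wrapper simply pairs each batch with itself, since an autoencoder's reconstruction loss needs the input to double as the target. A minimal demonstration with plain lists standing in for image batches:

```python
def fixed_generator(generator):
    # Yield (input, target) pairs where the target is the input itself,
    # which is what an autoencoder's reconstruction loss expects.
    for batch in generator:
        yield (batch, batch)

# Stand-in "generator" using plain lists instead of image arrays.
fake_batches = iter([[1, 2, 3], [4, 5, 6]])
pairs = list(fixed_generator(fake_batches))
print(pairs)  # [([1, 2, 3], [1, 2, 3]), ([4, 5, 6], [4, 5, 6])]
```

Note that fixed_generator yields nothing at all if the generator it wraps yields nothing.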
When I run this code, something seems to go wrong during validation: the training loss decreases normally, but val_loss is always NaN:
Epoch 1/10
120/120 [==============================] - 241s 2s/step - loss: 0.0468 - val_loss: nan
Epoch 2/10
120/120 [==============================] - 239s 2s/step - loss: 0.0278 - val_loss: nan
Epoch 3/10
120/120 [==============================] - 240s 2s/step - loss: 0.0248 - val_loss: nan
Epoch 4/10
120/120 [==============================] - 241s 2s/step - loss: 0.0234 - val_loss: nan
Epoch 5/10
120/120 [==============================] - 240s 2s/step - loss: 0.0226 - val_loss: nan
Epoch 6/10
120/120 [==============================] - 241s 2s/step - loss: 0.0221 - val_loss: nan
Epoch 7/10
120/120 [==============================] - 242s 2s/step - loss: 0.0217 - val_loss: nan
Epoch 8/10
120/120 [==============================] - 240s 2s/step - loss: 0.0213 - val_loss: nan
Epoch 9/10
120/120 [==============================] - 240s 2s/step - loss: 0.0210 - val_loss: nan
Epoch 10/10
120/120 [==============================] - 242s 2s/step - loss: 0.0207 - val_loss: nan
Also, when the validation generator is created, it prints:
Found 0 images belonging to 0 classes.
There are definitely images in that directory though. Any idea what might be going on?
Edit: If you want to be convinced there are images in the folder...
ubuntu@ip-172-16-1-35:~$ ls -l open_images/validation/ | head
total 12661044
-rw-r--r-- 1 ubuntu ubuntu 290621 Jul 10 2018 0001eeaf4aed83f9.jpg
-rw-r--r-- 1 ubuntu ubuntu 375363 Jul 10 2018 0004886b7d043cfd.jpg
-rw-r--r-- 1 ubuntu ubuntu 462817 Jul 10 2018 000595fe6fee6369.jpg
-rw-r--r-- 1 ubuntu ubuntu 302326 Jul 10 2018 00075905539074f2.jpg
-rw-r--r-- 1 ubuntu ubuntu 970275 Jul 10 2018 0007cebe1b2ba653.jpg
-rw-r--r-- 1 ubuntu ubuntu 614095 Jul 10 2018 0007d6cf88afaa4a.jpg
-rw-r--r-- 1 ubuntu ubuntu 415082 Jul 10 2018 0008e425fb49a2bf.jpg
-rw-r--r-- 1 ubuntu ubuntu 359851 Jul 10 2018 0009bad4d8539bb4.jpg
-rw-r--r-- 1 ubuntu ubuntu 186452 Jul 10 2018 000a045a0715d64d.jpg