gsamaras - 1 year ago
Python Question

Deep learning Udacity course: Prob 2 assignment 1 (notMNIST)

After reading this and taking the courses, I am struggling to solve the second problem in assignment 1 (notMNIST):

Let's verify that the data still looks good. Displaying a sample of the labels and images from the ndarray. Hint: you can use matplotlib.pyplot.

Here is what I tried:

import random
rand_smpl = [train_datasets[i] for i in sorted(random.sample(xrange(len(train_datasets)), 1))]
filename = rand_smpl[0]
import pickle
loaded_pickle = pickle.load(open(filename, "r"))
image_size = 28  # Pixel width and height.
import numpy as np
dataset = np.ndarray(shape=(len(loaded_pickle), image_size, image_size),
                     dtype=np.float32)
import matplotlib.pyplot as plt

plt.plot(dataset[2])
plt.ylabel('some numbers')
plt.show()

but this is what I get:

[screenshot of the resulting plot]

which doesn't make much sense. To be honest, my code may not make much sense either, since I am not really sure how to tackle this problem!

The pickles are created like this:

image_size = 28  # Pixel width and height.
pixel_depth = 255.0  # Number of levels per pixel.

def load_letter(folder, min_num_images):
    """Load the data for a single letter label."""
    image_files = os.listdir(folder)
    dataset = np.ndarray(shape=(len(image_files), image_size, image_size),
                         dtype=np.float32)
    num_images = 0
    for image in image_files:
        image_file = os.path.join(folder, image)
        try:
            image_data = (ndimage.imread(image_file).astype(float) -
                          pixel_depth / 2) / pixel_depth
            if image_data.shape != (image_size, image_size):
                raise Exception('Unexpected image shape: %s' % str(image_data.shape))
            dataset[num_images, :, :] = image_data
            num_images = num_images + 1
        except IOError as e:
            print('Could not read:', image_file, ':', e, '- it\'s ok, skipping.')

    dataset = dataset[0:num_images, :, :]
    if num_images < min_num_images:
        raise Exception('Many fewer images than expected: %d < %d' %
                        (num_images, min_num_images))

    print('Full dataset tensor:', dataset.shape)
    print('Mean:', np.mean(dataset))
    print('Standard deviation:', np.std(dataset))
    return dataset

where that function is called like this:

dataset = load_letter(folder, min_num_images_per_class)
with open(set_filename, 'wb') as f:
    pickle.dump(dataset, f, pickle.HIGHEST_PROTOCOL)
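Since each per-letter pickle is written this way, loading it back gives you the ndarray itself. A minimal sketch of that round trip, using a tiny fake dataset (the filename A_demo.pickle and the 3-image shape are made up for illustration):

```python
import os
import pickle
import tempfile

import numpy as np

# A tiny stand-in for one letter's dataset: 3 fake 28x28 "images"
# (the real per-letter pickles hold tens of thousands).
image_size = 28
dataset = np.zeros((3, image_size, image_size), dtype=np.float32)

# Round-trip through pickle the same way the notebook does:
# the pickle file then contains the ndarray itself.
set_filename = os.path.join(tempfile.gettempdir(), 'A_demo.pickle')
with open(set_filename, 'wb') as f:
    pickle.dump(dataset, f, pickle.HIGHEST_PROTOCOL)

with open(set_filename, 'rb') as f:
    loaded = pickle.load(f)

print(type(loaded).__name__, loaded.shape)  # ndarray (3, 28, 28)
```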

The idea here is:

Now let's load the data in a more manageable format. Since, depending on your computer setup you might not be able to fit it all in memory, we'll load each class into a separate dataset, store them on disk and curate them independently. Later we'll merge them into a single dataset of manageable size.

We'll convert the entire dataset into a 3D array (image index, x, y) of floating point values, normalized to have approximately zero mean and standard deviation ~0.5 to make training easier down the road.
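The normalization in load_letter is just a linear rescaling; a quick check of the endpoints shows how raw 8-bit pixel values land in the target range:

```python
import numpy as np

pixel_depth = 255.0  # number of levels per pixel

# The notebook normalizes with (x - pixel_depth / 2) / pixel_depth,
# which maps raw values from [0, 255] onto [-0.5, 0.5].
raw = np.array([0.0, 127.5, 255.0])
normalized = (raw - pixel_depth / 2) / pixel_depth
print(normalized)  # [-0.5  0.   0.5]
```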

Answer

Do this as below:

# define a function to convert a numeric label to its letter
def letter(i):
    return 'abcdefghij'[i]

# you need %matplotlib inline to be able to show images in a Python notebook
%matplotlib inline
plt.imshow(train_dataset[10])
plt.title("Char " + letter(train_labels[10]))

Note that your code actually changed the type of dataset: it is no longer an ndarray of shape (220000, 28, 28).

In general, a pickle is a file that holds some objects, not the array itself. You should use the object from the pickle directly to get your train dataset (using the notation from your code snippet):

# will give you train_dataset and labels
train_dataset = loaded_pickle['train_dataset']
train_labels = loaded_pickle['train_labels']
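Putting the pieces together, displaying a random sample with its letter label might look like the sketch below. The arrays here are tiny random stand-ins for loaded_pickle['train_dataset'] and loaded_pickle['train_labels'] (the real ones are roughly (220000, 28, 28) and (220000,)):

```python
import random

import numpy as np
import matplotlib
matplotlib.use('Agg')  # headless backend for this script-style sketch
import matplotlib.pyplot as plt

def letter(i):
    """Convert a numeric label 0..9 to its letter a..j."""
    return 'abcdefghij'[i]

# Fake stand-ins for the arrays loaded from the merged pickle.
train_dataset = np.random.uniform(-0.5, 0.5, (20, 28, 28)).astype(np.float32)
train_labels = np.random.randint(0, 10, size=20)

# Pick one random sample and show it with its letter as the title.
i = random.randrange(len(train_dataset))
plt.imshow(train_dataset[i], cmap='gray')
plt.title('Char ' + letter(train_labels[i]))
plt.savefig('sample.png')  # use plt.show(), or %matplotlib inline in a notebook
```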