I need to load a lot of small images. For each letter of the alphabet I have about 2000 images, each 145x145 pixels (roughly 400 bytes apiece on disk). Each image is loaded as follows:
    from PIL import Image
    import numpy as np

    im = Image.open(filepath).convert('L')   # 8-bit greyscale
    (width, height) = im.size
    greyscale_map = list(im.getdata())       # flat list of pixel values
    greyscale_map = np.array(greyscale_map)
    greyscale_map = greyscale_map.reshape((height, width))

As a result, out of 14 MB of images I get 14 GB of .npy objects. Can I do anything about the size? Maybe load them differently? Later, this data will need to be fed through a neural network.
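For reference, a minimal sketch of the same load with an explicit uint8 dtype: np.array over a plain Python list infers the platform integer type (typically int64, i.e. 8 bytes per pixel), while an 8-bit greyscale image only needs 1 byte per pixel.

    from PIL import Image
    import numpy as np

    im = Image.open(filepath).convert('L')          # 8-bit greyscale
    # np.asarray on a PIL 'L'-mode image keeps the native uint8 dtype
    # and already has shape (height, width), so no reshape is needed
    greyscale_map = np.asarray(im, dtype=np.uint8)  # 1 byte per pixel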
A sparse matrix (csr_matrix)? - MaxU
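A sketch of that suggestion, assuming scipy is available: csr_matrix only pays off when most pixels are zero (e.g. a black background), and the output filename here is just a placeholder.

    from scipy.sparse import csr_matrix, save_npz

    # greyscale_map as produced above: a (height, width) integer array
    sparse_map = csr_matrix(greyscale_map)   # stores only non-zero pixels
    save_npz('letter.npz', sparse_map)       # placeholder output path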