python - TFLearn label encoding with a large number of classes
I am trying to adapt the convolutional neural network example of TFLearn to a classification task with ~12,000 distinct class labels and more than 1 million training examples. The number of labels is apparently a problem in terms of memory consumption when one-hot encoding them: a dense matrix of 1 million rows by 12,000 columns of float32 values would alone need on the order of 48 GB. I first map the string labels to continuous integers, then pass these as a list to the to_categorical() function. The following code leads to a MemoryError:
trainy = to_categorical(trainy, nb_classes=n_classes)
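The integer-mapping step described above can be sketched in pure Python; the label values and variable names here are hypothetical, just to illustrate the mapping:

```python
# Map string labels to continuous integer IDs before one-hot encoding.
# `raw_labels` stands in for the real list of string class labels.
raw_labels = ["cat", "dog", "cat", "bird", "dog"]

# Build a label -> integer lookup from the sorted set of distinct labels.
label_to_id = {label: i for i, label in enumerate(sorted(set(raw_labels)))}
trainY = [label_to_id[label] for label in raw_labels]

print(trainY)  # integer-encoded labels: [1, 2, 1, 0, 2]
```

This integer list is what then gets passed to to_categorical(), which is where the dense one-hot matrix is materialized and the MemoryError occurs.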
Do I have to one-hot encode the labels at all, or should I use a different loss function than cross-entropy? Can I train in batches with TFLearn - can I pass a generator to the DNN.fit() function?
Thanks for any advice!
In the regression layer, you can specify that the labels you feed in should be one-hot encoded at run time:
trainY = to_categorical(trainY, nb_classes=n_classes)
tflearn.layers.regression(incoming_net, loss='categorical_crossentropy', batch_size=64, to_one_hot=True, n_classes=12000)
This way you should not run into a memory error, because the labels are encoded batch by batch during training.
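To see why per-batch encoding avoids the memory problem, here is a minimal pure-Python sketch of the idea (TFLearn itself does this internally with a TensorFlow op; the function name and values below are illustrative only):

```python
def one_hot_batches(labels, n_classes, batch_size):
    """Yield one-hot encoded batches: only batch_size * n_classes values
    are in memory at a time, instead of len(labels) * n_classes."""
    for start in range(0, len(labels), batch_size):
        batch = labels[start:start + batch_size]
        # Encode just this slice of integer labels as one-hot rows.
        yield [[1.0 if j == y else 0.0 for j in range(n_classes)]
               for y in batch]

# Only one small batch is ever materialized at a time.
batches = one_hot_batches([0, 2, 1, 2], n_classes=3, batch_size=2)
print(next(batches))  # [[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]
```

With 12,000 classes and a batch size of 64, each encoded batch is only 64 x 12,000 values, regardless of how many training examples there are in total.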